Sep 9 06:04:56.923461 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 03:39:34 -00 2025
Sep 9 06:04:56.923476 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 06:04:56.923483 kernel: BIOS-provided physical RAM map:
Sep 9 06:04:56.923487 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable
Sep 9 06:04:56.923491 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved
Sep 9 06:04:56.923495 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved
Sep 9 06:04:56.923500 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable
Sep 9 06:04:56.923504 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved
Sep 9 06:04:56.923508 kernel: BIOS-e820: [mem 0x0000000040400000-0x00000000819ccfff] usable
Sep 9 06:04:56.923513 kernel: BIOS-e820: [mem 0x00000000819cd000-0x00000000819cdfff] ACPI NVS
Sep 9 06:04:56.923517 kernel: BIOS-e820: [mem 0x00000000819ce000-0x00000000819cefff] reserved
Sep 9 06:04:56.923521 kernel: BIOS-e820: [mem 0x00000000819cf000-0x000000008afccfff] usable
Sep 9 06:04:56.923525 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved
Sep 9 06:04:56.923530 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable
Sep 9 06:04:56.923535 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS
Sep 9 06:04:56.923540 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved
Sep 9 06:04:56.923545 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable
Sep 9 06:04:56.923550 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved
Sep 9 06:04:56.923555 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Sep 9 06:04:56.923559 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved
Sep 9 06:04:56.923564 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved
Sep 9 06:04:56.923568 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Sep 9 06:04:56.923573 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved
Sep 9 06:04:56.923578 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable
Sep 9 06:04:56.923582 kernel: NX (Execute Disable) protection: active
Sep 9 06:04:56.923587 kernel: APIC: Static calls initialized
Sep 9 06:04:56.923592 kernel: SMBIOS 3.2.1 present.
Sep 9 06:04:56.923597 kernel: DMI: Supermicro SYS-5019C-MR/X11SCM-F, BIOS 1.9 09/16/2022
Sep 9 06:04:56.923602 kernel: DMI: Memory slots populated: 1/4
Sep 9 06:04:56.923606 kernel: tsc: Detected 3400.000 MHz processor
Sep 9 06:04:56.923611 kernel: tsc: Detected 3399.906 MHz TSC
Sep 9 06:04:56.923616 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 9 06:04:56.923621 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 9 06:04:56.923626 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000
Sep 9 06:04:56.923630 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs
Sep 9 06:04:56.923635 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 9 06:04:56.923641 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000
Sep 9 06:04:56.923646 kernel: Using GB pages for direct mapping
Sep 9 06:04:56.923651 kernel: ACPI: Early table checksum verification disabled
Sep 9 06:04:56.923656 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM)
Sep 9 06:04:56.923662 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013)
Sep 9 06:04:56.923670 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013)
Sep 9 06:04:56.923675 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527)
Sep 9 06:04:56.923681 kernel: ACPI: FACS 0x000000008C66CF80 000040
Sep 9 06:04:56.923702 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013)
Sep 9 06:04:56.923707 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013)
Sep 9 06:04:56.923712 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013)
Sep 9 06:04:56.923717 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097)
Sep 9 06:04:56.923722 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 00000000)
Sep 9 06:04:56.923727 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527)
Sep 9 06:04:56.923733 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527)
Sep 9 06:04:56.923738 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527)
Sep 9 06:04:56.923743 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 9 06:04:56.923748 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527)
Sep 9 06:04:56.923753 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527)
Sep 9 06:04:56.923758 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 9 06:04:56.923763 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 9 06:04:56.923767 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527)
Sep 9 06:04:56.923773 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527)
Sep 9 06:04:56.923778 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013)
Sep 9 06:04:56.923783 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013)
Sep 9 06:04:56.923788 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527)
Sep 9 06:04:56.923793 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013)
Sep 9 06:04:56.923798 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527)
Sep 9 06:04:56.923803 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000)
Sep 9 06:04:56.923808 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527)
Sep 9 06:04:56.923813 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013)
Sep 9 06:04:56.923819 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000)
Sep 9 06:04:56.923824 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000)
Sep 9 06:04:56.923828 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000)
Sep 9 06:04:56.923833 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 00000000)
Sep 9 06:04:56.923838 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221)
Sep 9 06:04:56.923843 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783]
Sep 9 06:04:56.923848 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b]
Sep 9 06:04:56.923853 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf]
Sep 9 06:04:56.923858 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3]
Sep 9 06:04:56.923864 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb]
Sep 9 06:04:56.923869 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b]
Sep 9 06:04:56.923874 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db]
Sep 9 06:04:56.923879 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20]
Sep 9 06:04:56.923883 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543]
Sep 9 06:04:56.923888 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d]
Sep 9 06:04:56.923893 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a]
Sep 9 06:04:56.923898 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77]
Sep 9 06:04:56.923903 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25]
Sep 9 06:04:56.923909 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b]
Sep 9 06:04:56.923913 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361]
Sep 9 06:04:56.923918 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb]
Sep 9 06:04:56.923923 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd]
Sep 9 06:04:56.923928 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1]
Sep 9 06:04:56.923933 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb]
Sep 9 06:04:56.923938 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153]
Sep 9 06:04:56.923943 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe]
Sep 9 06:04:56.923947 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f]
Sep 9 06:04:56.923953 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73]
Sep 9 06:04:56.923958 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab]
Sep 9 06:04:56.923963 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e]
Sep 9 06:04:56.923968 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67]
Sep 9 06:04:56.923973 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97]
Sep 9 06:04:56.923977 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7]
Sep 9 06:04:56.923982 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7]
Sep 9 06:04:56.923987 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273]
Sep 9 06:04:56.923992 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9]
Sep 9 06:04:56.923997 kernel: No NUMA configuration found
Sep 9 06:04:56.924003 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff]
Sep 9 06:04:56.924007 kernel: NODE_DATA(0) allocated [mem 0x86eff8dc0-0x86effffff]
Sep 9 06:04:56.924012 kernel: Zone ranges:
Sep 9 06:04:56.924017 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 9 06:04:56.924022 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Sep 9 06:04:56.924027 kernel: Normal [mem 0x0000000100000000-0x000000086effffff]
Sep 9 06:04:56.924032 kernel: Device empty
Sep 9 06:04:56.924037 kernel: Movable zone start for each node
Sep 9 06:04:56.924042 kernel: Early memory node ranges
Sep 9 06:04:56.924047 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff]
Sep 9 06:04:56.924052 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff]
Sep 9 06:04:56.924057 kernel: node 0: [mem 0x0000000040400000-0x00000000819ccfff]
Sep 9 06:04:56.924062 kernel: node 0: [mem 0x00000000819cf000-0x000000008afccfff]
Sep 9 06:04:56.924067 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff]
Sep 9 06:04:56.924075 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff]
Sep 9 06:04:56.924081 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff]
Sep 9 06:04:56.924086 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff]
Sep 9 06:04:56.924091 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 9 06:04:56.924097 kernel: On node 0, zone DMA: 103 pages in unavailable ranges
Sep 9 06:04:56.924103 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges
Sep 9 06:04:56.924108 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges
Sep 9 06:04:56.924113 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges
Sep 9 06:04:56.924118 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges
Sep 9 06:04:56.924124 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges
Sep 9 06:04:56.924129 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges
Sep 9 06:04:56.924134 kernel: ACPI: PM-Timer IO Port: 0x1808
Sep 9 06:04:56.924140 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Sep 9 06:04:56.924145 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Sep 9 06:04:56.924151 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Sep 9 06:04:56.924156 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Sep 9 06:04:56.924161 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Sep 9 06:04:56.924166 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Sep 9 06:04:56.924171 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Sep 9 06:04:56.924177 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Sep 9 06:04:56.924182 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Sep 9 06:04:56.924188 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Sep 9 06:04:56.924193 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Sep 9 06:04:56.924198 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Sep 9 06:04:56.924203 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Sep 9 06:04:56.924209 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Sep 9 06:04:56.924214 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Sep 9 06:04:56.924219 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Sep 9 06:04:56.924224 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119
Sep 9 06:04:56.924229 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 9 06:04:56.924235 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 9 06:04:56.924241 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 9 06:04:56.924246 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 9 06:04:56.924251 kernel: TSC deadline timer available
Sep 9 06:04:56.924256 kernel: CPU topo: Max. logical packages: 1
Sep 9 06:04:56.924262 kernel: CPU topo: Max. logical dies: 1
Sep 9 06:04:56.924267 kernel: CPU topo: Max. dies per package: 1
Sep 9 06:04:56.924272 kernel: CPU topo: Max. threads per core: 2
Sep 9 06:04:56.924277 kernel: CPU topo: Num. cores per package: 8
Sep 9 06:04:56.924282 kernel: CPU topo: Num. threads per package: 16
Sep 9 06:04:56.924288 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs
Sep 9 06:04:56.924294 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices
Sep 9 06:04:56.924299 kernel: Booting paravirtualized kernel on bare hardware
Sep 9 06:04:56.924304 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 9 06:04:56.924310 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
Sep 9 06:04:56.924315 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
Sep 9 06:04:56.924320 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
Sep 9 06:04:56.924325 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
Sep 9 06:04:56.924331 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 06:04:56.924337 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 06:04:56.924343 kernel: random: crng init done
Sep 9 06:04:56.924348 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear)
Sep 9 06:04:56.924353 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Sep 9 06:04:56.924358 kernel: Fallback order for Node 0: 0
Sep 9 06:04:56.924363 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8363245
Sep 9 06:04:56.924369 kernel: Policy zone: Normal
Sep 9 06:04:56.924374 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 06:04:56.924380 kernel: software IO TLB: area num 16.
Sep 9 06:04:56.924385 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
Sep 9 06:04:56.924390 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 9 06:04:56.924396 kernel: ftrace: allocated 157 pages with 5 groups
Sep 9 06:04:56.924401 kernel: Dynamic Preempt: voluntary
Sep 9 06:04:56.924406 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 06:04:56.924412 kernel: rcu: RCU event tracing is enabled.
Sep 9 06:04:56.924417 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
Sep 9 06:04:56.924422 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 06:04:56.924428 kernel: Rude variant of Tasks RCU enabled.
Sep 9 06:04:56.924434 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 06:04:56.924439 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 06:04:56.924444 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
Sep 9 06:04:56.924449 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 9 06:04:56.924455 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 9 06:04:56.924460 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
Sep 9 06:04:56.924465 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16
Sep 9 06:04:56.924470 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 06:04:56.924477 kernel: Console: colour VGA+ 80x25
Sep 9 06:04:56.924482 kernel: printk: legacy console [tty0] enabled
Sep 9 06:04:56.924487 kernel: printk: legacy console [ttyS1] enabled
Sep 9 06:04:56.924492 kernel: ACPI: Core revision 20240827
Sep 9 06:04:56.924498 kernel: hpet: HPET dysfunctional in PC10. Force disabled.
Sep 9 06:04:56.924503 kernel: APIC: Switch to symmetric I/O mode setup
Sep 9 06:04:56.924508 kernel: DMAR: Host address width 39
Sep 9 06:04:56.924513 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1
Sep 9 06:04:56.924518 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da
Sep 9 06:04:56.924524 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff
Sep 9 06:04:56.924530 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0
Sep 9 06:04:56.924535 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000
Sep 9 06:04:56.924540 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping.
Sep 9 06:04:56.924546 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode
Sep 9 06:04:56.924551 kernel: x2apic enabled
Sep 9 06:04:56.924556 kernel: APIC: Switched APIC routing to: cluster x2apic
Sep 9 06:04:56.924562 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns
Sep 9 06:04:56.924567 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906)
Sep 9 06:04:56.924573 kernel: CPU0: Thermal monitoring enabled (TM1)
Sep 9 06:04:56.924578 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Sep 9 06:04:56.924583 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4
Sep 9 06:04:56.924588 kernel: process: using mwait in idle threads
Sep 9 06:04:56.924593 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 9 06:04:56.924599 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 9 06:04:56.924604 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Sep 9 06:04:56.924609 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Sep 9 06:04:56.924614 kernel: RETBleed: Mitigation: Enhanced IBRS
Sep 9 06:04:56.924619 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 9 06:04:56.924624 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 9 06:04:56.924630 kernel: TAA: Mitigation: TSX disabled
Sep 9 06:04:56.924635 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers
Sep 9 06:04:56.924641 kernel: SRBDS: Mitigation: Microcode
Sep 9 06:04:56.924646 kernel: GDS: Vulnerable: No microcode
Sep 9 06:04:56.924651 kernel: active return thunk: its_return_thunk
Sep 9 06:04:56.924656 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 9 06:04:56.924661 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 9 06:04:56.924667 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 9 06:04:56.924695 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 9 06:04:56.924700 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers'
Sep 9 06:04:56.924705 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR'
Sep 9 06:04:56.924711 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 9 06:04:56.924734 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64
Sep 9 06:04:56.924739 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64
Sep 9 06:04:56.924744 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format.
Sep 9 06:04:56.924750 kernel: Freeing SMP alternatives memory: 32K
Sep 9 06:04:56.924755 kernel: pid_max: default: 32768 minimum: 301
Sep 9 06:04:56.924760 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 06:04:56.924765 kernel: landlock: Up and running.
Sep 9 06:04:56.924770 kernel: SELinux: Initializing.
Sep 9 06:04:56.924775 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 06:04:56.924780 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 06:04:56.924786 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Sep 9 06:04:56.924792 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver.
Sep 9 06:04:56.924797 kernel: ... version: 4
Sep 9 06:04:56.924802 kernel: ... bit width: 48
Sep 9 06:04:56.924808 kernel: ... generic registers: 4
Sep 9 06:04:56.924813 kernel: ... value mask: 0000ffffffffffff
Sep 9 06:04:56.924818 kernel: ... max period: 00007fffffffffff
Sep 9 06:04:56.924823 kernel: ... fixed-purpose events: 3
Sep 9 06:04:56.924828 kernel: ... event mask: 000000070000000f
Sep 9 06:04:56.924834 kernel: signal: max sigframe size: 2032
Sep 9 06:04:56.924840 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445
Sep 9 06:04:56.924845 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 06:04:56.924850 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 06:04:56.924855 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
Sep 9 06:04:56.924861 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter.
Sep 9 06:04:56.924866 kernel: smp: Bringing up secondary CPUs ...
Sep 9 06:04:56.924871 kernel: smpboot: x86: Booting SMP configuration:
Sep 9 06:04:56.924876 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15
Sep 9 06:04:56.924882 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Sep 9 06:04:56.924888 kernel: smp: Brought up 1 node, 16 CPUs
Sep 9 06:04:56.924894 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS)
Sep 9 06:04:56.924899 kernel: Memory: 32695152K/33452980K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 732528K reserved, 0K cma-reserved)
Sep 9 06:04:56.924904 kernel: devtmpfs: initialized
Sep 9 06:04:56.924910 kernel: x86/mm: Memory block size: 128MB
Sep 9 06:04:56.924915 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x819cd000-0x819cdfff] (4096 bytes)
Sep 9 06:04:56.924920 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes)
Sep 9 06:04:56.924926 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 06:04:56.924932 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
Sep 9 06:04:56.924937 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 06:04:56.924942 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 06:04:56.924947 kernel: audit: initializing netlink subsys (disabled)
Sep 9 06:04:56.924953 kernel: audit: type=2000 audit(1757397889.041:1): state=initialized audit_enabled=0 res=1
Sep 9 06:04:56.924958 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 06:04:56.924963 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 9 06:04:56.924968 kernel: cpuidle: using governor menu
Sep 9 06:04:56.924973 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 06:04:56.924979 kernel: dca service started, version 1.12.1
Sep 9 06:04:56.924985 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Sep 9 06:04:56.924990 kernel: PCI: Using configuration type 1 for base access
Sep 9 06:04:56.924995 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 9 06:04:56.925001 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 06:04:56.925006 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 06:04:56.925011 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 06:04:56.925016 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 06:04:56.925022 kernel: ACPI: Added _OSI(Module Device)
Sep 9 06:04:56.925028 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 06:04:56.925033 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 06:04:56.925038 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded
Sep 9 06:04:56.925043 kernel: ACPI: Dynamic OEM Table Load:
Sep 9 06:04:56.925049 kernel: ACPI: SSDT 0xFFFF9B48020D2400 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527)
Sep 9 06:04:56.925054 kernel: ACPI: Dynamic OEM Table Load:
Sep 9 06:04:56.925059 kernel: ACPI: SSDT 0xFFFF9B48021A4000 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527)
Sep 9 06:04:56.925064 kernel: ACPI: Dynamic OEM Table Load:
Sep 9 06:04:56.925069 kernel: ACPI: SSDT 0xFFFF9B4800246300 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527)
Sep 9 06:04:56.925075 kernel: ACPI: Dynamic OEM Table Load:
Sep 9 06:04:56.925081 kernel: ACPI: SSDT 0xFFFF9B48021A7800 0005FC (v02 PmRef ApIst 00003000 INTL 20160527)
Sep 9 06:04:56.925086 kernel: ACPI: Dynamic OEM Table Load:
Sep 9 06:04:56.925091 kernel: ACPI: SSDT 0xFFFF9B48001A2000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527)
Sep 9 06:04:56.925096 kernel: ACPI: Dynamic OEM Table Load:
Sep 9 06:04:56.925102 kernel: ACPI: SSDT 0xFFFF9B48020D1000 00030A (v02 PmRef ApCst 00003000 INTL 20160527)
Sep 9 06:04:56.925107 kernel: ACPI: Interpreter enabled
Sep 9 06:04:56.925112 kernel: ACPI: PM: (supports S0 S5)
Sep 9 06:04:56.925117 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 9 06:04:56.925122 kernel: HEST: Enabling Firmware First mode for corrected errors.
Sep 9 06:04:56.925129 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14.
Sep 9 06:04:56.925134 kernel: HEST: Table parsing has been initialized.
Sep 9 06:04:56.925139 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC.
Sep 9 06:04:56.925144 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 9 06:04:56.925150 kernel: PCI: Using E820 reservations for host bridge windows
Sep 9 06:04:56.925155 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F
Sep 9 06:04:56.925160 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource
Sep 9 06:04:56.925166 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource
Sep 9 06:04:56.925171 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource
Sep 9 06:04:56.925177 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource
Sep 9 06:04:56.925182 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource
Sep 9 06:04:56.925188 kernel: ACPI: \_TZ_.FN00: New power resource
Sep 9 06:04:56.925193 kernel: ACPI: \_TZ_.FN01: New power resource
Sep 9 06:04:56.925198 kernel: ACPI: \_TZ_.FN02: New power resource
Sep 9 06:04:56.925203 kernel: ACPI: \_TZ_.FN03: New power resource
Sep 9 06:04:56.925209 kernel: ACPI: \_TZ_.FN04: New power resource
Sep 9 06:04:56.925214 kernel: ACPI: \PIN_: New power resource
Sep 9 06:04:56.925219 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe])
Sep 9 06:04:56.925294 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 06:04:56.925344 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER]
Sep 9 06:04:56.925392 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR]
Sep 9 06:04:56.925400 kernel: PCI host bridge to bus 0000:00
Sep 9 06:04:56.925448 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 9 06:04:56.925491 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 9 06:04:56.925536 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 9 06:04:56.925578 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window]
Sep 9 06:04:56.925620 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window]
Sep 9 06:04:56.925662 kernel: pci_bus 0000:00: root bus resource [bus 00-fe]
Sep 9 06:04:56.925761 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint
Sep 9 06:04:56.925819 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port
Sep 9 06:04:56.925871 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 9 06:04:56.925920 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff]
Sep 9 06:04:56.925967 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref]
Sep 9 06:04:56.926016 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold
Sep 9 06:04:56.926068 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint
Sep 9 06:04:56.926116 kernel: pci 0000:00:08.0: BAR 0 [mem 0x9551f000-0x9551ffff 64bit]
Sep 9 06:04:56.926167 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint
Sep 9 06:04:56.926217 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9551e000-0x9551efff 64bit]
Sep 9 06:04:56.926269 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint
Sep 9 06:04:56.926317 kernel: pci 0000:00:14.0: BAR 0 [mem 0x95500000-0x9550ffff 64bit]
Sep 9 06:04:56.926364 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold
Sep 9 06:04:56.926415 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint
Sep 9 06:04:56.926464 kernel: pci 0000:00:14.2: BAR 0 [mem 0x95512000-0x95513fff 64bit]
Sep 9 06:04:56.926514 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9551d000-0x9551dfff 64bit]
Sep 9 06:04:56.926571 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint
Sep 9 06:04:56.926622 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Sep 9 06:04:56.926680 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint
Sep 9 06:04:56.926728 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Sep 9 06:04:56.926779 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint
Sep 9 06:04:56.926829 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9551a000-0x9551afff 64bit]
Sep 9 06:04:56.926876 kernel: pci 0000:00:16.0: PME# supported from D3hot
Sep 9 06:04:56.926927 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint
Sep 9 06:04:56.926975 kernel: pci 0000:00:16.1: BAR 0 [mem 0x95519000-0x95519fff 64bit]
Sep 9 06:04:56.927022 kernel: pci 0000:00:16.1: PME# supported from D3hot
Sep 9 06:04:56.927073 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint
Sep 9 06:04:56.927121 kernel: pci 0000:00:16.4: BAR 0 [mem 0x95518000-0x95518fff 64bit]
Sep 9 06:04:56.927170 kernel: pci 0000:00:16.4: PME# supported from D3hot
Sep 9 06:04:56.927220 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint
Sep 9 06:04:56.927268 kernel: pci 0000:00:17.0: BAR 0 [mem 0x95510000-0x95511fff]
Sep 9 06:04:56.927314 kernel: pci 0000:00:17.0: BAR 1 [mem 0x95517000-0x955170ff]
Sep 9 06:04:56.927363 kernel: pci 0000:00:17.0: BAR 2 [io 0x6050-0x6057]
Sep 9 06:04:56.927410 kernel: pci 0000:00:17.0: BAR 3 [io 0x6040-0x6043]
Sep 9 06:04:56.927457 kernel: pci 0000:00:17.0: BAR 4 [io 0x6020-0x603f]
Sep 9 06:04:56.927503 kernel: pci 0000:00:17.0: BAR 5 [mem 0x95516000-0x955167ff]
Sep 9 06:04:56.927550 kernel: pci 0000:00:17.0: PME# supported from D3hot
Sep 9 06:04:56.927603 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port
Sep 9 06:04:56.927652 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Sep 9 06:04:56.927725 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold
Sep 9 06:04:56.927800 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port
Sep 9 06:04:56.927849 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03]
Sep 9 06:04:56.927898 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff]
Sep 9 06:04:56.927946 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff]
Sep 9 06:04:56.927993 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold
Sep 9 06:04:56.928047 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port
Sep 9 06:04:56.928098 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04]
Sep 9 06:04:56.928146 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff]
Sep 9 06:04:56.928194 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff]
Sep 9 06:04:56.928241 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold
Sep 9 06:04:56.928295 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port
Sep 9 06:04:56.928344 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05]
Sep 9 06:04:56.928391 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold
Sep 9 06:04:56.928445 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 PCIe Root Port
Sep 9 06:04:56.928493 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07]
Sep 9 06:04:56.928540 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff]
Sep 9 06:04:56.928587 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff]
Sep 9 06:04:56.928635 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold
Sep 9 06:04:56.928702 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint
Sep 9 06:04:56.928802 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit]
Sep 9 06:04:56.928856 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint
Sep 9 06:04:56.928909 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint
Sep 9 06:04:56.928957 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x95514000-0x955140ff 64bit]
Sep 9 06:04:56.929004 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf]
Sep 9 06:04:56.929059 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint
Sep 9 06:04:56.929110 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff]
Sep 9 06:04:56.929167 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint
Sep 9 06:04:56.929217 kernel: pci 0000:01:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref]
Sep 9 06:04:56.929267 kernel: pci 0000:01:00.0: ROM [mem 0x95200000-0x952fffff pref]
Sep 9 06:04:56.929315 kernel: pci 0000:01:00.0: PME# supported from D3cold
Sep 9 06:04:56.929364 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref]
Sep 9 06:04:56.929414 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs
Sep 9 06:04:56.929467 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint
Sep 9 06:04:56.929520 kernel: pci 0000:01:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref]
Sep 9 06:04:56.929569 kernel: pci 0000:01:00.1: ROM [mem 0x95100000-0x951fffff pref]
Sep 9 06:04:56.929618 kernel: pci 0000:01:00.1: PME# supported from D3cold
Sep 9 06:04:56.929666 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref]
Sep 9 06:04:56.929763 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs
Sep 9 06:04:56.929812 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Sep 9 06:04:56.929862 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02]
Sep 9 06:04:56.929918 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect
Sep 9
06:04:56.929968 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 9 06:04:56.930017 kernel: pci 0000:03:00.0: BAR 0 [mem 0x95400000-0x9547ffff] Sep 9 06:04:56.930065 kernel: pci 0000:03:00.0: BAR 2 [io 0x5000-0x501f] Sep 9 06:04:56.930116 kernel: pci 0000:03:00.0: BAR 3 [mem 0x95480000-0x95483fff] Sep 9 06:04:56.930165 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Sep 9 06:04:56.930214 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 9 06:04:56.930270 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Sep 9 06:04:56.930319 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Sep 9 06:04:56.930369 kernel: pci 0000:04:00.0: BAR 0 [mem 0x95300000-0x9537ffff] Sep 9 06:04:56.930417 kernel: pci 0000:04:00.0: BAR 2 [io 0x4000-0x401f] Sep 9 06:04:56.930466 kernel: pci 0000:04:00.0: BAR 3 [mem 0x95380000-0x95383fff] Sep 9 06:04:56.930515 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Sep 9 06:04:56.930563 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 9 06:04:56.930614 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 9 06:04:56.930667 kernel: pci 0000:06:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Sep 9 06:04:56.930781 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 9 06:04:56.930831 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 9 06:04:56.930880 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 9 06:04:56.930929 kernel: pci 0000:06:00.0: enabling Extended Tags Sep 9 06:04:56.930978 kernel: pci 0000:06:00.0: supports D1 D2 Sep 9 06:04:56.931029 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 9 06:04:56.931078 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 9 06:04:56.931132 kernel: pci_bus 0000:07: extended config space not accessible Sep 9 06:04:56.931191 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Sep 9 06:04:56.931243 kernel: 
pci 0000:07:00.0: BAR 0 [mem 0x94000000-0x94ffffff] Sep 9 06:04:56.931294 kernel: pci 0000:07:00.0: BAR 1 [mem 0x95000000-0x9501ffff] Sep 9 06:04:56.931344 kernel: pci 0000:07:00.0: BAR 2 [io 0x3000-0x307f] Sep 9 06:04:56.931457 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Sep 9 06:04:56.931511 kernel: pci 0000:07:00.0: supports D1 D2 Sep 9 06:04:56.931563 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Sep 9 06:04:56.931613 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 9 06:04:56.931621 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Sep 9 06:04:56.931627 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Sep 9 06:04:56.931633 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Sep 9 06:04:56.931638 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Sep 9 06:04:56.931645 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Sep 9 06:04:56.931651 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Sep 9 06:04:56.931656 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Sep 9 06:04:56.931662 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Sep 9 06:04:56.931670 kernel: iommu: Default domain type: Translated Sep 9 06:04:56.931676 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Sep 9 06:04:56.931681 kernel: PCI: Using ACPI for IRQ routing Sep 9 06:04:56.931714 kernel: PCI: pci_cache_line_size set to 64 bytes Sep 9 06:04:56.931719 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Sep 9 06:04:56.931726 kernel: e820: reserve RAM buffer [mem 0x819cd000-0x83ffffff] Sep 9 06:04:56.931751 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Sep 9 06:04:56.931779 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Sep 9 06:04:56.931785 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Sep 9 06:04:56.931790 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Sep 9 
06:04:56.931873 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Sep 9 06:04:56.931973 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Sep 9 06:04:56.932045 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Sep 9 06:04:56.932054 kernel: vgaarb: loaded Sep 9 06:04:56.932089 kernel: clocksource: Switched to clocksource tsc-early Sep 9 06:04:56.932095 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 06:04:56.932101 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 06:04:56.932107 kernel: pnp: PnP ACPI init Sep 9 06:04:56.932175 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Sep 9 06:04:56.932232 kernel: pnp 00:02: [dma 0 disabled] Sep 9 06:04:56.932287 kernel: pnp 00:03: [dma 0 disabled] Sep 9 06:04:56.932341 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Sep 9 06:04:56.932403 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Sep 9 06:04:56.932454 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Sep 9 06:04:56.932503 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Sep 9 06:04:56.932547 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Sep 9 06:04:56.932591 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Sep 9 06:04:56.932657 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Sep 9 06:04:56.932709 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Sep 9 06:04:56.932756 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Sep 9 06:04:56.932801 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Sep 9 06:04:56.932850 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Sep 9 06:04:56.932895 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Sep 9 06:04:56.932940 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Sep 9 06:04:56.932989 kernel: system 00:06: 
[mem 0xfd6f0000-0xfdffffff] has been reserved Sep 9 06:04:56.933043 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Sep 9 06:04:56.933093 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Sep 9 06:04:56.933142 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Sep 9 06:04:56.933195 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Sep 9 06:04:56.933205 kernel: pnp: PnP ACPI: found 9 devices Sep 9 06:04:56.933211 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Sep 9 06:04:56.933217 kernel: NET: Registered PF_INET protocol family Sep 9 06:04:56.933225 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 9 06:04:56.933231 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Sep 9 06:04:56.933237 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 06:04:56.933245 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Sep 9 06:04:56.933251 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Sep 9 06:04:56.933257 kernel: TCP: Hash tables configured (established 262144 bind 65536) Sep 9 06:04:56.933263 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 9 06:04:56.933269 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Sep 9 06:04:56.933275 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 06:04:56.933282 kernel: NET: Registered PF_XDP protocol family Sep 9 06:04:56.933335 kernel: pci 0000:00:15.0: BAR 0 [mem 0x95515000-0x95515fff 64bit]: assigned Sep 9 06:04:56.933406 kernel: pci 0000:00:15.1: BAR 0 [mem 0x9551b000-0x9551bfff 64bit]: assigned Sep 9 06:04:56.933458 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x9551c000-0x9551cfff 64bit]: assigned Sep 9 06:04:56.933510 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; 
no space Sep 9 06:04:56.933565 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 9 06:04:56.933618 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Sep 9 06:04:56.933698 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Sep 9 06:04:56.933752 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Sep 9 06:04:56.933802 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Sep 9 06:04:56.933851 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Sep 9 06:04:56.933900 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Sep 9 06:04:56.933954 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Sep 9 06:04:56.934012 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Sep 9 06:04:56.934066 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Sep 9 06:04:56.934116 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Sep 9 06:04:56.934166 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Sep 9 06:04:56.934216 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Sep 9 06:04:56.934266 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Sep 9 06:04:56.934316 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Sep 9 06:04:56.934367 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Sep 9 06:04:56.934417 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Sep 9 06:04:56.934467 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Sep 9 06:04:56.934519 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Sep 9 06:04:56.934568 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Sep 9 06:04:56.934614 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Sep 9 06:04:56.934659 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Sep 9 06:04:56.934706 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] 
Sep 9 06:04:56.934750 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Sep 9 06:04:56.934793 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Sep 9 06:04:56.934837 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Sep 9 06:04:56.934887 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Sep 9 06:04:56.934935 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Sep 9 06:04:56.934986 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Sep 9 06:04:56.935033 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Sep 9 06:04:56.935084 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Sep 9 06:04:56.935129 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Sep 9 06:04:56.935181 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Sep 9 06:04:56.935230 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Sep 9 06:04:56.935278 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Sep 9 06:04:56.935325 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Sep 9 06:04:56.935334 kernel: PCI: CLS 64 bytes, default 64 Sep 9 06:04:56.935340 kernel: DMAR: No ATSR found Sep 9 06:04:56.935346 kernel: DMAR: No SATC found Sep 9 06:04:56.935352 kernel: DMAR: dmar0: Using Queued invalidation Sep 9 06:04:56.935403 kernel: pci 0000:00:00.0: Adding to iommu group 0 Sep 9 06:04:56.935453 kernel: pci 0000:00:01.0: Adding to iommu group 1 Sep 9 06:04:56.935503 kernel: pci 0000:00:08.0: Adding to iommu group 2 Sep 9 06:04:56.935553 kernel: pci 0000:00:12.0: Adding to iommu group 3 Sep 9 06:04:56.935602 kernel: pci 0000:00:14.0: Adding to iommu group 4 Sep 9 06:04:56.935652 kernel: pci 0000:00:14.2: Adding to iommu group 4 Sep 9 06:04:56.935705 kernel: pci 0000:00:15.0: Adding to iommu group 5 Sep 9 06:04:56.935754 kernel: pci 0000:00:15.1: Adding to iommu group 5 Sep 9 06:04:56.935803 kernel: pci 0000:00:16.0: Adding to iommu group 6 Sep 9 
06:04:56.935855 kernel: pci 0000:00:16.1: Adding to iommu group 6 Sep 9 06:04:56.935904 kernel: pci 0000:00:16.4: Adding to iommu group 6 Sep 9 06:04:56.935953 kernel: pci 0000:00:17.0: Adding to iommu group 7 Sep 9 06:04:56.936003 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Sep 9 06:04:56.936052 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Sep 9 06:04:56.936102 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Sep 9 06:04:56.936151 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Sep 9 06:04:56.936202 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Sep 9 06:04:56.936251 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Sep 9 06:04:56.936300 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Sep 9 06:04:56.936349 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Sep 9 06:04:56.936398 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Sep 9 06:04:56.936448 kernel: pci 0000:01:00.0: Adding to iommu group 1 Sep 9 06:04:56.936499 kernel: pci 0000:01:00.1: Adding to iommu group 1 Sep 9 06:04:56.936549 kernel: pci 0000:03:00.0: Adding to iommu group 15 Sep 9 06:04:56.936602 kernel: pci 0000:04:00.0: Adding to iommu group 16 Sep 9 06:04:56.936652 kernel: pci 0000:06:00.0: Adding to iommu group 17 Sep 9 06:04:56.936708 kernel: pci 0000:07:00.0: Adding to iommu group 17 Sep 9 06:04:56.936716 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Sep 9 06:04:56.936723 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Sep 9 06:04:56.936729 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Sep 9 06:04:56.936735 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Sep 9 06:04:56.936741 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Sep 9 06:04:56.936746 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Sep 9 06:04:56.936754 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Sep 9 06:04:56.936807 kernel: platform rtc_cmos: registered platform RTC 
device (no PNP device found) Sep 9 06:04:56.936816 kernel: Initialise system trusted keyrings Sep 9 06:04:56.936822 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Sep 9 06:04:56.936828 kernel: Key type asymmetric registered Sep 9 06:04:56.936834 kernel: Asymmetric key parser 'x509' registered Sep 9 06:04:56.936839 kernel: tsc: Refined TSC clocksource calibration: 3407.999 MHz Sep 9 06:04:56.936845 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd336761, max_idle_ns: 440795243819 ns Sep 9 06:04:56.936853 kernel: clocksource: Switched to clocksource tsc Sep 9 06:04:56.936859 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 9 06:04:56.936865 kernel: io scheduler mq-deadline registered Sep 9 06:04:56.936871 kernel: io scheduler kyber registered Sep 9 06:04:56.936876 kernel: io scheduler bfq registered Sep 9 06:04:56.936925 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Sep 9 06:04:56.936976 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Sep 9 06:04:56.937026 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Sep 9 06:04:56.937076 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Sep 9 06:04:56.937128 kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Sep 9 06:04:56.937177 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Sep 9 06:04:56.937233 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Sep 9 06:04:56.937242 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Sep 9 06:04:56.937248 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. 
Sep 9 06:04:56.937254 kernel: pstore: Using crash dump compression: deflate Sep 9 06:04:56.937260 kernel: pstore: Registered erst as persistent store backend Sep 9 06:04:56.937266 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 9 06:04:56.937273 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 06:04:56.937279 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 9 06:04:56.937285 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Sep 9 06:04:56.937291 kernel: hpet_acpi_add: no address or irqs in _CRS Sep 9 06:04:56.937341 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Sep 9 06:04:56.937350 kernel: i8042: PNP: No PS/2 controller found. Sep 9 06:04:56.937394 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Sep 9 06:04:56.937440 kernel: rtc_cmos rtc_cmos: registered as rtc0 Sep 9 06:04:56.937487 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-09-09T06:04:55 UTC (1757397895) Sep 9 06:04:56.937531 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Sep 9 06:04:56.937540 kernel: intel_pstate: Intel P-state driver initializing Sep 9 06:04:56.937546 kernel: intel_pstate: Disabling energy efficiency optimization Sep 9 06:04:56.937552 kernel: intel_pstate: HWP enabled Sep 9 06:04:56.937558 kernel: NET: Registered PF_INET6 protocol family Sep 9 06:04:56.937563 kernel: Segment Routing with IPv6 Sep 9 06:04:56.937569 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 06:04:56.937576 kernel: NET: Registered PF_PACKET protocol family Sep 9 06:04:56.937582 kernel: Key type dns_resolver registered Sep 9 06:04:56.937588 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Sep 9 06:04:56.937594 kernel: microcode: Current revision: 0x000000f4 Sep 9 06:04:56.937599 kernel: IPI shorthand broadcast: enabled Sep 9 06:04:56.937605 kernel: sched_clock: Marking stable (3707336226, 1493964347)->(6802973634, -1601673061) Sep 9 06:04:56.937611 kernel: registered 
taskstats version 1 Sep 9 06:04:56.937617 kernel: Loading compiled-in X.509 certificates Sep 9 06:04:56.937623 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 884b9ad6a330f59ae6e6488b20a5491e41ff24a3' Sep 9 06:04:56.937629 kernel: Demotion targets for Node 0: null Sep 9 06:04:56.937635 kernel: Key type .fscrypt registered Sep 9 06:04:56.937641 kernel: Key type fscrypt-provisioning registered Sep 9 06:04:56.937647 kernel: ima: Allocated hash algorithm: sha1 Sep 9 06:04:56.937652 kernel: ima: No architecture policies found Sep 9 06:04:56.937658 kernel: clk: Disabling unused clocks Sep 9 06:04:56.937664 kernel: Warning: unable to open an initial console. Sep 9 06:04:56.937681 kernel: Freeing unused kernel image (initmem) memory: 54076K Sep 9 06:04:56.937687 kernel: Write protecting the kernel read-only data: 24576k Sep 9 06:04:56.937695 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 9 06:04:56.937700 kernel: Run /init as init process Sep 9 06:04:56.937706 kernel: with arguments: Sep 9 06:04:56.937712 kernel: /init Sep 9 06:04:56.937718 kernel: with environment: Sep 9 06:04:56.937723 kernel: HOME=/ Sep 9 06:04:56.937729 kernel: TERM=linux Sep 9 06:04:56.937735 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 06:04:56.937741 systemd[1]: Successfully made /usr/ read-only. Sep 9 06:04:56.937750 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 06:04:56.937757 systemd[1]: Detected architecture x86-64. Sep 9 06:04:56.937763 systemd[1]: Running in initrd. Sep 9 06:04:56.937769 systemd[1]: No hostname configured, using default hostname. Sep 9 06:04:56.937775 systemd[1]: Hostname set to . 
Sep 9 06:04:56.937781 systemd[1]: Initializing machine ID from random generator. Sep 9 06:04:56.937787 systemd[1]: Queued start job for default target initrd.target. Sep 9 06:04:56.937794 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 06:04:56.937800 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 06:04:56.937807 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 9 06:04:56.937813 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 06:04:56.937819 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 06:04:56.937825 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 06:04:56.937833 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 06:04:56.937840 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 06:04:56.937846 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 06:04:56.937852 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 06:04:56.937858 systemd[1]: Reached target paths.target - Path Units. Sep 9 06:04:56.937865 systemd[1]: Reached target slices.target - Slice Units. Sep 9 06:04:56.937871 systemd[1]: Reached target swap.target - Swaps. Sep 9 06:04:56.937877 systemd[1]: Reached target timers.target - Timer Units. Sep 9 06:04:56.937883 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 06:04:56.937890 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 06:04:56.937896 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). 
Sep 9 06:04:56.937902 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 9 06:04:56.937908 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 06:04:56.937914 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 06:04:56.937921 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 06:04:56.937927 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 06:04:56.937933 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 9 06:04:56.937939 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 06:04:56.937946 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 06:04:56.937952 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 06:04:56.937958 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 06:04:56.937964 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 06:04:56.937971 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 06:04:56.937989 systemd-journald[298]: Collecting audit messages is disabled. Sep 9 06:04:56.938005 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 06:04:56.938012 systemd-journald[298]: Journal started Sep 9 06:04:56.938025 systemd-journald[298]: Runtime Journal (/run/log/journal/85cc6afbef5d48d29eddacce9e6e1bb3) is 8M, max 640.1M, 632.1M free. Sep 9 06:04:56.932285 systemd-modules-load[301]: Inserted module 'overlay' Sep 9 06:04:56.950300 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 06:04:57.004891 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 06:04:57.004907 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. 
Update your scripts to load br_netfilter if you need this. Sep 9 06:04:57.004916 kernel: Bridge firewalling registered Sep 9 06:04:56.955276 systemd-modules-load[301]: Inserted module 'br_netfilter' Sep 9 06:04:56.987326 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 06:04:57.015122 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 06:04:57.039022 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 06:04:57.045146 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 06:04:57.065006 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 06:04:57.101369 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 06:04:57.114458 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 06:04:57.119970 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 06:04:57.126079 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 06:04:57.126387 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 06:04:57.127090 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 06:04:57.127809 systemd-tmpfiles[318]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 06:04:57.129349 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 06:04:57.130283 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 06:04:57.131813 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 06:04:57.142911 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Sep 9 06:04:57.149452 systemd-resolved[336]: Positive Trust Anchors: Sep 9 06:04:57.149457 systemd-resolved[336]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 06:04:57.149480 systemd-resolved[336]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 06:04:57.150965 systemd-resolved[336]: Defaulting to hostname 'linux'. Sep 9 06:04:57.172931 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 06:04:57.189024 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 06:04:57.201652 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 06:04:57.347553 dracut-cmdline[340]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104 Sep 9 06:04:57.553715 kernel: SCSI subsystem initialized Sep 9 06:04:57.566674 kernel: Loading iSCSI transport class v2.0-870. 
Sep 9 06:04:57.578686 kernel: iscsi: registered transport (tcp) Sep 9 06:04:57.602410 kernel: iscsi: registered transport (qla4xxx) Sep 9 06:04:57.602430 kernel: QLogic iSCSI HBA Driver Sep 9 06:04:57.613017 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 06:04:57.657582 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 06:04:57.672170 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 06:04:57.797449 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 06:04:57.809132 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 06:04:57.883727 kernel: raid6: avx2x4 gen() 24890 MB/s Sep 9 06:04:57.904727 kernel: raid6: avx2x2 gen() 39782 MB/s Sep 9 06:04:57.930755 kernel: raid6: avx2x1 gen() 45834 MB/s Sep 9 06:04:57.930771 kernel: raid6: using algorithm avx2x1 gen() 45834 MB/s Sep 9 06:04:57.957873 kernel: raid6: .... xor() 24550 MB/s, rmw enabled Sep 9 06:04:57.957889 kernel: raid6: using avx2x2 recovery algorithm Sep 9 06:04:57.978700 kernel: xor: automatically using best checksumming function avx Sep 9 06:04:58.081714 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 06:04:58.084603 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 06:04:58.094819 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 06:04:58.144587 systemd-udevd[552]: Using default interface naming scheme 'v255'. Sep 9 06:04:58.147738 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 06:04:58.155466 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 06:04:58.218500 dracut-pre-trigger[564]: rd.md=0: removing MD RAID activation Sep 9 06:04:58.275776 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 9 06:04:58.278844 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 06:04:58.413606 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 06:04:58.429683 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 06:04:58.440160 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 06:04:58.498395 kernel: ACPI: bus type USB registered Sep 9 06:04:58.498415 kernel: usbcore: registered new interface driver usbfs Sep 9 06:04:58.498423 kernel: usbcore: registered new interface driver hub Sep 9 06:04:58.498429 kernel: usbcore: registered new device driver usb Sep 9 06:04:58.498436 kernel: AES CTR mode by8 optimization enabled Sep 9 06:04:58.498443 kernel: pps_core: LinuxPPS API ver. 1 registered Sep 9 06:04:58.498450 kernel: libata version 3.00 loaded. Sep 9 06:04:58.498457 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Sep 9 06:04:58.498464 kernel: PTP clock support registered Sep 9 06:04:58.444759 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 06:04:58.444889 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 9 06:04:58.643189 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 9 06:04:58.643281 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Sep 9 06:04:58.643346 kernel: ahci 0000:00:17.0: version 3.0 Sep 9 06:04:58.643416 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Sep 9 06:04:58.643479 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Sep 9 06:04:58.643566 kernel: ahci 0000:00:17.0: 7/7 ports implemented (port mask 0x7f) Sep 9 06:04:58.643651 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Sep 9 06:04:58.643719 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Sep 9 06:04:58.643780 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Sep 9 06:04:58.643839 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Sep 9 06:04:58.643897 kernel: scsi host0: ahci Sep 9 06:04:58.643961 kernel: hub 1-0:1.0: USB hub found Sep 9 06:04:58.644032 kernel: scsi host1: ahci Sep 9 06:04:58.644090 kernel: hub 1-0:1.0: 16 ports detected Sep 9 06:04:58.644156 kernel: scsi host2: ahci Sep 9 06:04:58.644215 kernel: hub 2-0:1.0: USB hub found Sep 9 06:04:58.644282 kernel: scsi host3: ahci Sep 9 06:04:58.644344 kernel: hub 2-0:1.0: 10 ports detected Sep 9 06:04:58.644408 kernel: scsi host4: ahci Sep 9 06:04:58.644465 kernel: scsi host5: ahci Sep 9 06:04:58.644521 kernel: scsi host6: ahci Sep 9 06:04:58.644575 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 lpm-pol 0 Sep 9 06:04:58.644583 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 lpm-pol 0 Sep 9 06:04:58.644590 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 lpm-pol 0 Sep 9 06:04:58.644598 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 lpm-pol 0 Sep 9 06:04:58.644605 kernel: ata5: SATA max UDMA/133 abar 
m2048@0x95516000 port 0x95516300 irq 127 lpm-pol 0 Sep 9 06:04:58.644612 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 lpm-pol 0 Sep 9 06:04:58.644618 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 lpm-pol 0 Sep 9 06:04:58.678407 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 06:04:58.700742 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Sep 9 06:04:58.700752 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Sep 9 06:04:58.719133 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 06:04:58.767919 kernel: igb 0000:03:00.0: added PHC on eth0 Sep 9 06:04:58.768012 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 9 06:04:58.768081 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:b6 Sep 9 06:04:58.768147 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Sep 9 06:04:58.768214 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 9 06:04:58.719385 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 06:04:58.794738 kernel: igb 0000:04:00.0: added PHC on eth1 Sep 9 06:04:58.794832 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Sep 9 06:04:58.794901 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 00:25:90:bd:75:b7 Sep 9 06:04:58.794965 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Sep 9 06:04:58.795028 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Sep 9 06:04:58.845373 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 9 06:04:58.863930 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Sep 9 06:04:58.956700 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 9 06:04:58.956748 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 9 06:04:58.962675 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 9 06:04:58.968699 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 9 06:04:58.974706 kernel: ata7: SATA link down (SStatus 0 SControl 300) Sep 9 06:04:58.980708 kernel: hub 1-14:1.0: USB hub found Sep 9 06:04:58.980815 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 9 06:04:58.983716 kernel: hub 1-14:1.0: 4 ports detected Sep 9 06:04:58.989702 kernel: ata2.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Sep 9 06:04:59.011262 kernel: ata2.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Sep 9 06:04:59.011719 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Sep 9 06:04:59.027778 kernel: ata1.00: Model 'Micron_5200_MTFDDAK480TDN', rev ' D1MU020', applying quirks: zeroaftertrim Sep 9 06:04:59.035337 kernel: ata1.00: ATA-10: Micron_5200_MTFDDAK480TDN, D1MU020, max UDMA/133 Sep 9 06:04:59.048905 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 9 06:04:59.048921 kernel: ata2.00: Features: NCQ-prio Sep 9 06:04:59.062335 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Sep 9 06:04:59.062351 kernel: ata1.00: Features: NCQ-prio Sep 9 06:04:59.071866 kernel: ata2.00: configured for UDMA/133 Sep 9 06:04:59.077703 kernel: ata1.00: configured for UDMA/133 Sep 9 06:04:59.077720 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Sep 9 06:04:59.086727 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5200_MTFD U020 PQ: 0 ANSI: 5 Sep 9 06:04:59.101676 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Sep 9 06:04:59.101822 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Sep 9 06:04:59.107679 kernel: ata1.00: 
Enabling discard_zeroes_data Sep 9 06:04:59.112122 kernel: ata2.00: Enabling discard_zeroes_data Sep 9 06:04:59.112146 kernel: sd 0:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 9 06:04:59.131817 kernel: sd 1:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Sep 9 06:04:59.131941 kernel: sd 0:0:0:0: [sdb] 4096-byte physical blocks Sep 9 06:04:59.132038 kernel: sd 0:0:0:0: [sdb] Write Protect is off Sep 9 06:04:59.132135 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Sep 9 06:04:59.137056 kernel: sd 0:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Sep 9 06:04:59.147075 kernel: sd 1:0:0:0: [sda] Write Protect is off Sep 9 06:04:59.147177 kernel: sd 0:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 9 06:04:59.152209 kernel: sd 1:0:0:0: [sda] Mode Sense: 00 3a 00 00 Sep 9 06:04:59.152293 kernel: sd 0:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Sep 9 06:04:59.161236 kernel: sd 1:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 9 06:04:59.167820 kernel: ata1.00: Enabling discard_zeroes_data Sep 9 06:04:59.184166 kernel: sd 1:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Sep 9 06:04:59.195932 kernel: ata2.00: Enabling discard_zeroes_data Sep 9 06:04:59.217358 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 06:04:59.217377 kernel: GPT:9289727 != 937703087 Sep 9 06:04:59.223639 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 06:04:59.227496 kernel: GPT:9289727 != 937703087 Sep 9 06:04:59.232916 kernel: GPT: Use GNU Parted to correct GPT errors. 
Sep 9 06:04:59.238158 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 9 06:04:59.243191 kernel: sd 0:0:0:0: [sdb] Attached SCSI disk Sep 9 06:04:59.274717 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Sep 9 06:04:59.274805 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Sep 9 06:04:59.335712 kernel: mlx5_core 0000:01:00.0: PTM is not supported by PCIe Sep 9 06:04:59.335825 kernel: mlx5_core 0000:01:00.0: firmware version: 14.31.1014 Sep 9 06:04:59.344829 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 9 06:04:59.358873 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5200_MTFDDAK480TDN EFI-SYSTEM. Sep 9 06:04:59.375834 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5200_MTFDDAK480TDN ROOT. Sep 9 06:04:59.425728 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 06:04:59.425748 kernel: usbcore: registered new interface driver usbhid Sep 9 06:04:59.425760 kernel: usbhid: USB HID core driver Sep 9 06:04:59.425771 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Sep 9 06:04:59.390938 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5200_MTFDDAK480TDN USR-A. Sep 9 06:04:59.435731 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5200_MTFDDAK480TDN USR-A. 
Sep 9 06:04:59.505717 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Sep 9 06:04:59.505845 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Sep 9 06:04:59.505862 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Sep 9 06:04:59.466544 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM. Sep 9 06:04:59.516288 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 06:04:59.556637 disk-uuid[763]: Primary Header is updated. Sep 9 06:04:59.556637 disk-uuid[763]: Secondary Entries is updated. Sep 9 06:04:59.556637 disk-uuid[763]: Secondary Header is updated. Sep 9 06:04:59.585737 kernel: ata1.00: Enabling discard_zeroes_data Sep 9 06:04:59.585753 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 9 06:04:59.625719 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Sep 9 06:04:59.636543 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged Sep 9 06:04:59.898753 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 9 06:04:59.917285 kernel: mlx5_core 0000:01:00.1: PTM is not supported by PCIe Sep 9 06:04:59.917730 kernel: mlx5_core 0000:01:00.1: firmware version: 14.31.1014 Sep 9 06:04:59.918055 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Sep 9 06:05:00.235710 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(128) max mc(2048) Sep 9 06:05:00.248632 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Sep 9 06:05:00.543755 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Sep 9 06:05:00.557727 kernel: mlx5_core 0000:01:00.0 
enp1s0f0np0: renamed from eth0 Sep 9 06:05:00.557845 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Sep 9 06:05:00.582877 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 06:05:00.606852 kernel: ata1.00: Enabling discard_zeroes_data Sep 9 06:05:00.606866 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Sep 9 06:05:00.606874 disk-uuid[764]: The operation has completed successfully. Sep 9 06:05:00.586538 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 06:05:00.616787 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 06:05:00.650875 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 06:05:00.660565 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 06:05:00.669160 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 06:05:00.669251 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 06:05:00.732358 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 06:05:00.764300 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 06:05:00.790754 sh[833]: Success Sep 9 06:05:00.818793 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 06:05:00.818814 kernel: device-mapper: uevent: version 1.0.3 Sep 9 06:05:00.828011 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 06:05:00.840724 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 9 06:05:00.887787 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 06:05:00.905923 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 06:05:00.929490 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 9 06:05:00.977774 kernel: BTRFS: device fsid 9ca60a92-6b53-4529-adc0-1f4392d2ad56 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (846) Sep 9 06:05:00.977787 kernel: BTRFS info (device dm-0): first mount of filesystem 9ca60a92-6b53-4529-adc0-1f4392d2ad56 Sep 9 06:05:00.977794 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 06:05:00.996619 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 9 06:05:00.996634 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 06:05:01.002737 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 06:05:01.004906 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 06:05:01.012038 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 06:05:01.037932 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 06:05:01.038472 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 06:05:01.054537 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 06:05:01.098675 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sdb6 (8:22) scanned by mount (869) Sep 9 06:05:01.117257 kernel: BTRFS info (device sdb6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 06:05:01.117278 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 9 06:05:01.131873 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 9 06:05:01.131891 kernel: BTRFS info (device sdb6): turning on async discard Sep 9 06:05:01.137561 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Sep 9 06:05:01.167971 kernel: BTRFS info (device sdb6): enabling free space tree Sep 9 06:05:01.168019 kernel: BTRFS info (device sdb6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 06:05:01.158373 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 06:05:01.179952 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 06:05:01.199622 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 06:05:01.241643 systemd-networkd[1016]: lo: Link UP Sep 9 06:05:01.241646 systemd-networkd[1016]: lo: Gained carrier Sep 9 06:05:01.244026 systemd-networkd[1016]: Enumeration completed Sep 9 06:05:01.244067 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 06:05:01.244630 systemd-networkd[1016]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 06:05:01.256746 systemd[1]: Reached target network.target - Network. Sep 9 06:05:01.272524 systemd-networkd[1016]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 06:05:01.302433 ignition[1015]: Ignition 2.22.0 Sep 9 06:05:01.300423 systemd-networkd[1016]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 06:05:01.302437 ignition[1015]: Stage: fetch-offline Sep 9 06:05:01.305207 unknown[1015]: fetched base config from "system" Sep 9 06:05:01.302458 ignition[1015]: no configs at "/usr/lib/ignition/base.d" Sep 9 06:05:01.305211 unknown[1015]: fetched user config from "system" Sep 9 06:05:01.302463 ignition[1015]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 9 06:05:01.306373 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 06:05:01.302508 ignition[1015]: parsed url from cmdline: "" Sep 9 06:05:01.324090 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). 
Sep 9 06:05:01.302510 ignition[1015]: no config URL provided Sep 9 06:05:01.324604 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 06:05:01.302514 ignition[1015]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 06:05:01.302537 ignition[1015]: parsing config with SHA512: 74425bd693af0d741feac6285dded8374285e8f88e6571c8b8461d1350b4f8d04ee6391361043bffc8459efc7931dcbeea02fb4ceaa10c13d124b967a3877a0e Sep 9 06:05:01.305410 ignition[1015]: fetch-offline: fetch-offline passed Sep 9 06:05:01.305413 ignition[1015]: POST message to Packet Timeline Sep 9 06:05:01.305415 ignition[1015]: POST Status error: resource requires networking Sep 9 06:05:01.305445 ignition[1015]: Ignition finished successfully Sep 9 06:05:01.377764 ignition[1031]: Ignition 2.22.0 Sep 9 06:05:01.377773 ignition[1031]: Stage: kargs Sep 9 06:05:01.377934 ignition[1031]: no configs at "/usr/lib/ignition/base.d" Sep 9 06:05:01.493879 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 9 06:05:01.377947 ignition[1031]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 9 06:05:01.379031 ignition[1031]: kargs: kargs passed Sep 9 06:05:01.496365 systemd-networkd[1016]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 9 06:05:01.379037 ignition[1031]: POST message to Packet Timeline Sep 9 06:05:01.379054 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #1 Sep 9 06:05:01.379850 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:52204->[::1]:53: read: connection refused Sep 9 06:05:01.580289 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #2 Sep 9 06:05:01.583057 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:57167->[::1]:53: read: connection refused Sep 9 06:05:01.753791 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 9 06:05:01.758530 systemd-networkd[1016]: eno1: Link UP Sep 9 06:05:01.758955 systemd-networkd[1016]: eno2: Link UP Sep 9 06:05:01.759318 systemd-networkd[1016]: enp1s0f0np0: Link UP Sep 9 06:05:01.759770 systemd-networkd[1016]: enp1s0f0np0: Gained carrier Sep 9 06:05:01.768191 systemd-networkd[1016]: enp1s0f1np1: Link UP Sep 9 06:05:01.769302 systemd-networkd[1016]: enp1s0f1np1: Gained carrier Sep 9 06:05:01.808835 systemd-networkd[1016]: enp1s0f0np0: DHCPv4 address 139.178.90.255/31, gateway 139.178.90.254 acquired from 145.40.83.140 Sep 9 06:05:01.983569 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #3 Sep 9 06:05:01.984948 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:48999->[::1]:53: read: connection refused Sep 9 06:05:02.786162 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #4 Sep 9 06:05:02.787336 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:57775->[::1]:53: read: connection refused Sep 9 06:05:03.464311 systemd-networkd[1016]: enp1s0f0np0: Gained IPv6LL Sep 9 06:05:03.528322 systemd-networkd[1016]: enp1s0f1np1: Gained IPv6LL 
Sep 9 06:05:04.389042 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #5 Sep 9 06:05:04.390229 ignition[1031]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:60988->[::1]:53: read: connection refused Sep 9 06:05:07.592968 ignition[1031]: GET https://metadata.packet.net/metadata: attempt #6 Sep 9 06:05:08.643789 ignition[1031]: GET result: OK Sep 9 06:05:09.039571 ignition[1031]: Ignition finished successfully Sep 9 06:05:09.045598 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 06:05:09.057644 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 06:05:09.103913 ignition[1049]: Ignition 2.22.0 Sep 9 06:05:09.103919 ignition[1049]: Stage: disks Sep 9 06:05:09.103996 ignition[1049]: no configs at "/usr/lib/ignition/base.d" Sep 9 06:05:09.104001 ignition[1049]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 9 06:05:09.104417 ignition[1049]: disks: disks passed Sep 9 06:05:09.104420 ignition[1049]: POST message to Packet Timeline Sep 9 06:05:09.104428 ignition[1049]: GET https://metadata.packet.net/metadata: attempt #1 Sep 9 06:05:10.114462 ignition[1049]: GET result: OK Sep 9 06:05:10.553171 ignition[1049]: Ignition finished successfully Sep 9 06:05:10.558936 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 06:05:10.569476 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 06:05:10.586934 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 06:05:10.605942 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 9 06:05:10.626967 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 06:05:10.645959 systemd[1]: Reached target basic.target - Basic System. Sep 9 06:05:10.665447 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Sep 9 06:05:10.716850 systemd-fsck[1068]: ROOT: clean, 15/553520 files, 52789/553472 blocks Sep 9 06:05:10.726083 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 06:05:10.740008 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 06:05:10.848717 kernel: EXT4-fs (sdb9): mounted filesystem d2d7815e-fa16-4396-ab9d-ac540c1d8856 r/w with ordered data mode. Quota mode: none. Sep 9 06:05:10.848699 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 9 06:05:10.857080 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 06:05:10.884240 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 06:05:10.892708 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 06:05:10.917293 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 9 06:05:10.928327 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... Sep 9 06:05:11.001889 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1077) Sep 9 06:05:11.001904 kernel: BTRFS info (device sdb6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 06:05:11.001911 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 9 06:05:11.001921 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 9 06:05:11.001929 kernel: BTRFS info (device sdb6): turning on async discard Sep 9 06:05:11.001936 kernel: BTRFS info (device sdb6): enabling free space tree Sep 9 06:05:10.991884 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 06:05:10.991904 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. 
Sep 9 06:05:11.037734 coreos-metadata[1079]: Sep 09 06:05:11.037 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 9 06:05:11.012665 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 06:05:11.044844 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 06:05:11.083747 coreos-metadata[1080]: Sep 09 06:05:11.054 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Sep 9 06:05:11.063844 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 9 06:05:11.115190 initrd-setup-root[1109]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 06:05:11.123810 initrd-setup-root[1116]: cut: /sysroot/etc/group: No such file or directory Sep 9 06:05:11.133743 initrd-setup-root[1123]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 06:05:11.143747 initrd-setup-root[1130]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 06:05:11.183200 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 06:05:11.192803 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 06:05:11.215096 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 06:05:11.232875 kernel: BTRFS info (device sdb6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 06:05:11.234501 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 06:05:11.266990 ignition[1197]: INFO : Ignition 2.22.0 Sep 9 06:05:11.266990 ignition[1197]: INFO : Stage: mount Sep 9 06:05:11.279795 ignition[1197]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 06:05:11.279795 ignition[1197]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Sep 9 06:05:11.279795 ignition[1197]: INFO : mount: mount passed Sep 9 06:05:11.279795 ignition[1197]: INFO : POST message to Packet Timeline Sep 9 06:05:11.279795 ignition[1197]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Sep 9 06:05:11.274184 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 9 06:05:12.118986 coreos-metadata[1079]: Sep 09 06:05:12.118 INFO Fetch successful Sep 9 06:05:12.127963 coreos-metadata[1080]: Sep 09 06:05:12.119 INFO Fetch successful Sep 9 06:05:12.157397 coreos-metadata[1079]: Sep 09 06:05:12.157 INFO wrote hostname ci-4452.0.0-n-7ab43648c0 to /sysroot/etc/hostname Sep 9 06:05:12.158652 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 9 06:05:12.181937 systemd[1]: flatcar-static-network.service: Deactivated successfully. Sep 9 06:05:12.210890 ignition[1197]: INFO : GET result: OK Sep 9 06:05:12.181984 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Sep 9 06:05:12.573485 ignition[1197]: INFO : Ignition finished successfully Sep 9 06:05:12.577497 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 06:05:12.593969 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 06:05:12.626019 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 06:05:12.676703 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1222) Sep 9 06:05:12.694245 kernel: BTRFS info (device sdb6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 06:05:12.694261 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Sep 9 06:05:12.709466 kernel: BTRFS info (device sdb6): enabling ssd optimizations Sep 9 06:05:12.709482 kernel: BTRFS info (device sdb6): turning on async discard Sep 9 06:05:12.715583 kernel: BTRFS info (device sdb6): enabling free space tree Sep 9 06:05:12.717268 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 9 06:05:12.752766 ignition[1239]: INFO : Ignition 2.22.0
Sep 9 06:05:12.752766 ignition[1239]: INFO : Stage: files
Sep 9 06:05:12.764853 ignition[1239]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 06:05:12.764853 ignition[1239]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 9 06:05:12.764853 ignition[1239]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 06:05:12.764853 ignition[1239]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 06:05:12.764853 ignition[1239]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 06:05:12.764853 ignition[1239]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 06:05:12.764853 ignition[1239]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 06:05:12.764853 ignition[1239]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 06:05:12.764853 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 9 06:05:12.764853 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Sep 9 06:05:12.756438 unknown[1239]: wrote ssh authorized keys file for user: core
Sep 9 06:05:12.943705 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 06:05:13.094108 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 06:05:13.110929 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Sep 9 06:05:13.635976 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 06:05:13.990193 ignition[1239]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 06:05:13.990193 ignition[1239]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 06:05:14.018917 ignition[1239]: INFO : files: files passed
Sep 9 06:05:14.018917 ignition[1239]: INFO : POST message to Packet Timeline
Sep 9 06:05:14.018917 ignition[1239]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Sep 9 06:05:14.872069 ignition[1239]: INFO : GET result: OK
Sep 9 06:05:15.281629 ignition[1239]: INFO : Ignition finished successfully
Sep 9 06:05:15.286330 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 06:05:15.301633 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 06:05:15.310435 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 06:05:15.343543 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 06:05:15.343685 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 06:05:15.368197 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 06:05:15.382176 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 06:05:15.402962 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 06:05:15.429921 initrd-setup-root-after-ignition[1279]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 06:05:15.429921 initrd-setup-root-after-ignition[1279]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 06:05:15.443968 initrd-setup-root-after-ignition[1283]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 06:05:15.480515 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 06:05:15.480571 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 06:05:15.497962 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 06:05:15.516906 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 06:05:15.527001 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 06:05:15.528285 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 06:05:15.614434 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 06:05:15.628831 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 06:05:15.693352 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 06:05:15.704387 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 06:05:15.723369 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 06:05:15.740450 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 06:05:15.740888 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 06:05:15.776121 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 06:05:15.785274 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 06:05:15.802279 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 06:05:15.819265 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 06:05:15.838258 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 06:05:15.858271 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 06:05:15.878291 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 06:05:15.896391 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 06:05:15.915437 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 06:05:15.934288 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 06:05:15.952384 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 06:05:15.969295 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 06:05:15.969719 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 06:05:15.993561 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 06:05:16.011423 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 06:05:16.031280 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 06:05:16.031715 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 06:05:16.052297 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 06:05:16.052712 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 06:05:16.090850 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 06:05:16.090965 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 06:05:16.099926 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 06:05:16.124949 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 06:05:16.125187 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 06:05:16.144243 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 06:05:16.160337 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 06:05:16.178283 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 06:05:16.178568 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 06:05:16.199352 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 06:05:16.199637 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 06:05:16.215368 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 06:05:16.215778 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 06:05:16.223478 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 06:05:16.223873 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 06:05:16.250471 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 9 06:05:16.359860 ignition[1304]: INFO : Ignition 2.22.0
Sep 9 06:05:16.359860 ignition[1304]: INFO : Stage: umount
Sep 9 06:05:16.359860 ignition[1304]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 06:05:16.359860 ignition[1304]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet"
Sep 9 06:05:16.359860 ignition[1304]: INFO : umount: umount passed
Sep 9 06:05:16.359860 ignition[1304]: INFO : POST message to Packet Timeline
Sep 9 06:05:16.359860 ignition[1304]: INFO : GET https://metadata.packet.net/metadata: attempt #1
Sep 9 06:05:16.250886 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 9 06:05:16.269825 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 06:05:16.282361 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 06:05:16.291050 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 06:05:16.291134 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 06:05:16.333836 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 06:05:16.333910 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 06:05:16.361679 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 06:05:16.362303 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 06:05:16.362365 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 06:05:16.380845 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 06:05:16.380929 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 06:05:17.395966 ignition[1304]: INFO : GET result: OK
Sep 9 06:05:17.785203 ignition[1304]: INFO : Ignition finished successfully
Sep 9 06:05:17.789345 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 06:05:17.789657 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 06:05:17.803834 systemd[1]: Stopped target network.target - Network.
Sep 9 06:05:17.817937 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 06:05:17.818106 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 06:05:17.836063 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 06:05:17.836226 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 06:05:17.852201 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 06:05:17.852382 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 06:05:17.869090 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 06:05:17.869255 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 06:05:17.887072 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 06:05:17.887258 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 06:05:17.905462 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 06:05:17.924143 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 06:05:17.940764 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 06:05:17.941053 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 06:05:17.962384 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 06:05:17.962512 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 06:05:17.962560 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 06:05:17.984696 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 06:05:17.985129 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 06:05:18.012021 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 06:05:18.012089 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 06:05:18.033066 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 06:05:18.047856 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 06:05:18.047889 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 06:05:18.048026 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 06:05:18.048051 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 06:05:18.065041 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 06:05:18.065076 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 06:05:18.091013 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 06:05:18.091107 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 06:05:18.112427 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 06:05:18.134494 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 06:05:18.134712 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 06:05:18.135729 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 06:05:18.136097 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 06:05:18.153446 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 06:05:18.153592 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 06:05:18.169061 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 06:05:18.169169 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 06:05:18.178200 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 06:05:18.178375 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 06:05:18.213172 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 06:05:18.213331 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 06:05:18.250967 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 06:05:18.251154 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 06:05:18.282027 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 06:05:18.305737 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 06:05:18.305770 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 06:05:18.315865 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 06:05:18.315900 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 06:05:18.325009 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 06:05:18.619813 systemd-journald[298]: Received SIGTERM from PID 1 (systemd).
Sep 9 06:05:18.325055 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 06:05:18.355144 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 06:05:18.355267 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 06:05:18.374964 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 06:05:18.375107 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 06:05:18.398390 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 06:05:18.398544 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 9 06:05:18.398661 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 06:05:18.398809 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 06:05:18.400021 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 06:05:18.400259 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 06:05:18.458102 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 06:05:18.458387 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 06:05:18.471407 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 06:05:18.491133 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 06:05:18.548344 systemd[1]: Switching root.
Sep 9 06:05:18.751834 systemd-journald[298]: Journal stopped
Sep 9 06:05:20.499405 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 06:05:20.499420 kernel: SELinux: policy capability open_perms=1
Sep 9 06:05:20.499428 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 06:05:20.499434 kernel: SELinux: policy capability always_check_network=0
Sep 9 06:05:20.499438 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 06:05:20.499443 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 06:05:20.499450 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 06:05:20.499455 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 06:05:20.499460 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 06:05:20.499467 kernel: audit: type=1403 audit(1757397918.877:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 06:05:20.499473 systemd[1]: Successfully loaded SELinux policy in 96.247ms.
Sep 9 06:05:20.499480 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.650ms.
Sep 9 06:05:20.499487 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 06:05:20.499492 systemd[1]: Detected architecture x86-64.
Sep 9 06:05:20.499500 systemd[1]: Detected first boot.
Sep 9 06:05:20.499506 systemd[1]: Hostname set to .
Sep 9 06:05:20.499512 systemd[1]: Initializing machine ID from random generator.
Sep 9 06:05:20.499518 zram_generator::config[1357]: No configuration found.
Sep 9 06:05:20.499524 systemd[1]: Populated /etc with preset unit settings.
Sep 9 06:05:20.499531 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 06:05:20.499537 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 06:05:20.499543 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 06:05:20.499549 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 06:05:20.499555 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 06:05:20.499561 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 06:05:20.499567 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 06:05:20.499573 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 06:05:20.499580 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 06:05:20.499586 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 06:05:20.499593 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 06:05:20.499599 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 06:05:20.499605 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 06:05:20.499611 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 06:05:20.499617 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 06:05:20.499623 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 06:05:20.499629 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 06:05:20.499637 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 06:05:20.499643 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1...
Sep 9 06:05:20.499649 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 06:05:20.499655 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 06:05:20.499662 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 06:05:20.499672 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 06:05:20.499679 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 06:05:20.499687 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 06:05:20.499693 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 06:05:20.499699 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 06:05:20.499705 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 06:05:20.499738 systemd[1]: Reached target swap.target - Swaps.
Sep 9 06:05:20.499759 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 06:05:20.499766 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 06:05:20.499772 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 06:05:20.499780 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 06:05:20.499787 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 06:05:20.499793 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 06:05:20.499799 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 06:05:20.499806 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 06:05:20.499813 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 06:05:20.499819 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 06:05:20.499826 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 06:05:20.499832 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 06:05:20.499838 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 06:05:20.499845 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 06:05:20.499851 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 06:05:20.499858 systemd[1]: Reached target machines.target - Containers.
Sep 9 06:05:20.499865 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 06:05:20.499872 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 06:05:20.499878 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 06:05:20.499885 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 06:05:20.499891 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 06:05:20.499897 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 06:05:20.499904 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 06:05:20.499910 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 06:05:20.499916 kernel: ACPI: bus type drm_connector registered
Sep 9 06:05:20.499923 kernel: fuse: init (API version 7.41)
Sep 9 06:05:20.499929 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 06:05:20.499935 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 06:05:20.499942 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 06:05:20.499948 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 06:05:20.499955 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 06:05:20.499961 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 06:05:20.499967 kernel: loop: module loaded
Sep 9 06:05:20.499974 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 06:05:20.499981 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 06:05:20.499988 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 06:05:20.499994 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 06:05:20.500010 systemd-journald[1460]: Collecting audit messages is disabled.
Sep 9 06:05:20.500026 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 06:05:20.500034 systemd-journald[1460]: Journal started
Sep 9 06:05:20.500048 systemd-journald[1460]: Runtime Journal (/run/log/journal/94457052d4ef43399028e44e0e338423) is 8M, max 640.1M, 632.1M free.
Sep 9 06:05:19.352836 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 06:05:19.364624 systemd[1]: Unnecessary job was removed for dev-sdb6.device - /dev/sdb6.
Sep 9 06:05:19.364923 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 06:05:20.528719 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 06:05:20.549737 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 06:05:20.570854 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 06:05:20.570876 systemd[1]: Stopped verity-setup.service.
Sep 9 06:05:20.595718 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 06:05:20.603717 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 06:05:20.613175 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 06:05:20.621969 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 06:05:20.631943 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 06:05:20.641944 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 06:05:20.651934 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 06:05:20.660933 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 06:05:20.671034 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 06:05:20.681060 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 06:05:20.692122 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 06:05:20.692328 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 06:05:20.703229 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 06:05:20.703501 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 06:05:20.715544 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 06:05:20.716000 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 06:05:20.725573 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 06:05:20.726054 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 06:05:20.736526 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 06:05:20.736979 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 06:05:20.746551 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 06:05:20.747020 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 06:05:20.756630 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 06:05:20.767595 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 06:05:20.779722 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 06:05:20.790713 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 06:05:20.802738 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 06:05:20.820720 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 06:05:20.830566 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 06:05:20.862125 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 06:05:20.871892 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 06:05:20.871941 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 06:05:20.883799 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 06:05:20.896882 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 06:05:20.906169 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 06:05:20.919728 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 06:05:20.949905 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 06:05:20.959763 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 06:05:20.965914 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 06:05:20.970612 systemd-journald[1460]: Time spent on flushing to /var/log/journal/94457052d4ef43399028e44e0e338423 is 12.079ms for 1393 entries.
Sep 9 06:05:20.970612 systemd-journald[1460]: System Journal (/var/log/journal/94457052d4ef43399028e44e0e338423) is 8M, max 195.6M, 187.6M free.
Sep 9 06:05:20.993976 systemd-journald[1460]: Received client request to flush runtime journal.
Sep 9 06:05:20.982766 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 06:05:20.996052 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 06:05:21.020182 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 06:05:21.030311 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 06:05:21.040676 kernel: loop0: detected capacity change from 0 to 224512
Sep 9 06:05:21.046783 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 06:05:21.056827 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 06:05:21.070013 systemd-tmpfiles[1500]: ACLs are not supported, ignoring.
Sep 9 06:05:21.070023 systemd-tmpfiles[1500]: ACLs are not supported, ignoring.
Sep 9 06:05:21.071673 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 06:05:21.073479 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 06:05:21.083908 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 06:05:21.093903 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 06:05:21.103917 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 06:05:21.118727 kernel: loop1: detected capacity change from 0 to 128016
Sep 9 06:05:21.121229 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 06:05:21.131446 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 06:05:21.151942 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 06:05:21.168699 kernel: loop2: detected capacity change from 0 to 8
Sep 9 06:05:21.170268 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 06:05:21.170678 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 06:05:21.187081 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 06:05:21.196520 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 06:05:21.211764 kernel: loop3: detected capacity change from 0 to 110984
Sep 9 06:05:21.229512 systemd-tmpfiles[1518]: ACLs are not supported, ignoring.
Sep 9 06:05:21.229521 systemd-tmpfiles[1518]: ACLs are not supported, ignoring.
Sep 9 06:05:21.230996 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 06:05:21.267716 kernel: loop4: detected capacity change from 0 to 224512
Sep 9 06:05:21.294712 kernel: loop5: detected capacity change from 0 to 128016
Sep 9 06:05:21.323290 kernel: loop6: detected capacity change from 0 to 8
Sep 9 06:05:21.323560 kernel: loop7: detected capacity change from 0 to 110984
Sep 9 06:05:21.324746 ldconfig[1491]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 06:05:21.326227 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 06:05:21.333575 (sd-merge)[1522]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'.
Sep 9 06:05:21.333886 (sd-merge)[1522]: Merged extensions into '/usr'.
Sep 9 06:05:21.337121 systemd[1]: Reload requested from client PID 1497 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 06:05:21.337129 systemd[1]: Reloading...
Sep 9 06:05:21.364748 zram_generator::config[1547]: No configuration found.
Sep 9 06:05:21.489828 systemd[1]: Reloading finished in 152 ms.
Sep 9 06:05:21.508679 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 06:05:21.519053 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 06:05:21.547725 systemd[1]: Starting ensure-sysext.service...
Sep 9 06:05:21.554646 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 06:05:21.578794 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 06:05:21.589654 systemd-tmpfiles[1605]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 06:05:21.589681 systemd-tmpfiles[1605]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 06:05:21.589863 systemd-tmpfiles[1605]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 06:05:21.590022 systemd-tmpfiles[1605]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 06:05:21.590494 systemd-tmpfiles[1605]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 06:05:21.590665 systemd-tmpfiles[1605]: ACLs are not supported, ignoring.
Sep 9 06:05:21.590700 systemd-tmpfiles[1605]: ACLs are not supported, ignoring.
Sep 9 06:05:21.593003 systemd-tmpfiles[1605]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 06:05:21.593007 systemd-tmpfiles[1605]: Skipping /boot
Sep 9 06:05:21.596541 systemd-tmpfiles[1605]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 06:05:21.596545 systemd-tmpfiles[1605]: Skipping /boot
Sep 9 06:05:21.603530 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 06:05:21.612939 systemd-udevd[1606]: Using default interface naming scheme 'v255'.
Sep 9 06:05:21.615323 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 06:05:21.630377 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 06:05:21.641687 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 06:05:21.656815 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 06:05:21.657221 augenrules[1689]: No rules
Sep 9 06:05:21.674878 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 06:05:21.678793 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2
Sep 9 06:05:21.678860 kernel: ACPI: button: Sleep Button [SLPB]
Sep 9 06:05:21.680010 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 06:05:21.688035 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Sep 9 06:05:21.694677 kernel: IPMI message handler: version 39.2
Sep 9 06:05:21.694728 kernel: ACPI: button: Power Button [PWRF]
Sep 9 06:05:21.699680 kernel: ipmi device interface
Sep 9 06:05:21.699721 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 06:05:21.715684 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface
Sep 9 06:05:21.715885 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface
Sep 9 06:05:21.755328 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set
Sep 9 06:05:21.755579 kernel: ipmi_si: IPMI System Interface driver
Sep 9 06:05:21.755603 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt
Sep 9 06:05:21.756504 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 06:05:21.770824 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 06:05:21.774657 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS
Sep 9 06:05:21.774864 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0
Sep 9 06:05:21.781093 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine
Sep 9 06:05:21.789621 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI
Sep 9 06:05:21.798889 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0
Sep 9 06:05:21.809037 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI
Sep 9 06:05:21.816188 kernel: ipmi_si: Adding ACPI-specified kcs state machine
Sep 9 06:05:21.826442 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0
Sep 9 06:05:21.826838 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 06:05:21.841532 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 06:05:21.841680 kernel: iTCO_vendor_support: vendor-support=0
Sep 9 06:05:21.848681 kernel: MACsec IEEE 802.1AE
Sep 9 06:05:21.868029 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400)
Sep 9 06:05:21.868210 kernel: iTCO_wdt iTCO_wdt: initialized. heartbeat=30 sec (nowayout=0)
Sep 9 06:05:21.870824 systemd[1]: Reload requested from client PID 1604 ('systemctl') (unit ensure-sysext.service)...
Sep 9 06:05:21.870841 systemd[1]: Reloading...
Sep 9 06:05:21.881677 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed.
Sep 9 06:05:21.903687 kernel: intel_rapl_common: Found RAPL domain package
Sep 9 06:05:21.903789 zram_generator::config[1778]: No configuration found.
Sep 9 06:05:21.909700 kernel: intel_rapl_common: Found RAPL domain core
Sep 9 06:05:21.924897 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20)
Sep 9 06:05:21.925028 kernel: intel_rapl_common: Found RAPL domain dram
Sep 9 06:05:22.040706 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized
Sep 9 06:05:22.047710 kernel: ipmi_ssif: IPMI SSIF Interface driver
Sep 9 06:05:22.048918 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5200_MTFDDAK480TDN OEM.
Sep 9 06:05:22.058739 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped.
Sep 9 06:05:22.058861 systemd[1]: Reloading finished in 187 ms.
Sep 9 06:05:22.092738 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 06:05:22.108887 systemd[1]: Finished ensure-sysext.service.
Sep 9 06:05:22.133239 systemd[1]: Reached target tpm2.target - Trusted Platform Module.
Sep 9 06:05:22.141727 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 06:05:22.142339 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 06:05:22.149788 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 06:05:22.167049 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 06:05:22.173293 augenrules[1843]: /sbin/augenrules: No change
Sep 9 06:05:22.176593 augenrules[1861]: No rules
Sep 9 06:05:22.189870 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 06:05:22.200206 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 06:05:22.215874 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 06:05:22.225775 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 06:05:22.231854 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 06:05:22.241703 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 06:05:22.252030 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 06:05:22.261535 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 06:05:22.262145 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 06:05:22.262943 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 06:05:22.279320 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 06:05:22.287779 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 06:05:22.287796 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 06:05:22.288264 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 06:05:22.295806 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 06:05:22.305880 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 06:05:22.306088 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 06:05:22.306190 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 06:05:22.306333 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 06:05:22.306421 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 06:05:22.306556 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 06:05:22.306643 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 06:05:22.306784 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 06:05:22.306871 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 06:05:22.307020 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 06:05:22.307180 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 06:05:22.309190 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 06:05:22.309234 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 06:05:22.337207 systemd-resolved[1690]: Positive Trust Anchors:
Sep 9 06:05:22.337213 systemd-resolved[1690]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 06:05:22.337237 systemd-resolved[1690]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 06:05:22.339185 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 06:05:22.343487 systemd-resolved[1690]: Using system hostname 'ci-4452.0.0-n-7ab43648c0'.
Sep 9 06:05:22.351436 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 06:05:22.357192 systemd-networkd[1870]: lo: Link UP
Sep 9 06:05:22.357195 systemd-networkd[1870]: lo: Gained carrier
Sep 9 06:05:22.359779 systemd-networkd[1870]: bond0: netdev ready
Sep 9 06:05:22.360785 systemd-networkd[1870]: Enumeration completed
Sep 9 06:05:22.371649 systemd-networkd[1870]: enp1s0f0np0: Configuring with /etc/systemd/network/10-b8:59:9f:e1:63:ea.network.
Sep 9 06:05:22.385815 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 06:05:22.395870 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 06:05:22.407661 systemd[1]: Reached target network.target - Network.
Sep 9 06:05:22.415708 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 06:05:22.425710 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 06:05:22.434746 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 06:05:22.445719 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 06:05:22.456705 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 9 06:05:22.467710 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 06:05:22.477702 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 06:05:22.477715 systemd[1]: Reached target paths.target - Path Units.
Sep 9 06:05:22.485704 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 06:05:22.494779 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 06:05:22.503754 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 06:05:22.513706 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 06:05:22.521155 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 06:05:22.531330 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 06:05:22.540617 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 06:05:22.551659 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 06:05:22.559885 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 06:05:22.570421 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 06:05:22.581295 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 06:05:22.591038 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 06:05:22.600266 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 06:05:22.608792 systemd[1]: Reached target basic.target - Basic System.
Sep 9 06:05:22.617805 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 06:05:22.617821 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 06:05:22.618322 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 06:05:22.634113 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 9 06:05:22.653851 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 06:05:22.661273 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 06:05:22.666200 coreos-metadata[1905]: Sep 09 06:05:22.666 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Sep 9 06:05:22.667079 coreos-metadata[1905]: Sep 09 06:05:22.667 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata)
Sep 9 06:05:22.686005 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 06:05:22.708952 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 06:05:22.712198 jq[1911]: false
Sep 9 06:05:22.717777 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 06:05:22.732967 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 9 06:05:22.737477 extend-filesystems[1912]: Found /dev/sdb6
Sep 9 06:05:22.741817 extend-filesystems[1912]: Found /dev/sdb9
Sep 9 06:05:22.741817 extend-filesystems[1912]: Checking size of /dev/sdb9
Sep 9 06:05:22.774731 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks
Sep 9 06:05:22.765727 oslogin_cache_refresh[1913]: Refreshing passwd entry cache
Sep 9 06:05:22.742422 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 06:05:22.774916 extend-filesystems[1912]: Resized partition /dev/sdb9
Sep 9 06:05:22.766810 oslogin_cache_refresh[1913]: Failure getting users, quitting
Sep 9 06:05:22.789749 google_oslogin_nss_cache[1913]: oslogin_cache_refresh[1913]: Refreshing passwd entry cache
Sep 9 06:05:22.789749 google_oslogin_nss_cache[1913]: oslogin_cache_refresh[1913]: Failure getting users, quitting
Sep 9 06:05:22.789749 google_oslogin_nss_cache[1913]: oslogin_cache_refresh[1913]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 06:05:22.789749 google_oslogin_nss_cache[1913]: oslogin_cache_refresh[1913]: Refreshing group entry cache
Sep 9 06:05:22.789749 google_oslogin_nss_cache[1913]: oslogin_cache_refresh[1913]: Failure getting groups, quitting
Sep 9 06:05:22.789749 google_oslogin_nss_cache[1913]: oslogin_cache_refresh[1913]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 06:05:22.762280 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 06:05:22.789958 extend-filesystems[1924]: resize2fs 1.47.3 (8-Jul-2025)
Sep 9 06:05:22.766817 oslogin_cache_refresh[1913]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 06:05:22.783473 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 06:05:22.766836 oslogin_cache_refresh[1913]: Refreshing group entry cache
Sep 9 06:05:22.767079 oslogin_cache_refresh[1913]: Failure getting groups, quitting
Sep 9 06:05:22.767083 oslogin_cache_refresh[1913]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 06:05:22.800383 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 06:05:22.823955 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 06:05:22.834780 systemd[1]: Starting tcsd.service - TCG Core Services Daemon...
Sep 9 06:05:22.842012 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 06:05:22.842325 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 06:05:22.850196 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 06:05:22.858858 update_engine[1943]: I20250909 06:05:22.858819 1943 main.cc:92] Flatcar Update Engine starting
Sep 9 06:05:22.860838 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 06:05:22.861813 jq[1944]: true
Sep 9 06:05:22.862551 systemd-logind[1938]: Watching system buttons on /dev/input/event3 (Power Button)
Sep 9 06:05:22.862563 systemd-logind[1938]: Watching system buttons on /dev/input/event2 (Sleep Button)
Sep 9 06:05:22.862573 systemd-logind[1938]: Watching system buttons on /dev/input/event0 (HID 0557:2419)
Sep 9 06:05:22.862686 systemd-logind[1938]: New seat seat0.
Sep 9 06:05:22.871789 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 9 06:05:22.881874 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 06:05:22.881984 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 06:05:22.882127 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Sep 9 06:05:22.882225 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Sep 9 06:05:22.890829 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 06:05:22.890933 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 06:05:22.900187 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 06:05:22.900299 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 06:05:22.921486 jq[1948]: true
Sep 9 06:05:22.922743 (ntainerd)[1949]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 06:05:22.931214 tar[1946]: linux-amd64/LICENSE
Sep 9 06:05:22.931356 tar[1946]: linux-amd64/helm
Sep 9 06:05:22.935271 systemd[1]: tcsd.service: Skipped due to 'exec-condition'.
Sep 9 06:05:22.935395 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped.
Sep 9 06:05:22.959861 dbus-daemon[1906]: [system] SELinux support is enabled
Sep 9 06:05:22.959991 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 06:05:22.961544 update_engine[1943]: I20250909 06:05:22.961517 1943 update_check_scheduler.cc:74] Next update check in 7m34s
Sep 9 06:05:22.962196 bash[1978]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 06:05:22.970335 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 06:05:22.982849 dbus-daemon[1906]: [system] Successfully activated service 'org.freedesktop.systemd1'
Sep 9 06:05:22.983242 systemd[1]: Starting sshkeys.service...
Sep 9 06:05:22.988783 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 06:05:22.988805 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 06:05:22.998728 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 06:05:22.998740 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 06:05:23.010262 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 06:05:23.021819 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 9 06:05:23.032539 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 9 06:05:23.054186 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 06:05:23.064790 coreos-metadata[1987]: Sep 09 06:05:23.064 INFO Fetching https://metadata.packet.net/metadata: Attempt #1
Sep 9 06:05:23.065590 coreos-metadata[1987]: Sep 09 06:05:23.065 INFO Failed to fetch: error sending request for url (https://metadata.packet.net/metadata)
Sep 9 06:05:23.085058 containerd[1949]: time="2025-09-09T06:05:23Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 06:05:23.085395 containerd[1949]: time="2025-09-09T06:05:23.085382392Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 06:05:23.090355 locksmithd[1988]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 06:05:23.091195 containerd[1949]: time="2025-09-09T06:05:23.091169740Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.372µs"
Sep 9 06:05:23.091231 containerd[1949]: time="2025-09-09T06:05:23.091195099Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 06:05:23.091231 containerd[1949]: time="2025-09-09T06:05:23.091210609Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 06:05:23.091308 containerd[1949]: time="2025-09-09T06:05:23.091299773Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 06:05:23.091331 containerd[1949]: time="2025-09-09T06:05:23.091310255Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 06:05:23.091331 containerd[1949]: time="2025-09-09T06:05:23.091325305Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091369 containerd[1949]: time="2025-09-09T06:05:23.091356171Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091369 containerd[1949]: time="2025-09-09T06:05:23.091364127Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091499 containerd[1949]: time="2025-09-09T06:05:23.091490619Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091522 containerd[1949]: time="2025-09-09T06:05:23.091499401Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091522 containerd[1949]: time="2025-09-09T06:05:23.091505230Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091522 containerd[1949]: time="2025-09-09T06:05:23.091509725Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091568 containerd[1949]: time="2025-09-09T06:05:23.091547483Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091663 containerd[1949]: time="2025-09-09T06:05:23.091655375Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091689 containerd[1949]: time="2025-09-09T06:05:23.091677912Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 06:05:23.091689 containerd[1949]: time="2025-09-09T06:05:23.091685082Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 06:05:23.091722 containerd[1949]: time="2025-09-09T06:05:23.091699934Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 06:05:23.091834 containerd[1949]: time="2025-09-09T06:05:23.091825352Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 06:05:23.091870 containerd[1949]: time="2025-09-09T06:05:23.091859023Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 06:05:23.097890 sshd_keygen[1941]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 9 06:05:23.105051 containerd[1949]: time="2025-09-09T06:05:23.105028382Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 06:05:23.105100 containerd[1949]: time="2025-09-09T06:05:23.105072666Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 06:05:23.105100 containerd[1949]: time="2025-09-09T06:05:23.105088593Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 06:05:23.105148 containerd[1949]: time="2025-09-09T06:05:23.105099116Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 06:05:23.105148 containerd[1949]: time="2025-09-09T06:05:23.105110862Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 06:05:23.105148 containerd[1949]: time="2025-09-09T06:05:23.105122965Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 06:05:23.105148 containerd[1949]: time="2025-09-09T06:05:23.105141632Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 06:05:23.105245 containerd[1949]: time="2025-09-09T06:05:23.105153752Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 06:05:23.105245 containerd[1949]: time="2025-09-09T06:05:23.105164231Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 06:05:23.105245 containerd[1949]: time="2025-09-09T06:05:23.105173424Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 06:05:23.105245 containerd[1949]: time="2025-09-09T06:05:23.105182163Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 06:05:23.105245 containerd[1949]: time="2025-09-09T06:05:23.105194167Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 06:05:23.105357 containerd[1949]: time="2025-09-09T06:05:23.105281043Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 06:05:23.105357 containerd[1949]: time="2025-09-09T06:05:23.105298693Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 06:05:23.105357 containerd[1949]: time="2025-09-09T06:05:23.105312843Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 06:05:23.105357 containerd[1949]: time="2025-09-09T06:05:23.105323120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 9 06:05:23.105357 containerd[1949]: time="2025-09-09T06:05:23.105332957Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 9 06:05:23.105357 containerd[1949]: time="2025-09-09T06:05:23.105343123Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 9 06:05:23.105357 containerd[1949]: time="2025-09-09T06:05:23.105353308Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 9 06:05:23.105509 containerd[1949]: time="2025-09-09T06:05:23.105362447Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 9 06:05:23.105509 containerd[1949]: time="2025-09-09T06:05:23.105373109Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 9 06:05:23.105509 containerd[1949]: time="2025-09-09T06:05:23.105382537Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 9 06:05:23.105509 containerd[1949]: time="2025-09-09T06:05:23.105392672Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 9 06:05:23.105509 containerd[1949]: time="2025-09-09T06:05:23.105449927Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 9 06:05:23.105509 containerd[1949]: time="2025-09-09T06:05:23.105461718Z" level=info msg="Start snapshots syncer"
Sep 9 06:05:23.105509 containerd[1949]: time="2025-09-09T06:05:23.105480355Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 9 06:05:23.105714 containerd[1949]: time="2025-09-09T06:05:23.105689276Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 9 06:05:23.105804 containerd[1949]: time="2025-09-09T06:05:23.105726297Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 9 06:05:23.105804 containerd[1949]: time="2025-09-09T06:05:23.105776942Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 06:05:23.105861 containerd[1949]: time="2025-09-09T06:05:23.105843650Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 06:05:23.105888 containerd[1949]: time="2025-09-09T06:05:23.105863125Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 06:05:23.105913 containerd[1949]: time="2025-09-09T06:05:23.105886141Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 06:05:23.105913 containerd[1949]: time="2025-09-09T06:05:23.105897921Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 06:05:23.105965 containerd[1949]: time="2025-09-09T06:05:23.105911147Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 06:05:23.105965 containerd[1949]: time="2025-09-09T06:05:23.105922110Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 06:05:23.105965 containerd[1949]: time="2025-09-09T06:05:23.105933113Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 06:05:23.105965 containerd[1949]: time="2025-09-09T06:05:23.105955616Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 06:05:23.106058 containerd[1949]: time="2025-09-09T06:05:23.105968056Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 06:05:23.106058 containerd[1949]: time="2025-09-09T06:05:23.105979883Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 06:05:23.106058 containerd[1949]: time="2025-09-09T06:05:23.106004930Z" level=info msg="loading plugin"
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 06:05:23.106058 containerd[1949]: time="2025-09-09T06:05:23.106032937Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 06:05:23.106058 containerd[1949]: time="2025-09-09T06:05:23.106041813Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 06:05:23.106058 containerd[1949]: time="2025-09-09T06:05:23.106050673Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106057912Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106066693Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106076270Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106088964Z" level=info msg="runtime interface created" Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106094383Z" level=info msg="created NRI interface" Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106101730Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106111019Z" level=info msg="Connect containerd service" Sep 9 06:05:23.106194 containerd[1949]: time="2025-09-09T06:05:23.106132377Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 06:05:23.106588 containerd[1949]: 
time="2025-09-09T06:05:23.106575268Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 06:05:23.106623 tar[1946]: linux-amd64/README.md Sep 9 06:05:23.120763 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 06:05:23.130269 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 06:05:23.140060 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 06:05:23.164242 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 06:05:23.164375 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 06:05:23.175676 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Sep 9 06:05:23.178983 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 06:05:23.180374 containerd[1949]: time="2025-09-09T06:05:23.180322747Z" level=info msg="Start subscribing containerd event" Sep 9 06:05:23.180412 containerd[1949]: time="2025-09-09T06:05:23.180370446Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 06:05:23.180412 containerd[1949]: time="2025-09-09T06:05:23.180374044Z" level=info msg="Start recovering state" Sep 9 06:05:23.180459 containerd[1949]: time="2025-09-09T06:05:23.180452436Z" level=info msg="Start event monitor" Sep 9 06:05:23.180484 containerd[1949]: time="2025-09-09T06:05:23.180461030Z" level=info msg="Start cni network conf syncer for default" Sep 9 06:05:23.180484 containerd[1949]: time="2025-09-09T06:05:23.180465145Z" level=info msg="Start streaming server" Sep 9 06:05:23.180512 containerd[1949]: time="2025-09-09T06:05:23.180487541Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 06:05:23.180512 containerd[1949]: time="2025-09-09T06:05:23.180492489Z" level=info msg="runtime interface starting up..." 
Sep 9 06:05:23.180512 containerd[1949]: time="2025-09-09T06:05:23.180405214Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 06:05:23.180548 containerd[1949]: time="2025-09-09T06:05:23.180495643Z" level=info msg="starting plugins..." Sep 9 06:05:23.180548 containerd[1949]: time="2025-09-09T06:05:23.180531866Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 06:05:23.180639 containerd[1949]: time="2025-09-09T06:05:23.180630730Z" level=info msg="containerd successfully booted in 0.095834s" Sep 9 06:05:23.189707 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Sep 9 06:05:23.190139 systemd-networkd[1870]: enp1s0f1np1: Configuring with /etc/systemd/network/10-b8:59:9f:e1:63:eb.network. Sep 9 06:05:23.195929 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 06:05:23.207442 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 06:05:23.219025 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 06:05:23.228507 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Sep 9 06:05:23.238900 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 06:05:23.278710 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Sep 9 06:05:23.306833 extend-filesystems[1924]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Sep 9 06:05:23.306833 extend-filesystems[1924]: old_desc_blocks = 1, new_desc_blocks = 56 Sep 9 06:05:23.306833 extend-filesystems[1924]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Sep 9 06:05:23.336975 extend-filesystems[1912]: Resized filesystem in /dev/sdb9 Sep 9 06:05:23.307268 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 06:05:23.307401 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Sep 9 06:05:23.364682 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Sep 9 06:05:23.377274 systemd-networkd[1870]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Sep 9 06:05:23.377686 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Sep 9 06:05:23.377988 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 06:05:23.378479 systemd-networkd[1870]: enp1s0f0np0: Link UP Sep 9 06:05:23.378803 systemd-networkd[1870]: enp1s0f0np0: Gained carrier Sep 9 06:05:23.389810 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Sep 9 06:05:23.402822 systemd-networkd[1870]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-b8:59:9f:e1:63:ea.network. Sep 9 06:05:23.402974 systemd-networkd[1870]: enp1s0f1np1: Link UP Sep 9 06:05:23.403103 systemd-networkd[1870]: enp1s0f1np1: Gained carrier Sep 9 06:05:23.419947 systemd-networkd[1870]: bond0: Link UP Sep 9 06:05:23.420129 systemd-networkd[1870]: bond0: Gained carrier Sep 9 06:05:23.420301 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:23.420642 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:23.420884 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:23.420960 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:23.493574 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 25000 Mbps full duplex Sep 9 06:05:23.493593 kernel: bond0: active interface up! 
Sep 9 06:05:23.609707 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Sep 9 06:05:23.667303 coreos-metadata[1905]: Sep 09 06:05:23.667 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Sep 9 06:05:24.065727 coreos-metadata[1987]: Sep 09 06:05:24.065 INFO Fetching https://metadata.packet.net/metadata: Attempt #2 Sep 9 06:05:24.455905 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:25.159864 systemd-networkd[1870]: bond0: Gained IPv6LL Sep 9 06:05:25.160217 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:25.161788 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 06:05:25.173450 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 06:05:25.185784 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 06:05:25.212120 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 06:05:25.239463 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 06:05:25.974996 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 06:05:25.986199 (kubelet)[2062]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 06:05:26.401798 kubelet[2062]: E0909 06:05:26.401729 2062 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 06:05:26.403170 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 06:05:26.403256 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Sep 9 06:05:26.403447 systemd[1]: kubelet.service: Consumed 597ms CPU time, 268.7M memory peak. Sep 9 06:05:27.083803 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Sep 9 06:05:27.083958 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity Sep 9 06:05:27.095205 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 06:05:27.105655 systemd[1]: Started sshd@0-139.178.90.255:22-139.178.89.65:48492.service - OpenSSH per-connection server daemon (139.178.89.65:48492). Sep 9 06:05:27.191109 sshd[2080]: Accepted publickey for core from 139.178.89.65 port 48492 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:27.192566 sshd-session[2080]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:27.199467 systemd-logind[1938]: New session 1 of user core. Sep 9 06:05:27.200361 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 06:05:27.209540 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 06:05:27.239125 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 06:05:27.249971 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 06:05:27.265015 (systemd)[2087]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 06:05:27.266365 systemd-logind[1938]: New session c1 of user core. Sep 9 06:05:27.361862 systemd[2087]: Queued start job for default target default.target. Sep 9 06:05:27.371206 systemd[2087]: Created slice app.slice - User Application Slice. Sep 9 06:05:27.371240 systemd[2087]: Reached target paths.target - Paths. Sep 9 06:05:27.371260 systemd[2087]: Reached target timers.target - Timers. Sep 9 06:05:27.371909 systemd[2087]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 06:05:27.377606 systemd[2087]: Listening on dbus.socket - D-Bus User Message Bus Socket. 
Sep 9 06:05:27.377634 systemd[2087]: Reached target sockets.target - Sockets. Sep 9 06:05:27.377656 systemd[2087]: Reached target basic.target - Basic System. Sep 9 06:05:27.377681 systemd[2087]: Reached target default.target - Main User Target. Sep 9 06:05:27.377717 systemd[2087]: Startup finished in 108ms. Sep 9 06:05:27.377756 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 06:05:27.395775 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 06:05:27.468276 systemd[1]: Started sshd@1-139.178.90.255:22-139.178.89.65:48508.service - OpenSSH per-connection server daemon (139.178.89.65:48508). Sep 9 06:05:27.514169 sshd[2098]: Accepted publickey for core from 139.178.89.65 port 48508 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:27.514817 sshd-session[2098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:27.517252 systemd-logind[1938]: New session 2 of user core. Sep 9 06:05:27.527877 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 06:05:27.582436 sshd[2101]: Connection closed by 139.178.89.65 port 48508 Sep 9 06:05:27.582590 sshd-session[2098]: pam_unix(sshd:session): session closed for user core Sep 9 06:05:27.611728 systemd[1]: sshd@1-139.178.90.255:22-139.178.89.65:48508.service: Deactivated successfully. Sep 9 06:05:27.612518 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 06:05:27.612933 systemd-logind[1938]: Session 2 logged out. Waiting for processes to exit. Sep 9 06:05:27.614065 systemd[1]: Started sshd@2-139.178.90.255:22-139.178.89.65:48510.service - OpenSSH per-connection server daemon (139.178.89.65:48510). Sep 9 06:05:27.624186 systemd-logind[1938]: Removed session 2. 
Sep 9 06:05:27.667344 sshd[2107]: Accepted publickey for core from 139.178.89.65 port 48510 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:27.667940 sshd-session[2107]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:27.670240 systemd-logind[1938]: New session 3 of user core. Sep 9 06:05:27.686800 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 06:05:27.741545 sshd[2110]: Connection closed by 139.178.89.65 port 48510 Sep 9 06:05:27.741691 sshd-session[2107]: pam_unix(sshd:session): session closed for user core Sep 9 06:05:27.743333 systemd[1]: sshd@2-139.178.90.255:22-139.178.89.65:48510.service: Deactivated successfully. Sep 9 06:05:27.744141 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 06:05:27.744516 systemd-logind[1938]: Session 3 logged out. Waiting for processes to exit. Sep 9 06:05:27.745011 systemd-logind[1938]: Removed session 3. Sep 9 06:05:27.831635 coreos-metadata[1987]: Sep 09 06:05:27.831 INFO Fetch successful Sep 9 06:05:27.864044 unknown[1987]: wrote ssh authorized keys file for user: core Sep 9 06:05:27.899822 update-ssh-keys[2116]: Updated "/home/core/.ssh/authorized_keys" Sep 9 06:05:27.900126 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 9 06:05:27.908383 coreos-metadata[1905]: Sep 09 06:05:27.908 INFO Fetch successful Sep 9 06:05:27.910516 systemd[1]: Finished sshkeys.service. Sep 9 06:05:27.953513 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 06:05:27.964754 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Sep 9 06:05:28.273795 login[2041]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 06:05:28.278216 systemd-logind[1938]: New session 4 of user core. Sep 9 06:05:28.278912 systemd[1]: Started session-4.scope - Session 4 of User core. 
Sep 9 06:05:28.284083 login[2040]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Sep 9 06:05:28.287093 systemd-logind[1938]: New session 5 of user core. Sep 9 06:05:28.287572 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 06:05:28.511763 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Sep 9 06:05:28.513632 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 06:05:28.514305 systemd[1]: Startup finished in 4.349s (kernel) + 22.565s (initrd) + 9.730s (userspace) = 36.645s. Sep 9 06:05:30.259169 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:36.597721 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 06:05:36.598789 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 06:05:36.908046 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 06:05:36.910142 (kubelet)[2161]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 06:05:36.938837 kubelet[2161]: E0909 06:05:36.938799 2161 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 06:05:36.940937 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 06:05:36.941019 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 06:05:36.941187 systemd[1]: kubelet.service: Consumed 156ms CPU time, 116M memory peak. Sep 9 06:05:37.758300 systemd[1]: Started sshd@3-139.178.90.255:22-139.178.89.65:34324.service - OpenSSH per-connection server daemon (139.178.89.65:34324). 
Sep 9 06:05:37.795426 sshd[2182]: Accepted publickey for core from 139.178.89.65 port 34324 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:37.796027 sshd-session[2182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:37.798618 systemd-logind[1938]: New session 6 of user core. Sep 9 06:05:37.808945 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 06:05:37.860198 sshd[2185]: Connection closed by 139.178.89.65 port 34324 Sep 9 06:05:37.860362 sshd-session[2182]: pam_unix(sshd:session): session closed for user core Sep 9 06:05:37.872700 systemd[1]: sshd@3-139.178.90.255:22-139.178.89.65:34324.service: Deactivated successfully. Sep 9 06:05:37.873538 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 06:05:37.874048 systemd-logind[1938]: Session 6 logged out. Waiting for processes to exit. Sep 9 06:05:37.875148 systemd[1]: Started sshd@4-139.178.90.255:22-139.178.89.65:34332.service - OpenSSH per-connection server daemon (139.178.89.65:34332). Sep 9 06:05:37.875803 systemd-logind[1938]: Removed session 6. Sep 9 06:05:37.908336 sshd[2191]: Accepted publickey for core from 139.178.89.65 port 34332 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:37.908927 sshd-session[2191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:37.911585 systemd-logind[1938]: New session 7 of user core. Sep 9 06:05:37.927974 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 06:05:37.978770 sshd[2195]: Connection closed by 139.178.89.65 port 34332 Sep 9 06:05:37.978946 sshd-session[2191]: pam_unix(sshd:session): session closed for user core Sep 9 06:05:37.990765 systemd[1]: sshd@4-139.178.90.255:22-139.178.89.65:34332.service: Deactivated successfully. Sep 9 06:05:37.991565 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 06:05:37.992083 systemd-logind[1938]: Session 7 logged out. 
Waiting for processes to exit. Sep 9 06:05:37.993287 systemd[1]: Started sshd@5-139.178.90.255:22-139.178.89.65:34342.service - OpenSSH per-connection server daemon (139.178.89.65:34342). Sep 9 06:05:37.993681 systemd-logind[1938]: Removed session 7. Sep 9 06:05:38.064174 sshd[2201]: Accepted publickey for core from 139.178.89.65 port 34342 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:38.065126 sshd-session[2201]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:38.068945 systemd-logind[1938]: New session 8 of user core. Sep 9 06:05:38.082088 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 06:05:38.149998 sshd[2204]: Connection closed by 139.178.89.65 port 34342 Sep 9 06:05:38.150799 sshd-session[2201]: pam_unix(sshd:session): session closed for user core Sep 9 06:05:38.170724 systemd[1]: sshd@5-139.178.90.255:22-139.178.89.65:34342.service: Deactivated successfully. Sep 9 06:05:38.174416 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 06:05:38.176661 systemd-logind[1938]: Session 8 logged out. Waiting for processes to exit. Sep 9 06:05:38.181999 systemd[1]: Started sshd@6-139.178.90.255:22-139.178.89.65:34358.service - OpenSSH per-connection server daemon (139.178.89.65:34358). Sep 9 06:05:38.183880 systemd-logind[1938]: Removed session 8. Sep 9 06:05:38.280950 sshd[2210]: Accepted publickey for core from 139.178.89.65 port 34358 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:38.281748 sshd-session[2210]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:38.285339 systemd-logind[1938]: New session 9 of user core. Sep 9 06:05:38.299108 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 9 06:05:38.365958 sudo[2214]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 06:05:38.366096 sudo[2214]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 06:05:38.378999 sudo[2214]: pam_unix(sudo:session): session closed for user root Sep 9 06:05:38.379626 sshd[2213]: Connection closed by 139.178.89.65 port 34358 Sep 9 06:05:38.379836 sshd-session[2210]: pam_unix(sshd:session): session closed for user core Sep 9 06:05:38.392990 systemd[1]: sshd@6-139.178.90.255:22-139.178.89.65:34358.service: Deactivated successfully. Sep 9 06:05:38.393973 systemd[1]: session-9.scope: Deactivated successfully. Sep 9 06:05:38.394606 systemd-logind[1938]: Session 9 logged out. Waiting for processes to exit. Sep 9 06:05:38.395980 systemd[1]: Started sshd@7-139.178.90.255:22-139.178.89.65:34364.service - OpenSSH per-connection server daemon (139.178.89.65:34364). Sep 9 06:05:38.396436 systemd-logind[1938]: Removed session 9. Sep 9 06:05:38.429330 sshd[2220]: Accepted publickey for core from 139.178.89.65 port 34364 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:38.429896 sshd-session[2220]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:38.432442 systemd-logind[1938]: New session 10 of user core. Sep 9 06:05:38.447930 systemd[1]: Started session-10.scope - Session 10 of User core. 
Sep 9 06:05:38.500506 sudo[2225]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 06:05:38.500648 sudo[2225]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 06:05:38.503655 sudo[2225]: pam_unix(sudo:session): session closed for user root Sep 9 06:05:38.506247 sudo[2224]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 06:05:38.506384 sudo[2224]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 06:05:38.512035 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 06:05:38.542678 augenrules[2247]: No rules Sep 9 06:05:38.543416 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 06:05:38.543665 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 06:05:38.544622 sudo[2224]: pam_unix(sudo:session): session closed for user root Sep 9 06:05:38.545852 sshd[2223]: Connection closed by 139.178.89.65 port 34364 Sep 9 06:05:38.546174 sshd-session[2220]: pam_unix(sshd:session): session closed for user core Sep 9 06:05:38.570505 systemd[1]: sshd@7-139.178.90.255:22-139.178.89.65:34364.service: Deactivated successfully. Sep 9 06:05:38.574249 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 06:05:38.576495 systemd-logind[1938]: Session 10 logged out. Waiting for processes to exit. Sep 9 06:05:38.582026 systemd[1]: Started sshd@8-139.178.90.255:22-139.178.89.65:34366.service - OpenSSH per-connection server daemon (139.178.89.65:34366). Sep 9 06:05:38.583595 systemd-logind[1938]: Removed session 10. 
Sep 9 06:05:38.683771 sshd[2257]: Accepted publickey for core from 139.178.89.65 port 34366 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:05:38.684827 sshd-session[2257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:05:38.688634 systemd-logind[1938]: New session 11 of user core. Sep 9 06:05:38.700924 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 9 06:05:38.761979 sudo[2261]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 06:05:38.762789 sudo[2261]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 06:05:39.117557 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 9 06:05:39.139031 (dockerd)[2291]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 06:05:39.336968 dockerd[2291]: time="2025-09-09T06:05:39.336934527Z" level=info msg="Starting up" Sep 9 06:05:39.337527 dockerd[2291]: time="2025-09-09T06:05:39.337516663Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 06:05:39.343559 dockerd[2291]: time="2025-09-09T06:05:39.343510579Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 06:05:39.364876 dockerd[2291]: time="2025-09-09T06:05:39.364806197Z" level=info msg="Loading containers: start." Sep 9 06:05:39.376679 kernel: Initializing XFRM netlink socket Sep 9 06:05:39.511593 systemd-timesyncd[1872]: Network configuration changed, trying to establish connection. Sep 9 06:05:39.532308 systemd-networkd[1870]: docker0: Link UP Sep 9 06:05:39.534072 dockerd[2291]: time="2025-09-09T06:05:39.534032944Z" level=info msg="Loading containers: done." 
Sep 9 06:05:39.540920 dockerd[2291]: time="2025-09-09T06:05:39.540901716Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 06:05:39.540987 dockerd[2291]: time="2025-09-09T06:05:39.540941678Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 06:05:39.540987 dockerd[2291]: time="2025-09-09T06:05:39.540979514Z" level=info msg="Initializing buildkit" Sep 9 06:05:39.551903 dockerd[2291]: time="2025-09-09T06:05:39.551855961Z" level=info msg="Completed buildkit initialization" Sep 9 06:05:39.555069 dockerd[2291]: time="2025-09-09T06:05:39.555031966Z" level=info msg="Daemon has completed initialization" Sep 9 06:05:39.555105 dockerd[2291]: time="2025-09-09T06:05:39.555072245Z" level=info msg="API listen on /run/docker.sock" Sep 9 06:05:39.555125 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 06:05:39.731271 systemd-timesyncd[1872]: Contacted time server [2606:82c0:21::e]:123 (2.flatcar.pool.ntp.org). Sep 9 06:05:39.731305 systemd-timesyncd[1872]: Initial clock synchronization to Tue 2025-09-09 06:05:39.783456 UTC. Sep 9 06:05:40.341977 containerd[1949]: time="2025-09-09T06:05:40.341957638Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\"" Sep 9 06:05:40.961621 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4256380646.mount: Deactivated successfully. 
Sep 9 06:05:41.945783 containerd[1949]: time="2025-09-09T06:05:41.945725639Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:41.946006 containerd[1949]: time="2025-09-09T06:05:41.945961399Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800687"
Sep 9 06:05:41.946378 containerd[1949]: time="2025-09-09T06:05:41.946334324Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:41.948005 containerd[1949]: time="2025-09-09T06:05:41.947963634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:41.948371 containerd[1949]: time="2025-09-09T06:05:41.948357170Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 1.606378936s"
Sep 9 06:05:41.948405 containerd[1949]: time="2025-09-09T06:05:41.948374859Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\""
Sep 9 06:05:41.948770 containerd[1949]: time="2025-09-09T06:05:41.948730190Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 9 06:05:43.308202 containerd[1949]: time="2025-09-09T06:05:43.308150193Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:43.308429 containerd[1949]: time="2025-09-09T06:05:43.308318461Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784128"
Sep 9 06:05:43.308679 containerd[1949]: time="2025-09-09T06:05:43.308635132Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:43.309979 containerd[1949]: time="2025-09-09T06:05:43.309933797Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:43.310515 containerd[1949]: time="2025-09-09T06:05:43.310474583Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 1.361728449s"
Sep 9 06:05:43.310515 containerd[1949]: time="2025-09-09T06:05:43.310490000Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\""
Sep 9 06:05:43.310807 containerd[1949]: time="2025-09-09T06:05:43.310767327Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 9 06:05:44.500626 containerd[1949]: time="2025-09-09T06:05:44.500600495Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:44.500869 containerd[1949]: time="2025-09-09T06:05:44.500794796Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175036"
Sep 9 06:05:44.501165 containerd[1949]: time="2025-09-09T06:05:44.501123685Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:44.502426 containerd[1949]: time="2025-09-09T06:05:44.502406058Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:44.503308 containerd[1949]: time="2025-09-09T06:05:44.503295769Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 1.192512815s"
Sep 9 06:05:44.503345 containerd[1949]: time="2025-09-09T06:05:44.503310989Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\""
Sep 9 06:05:44.503609 containerd[1949]: time="2025-09-09T06:05:44.503599504Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 9 06:05:45.563343 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount784021666.mount: Deactivated successfully.
Sep 9 06:05:45.759477 containerd[1949]: time="2025-09-09T06:05:45.759419095Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:45.759705 containerd[1949]: time="2025-09-09T06:05:45.759616455Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897170"
Sep 9 06:05:45.760024 containerd[1949]: time="2025-09-09T06:05:45.759984429Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:45.760732 containerd[1949]: time="2025-09-09T06:05:45.760682069Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:45.761045 containerd[1949]: time="2025-09-09T06:05:45.761004595Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 1.257389206s"
Sep 9 06:05:45.761045 containerd[1949]: time="2025-09-09T06:05:45.761019826Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\""
Sep 9 06:05:45.761321 containerd[1949]: time="2025-09-09T06:05:45.761277494Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 06:05:46.345314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2129112029.mount: Deactivated successfully.
Sep 9 06:05:46.867106 containerd[1949]: time="2025-09-09T06:05:46.867050771Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:46.867327 containerd[1949]: time="2025-09-09T06:05:46.867253122Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241"
Sep 9 06:05:46.867603 containerd[1949]: time="2025-09-09T06:05:46.867567587Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:46.868921 containerd[1949]: time="2025-09-09T06:05:46.868882683Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:46.869872 containerd[1949]: time="2025-09-09T06:05:46.869832078Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.108539975s"
Sep 9 06:05:46.869872 containerd[1949]: time="2025-09-09T06:05:46.869847602Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 9 06:05:46.870124 containerd[1949]: time="2025-09-09T06:05:46.870113893Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 06:05:47.097565 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 06:05:47.098795 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 06:05:47.363235 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 06:05:47.365246 (kubelet)[2663]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 06:05:47.384236 kubelet[2663]: E0909 06:05:47.384162 2663 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 06:05:47.385416 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 06:05:47.385492 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 06:05:47.385658 systemd[1]: kubelet.service: Consumed 124ms CPU time, 121M memory peak.
Sep 9 06:05:47.519662 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1035549575.mount: Deactivated successfully.
Sep 9 06:05:47.520322 containerd[1949]: time="2025-09-09T06:05:47.520306933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 06:05:47.520504 containerd[1949]: time="2025-09-09T06:05:47.520469824Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
Sep 9 06:05:47.520920 containerd[1949]: time="2025-09-09T06:05:47.520886281Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 06:05:47.521717 containerd[1949]: time="2025-09-09T06:05:47.521662064Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 06:05:47.522120 containerd[1949]: time="2025-09-09T06:05:47.522080404Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 651.952581ms"
Sep 9 06:05:47.522120 containerd[1949]: time="2025-09-09T06:05:47.522094881Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 9 06:05:47.522478 containerd[1949]: time="2025-09-09T06:05:47.522419740Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 9 06:05:48.076723 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3254734706.mount: Deactivated successfully.
Sep 9 06:05:49.149572 containerd[1949]: time="2025-09-09T06:05:49.149543255Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:49.149807 containerd[1949]: time="2025-09-09T06:05:49.149679537Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056"
Sep 9 06:05:49.150122 containerd[1949]: time="2025-09-09T06:05:49.150083040Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:49.151811 containerd[1949]: time="2025-09-09T06:05:49.151770508Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:05:49.152254 containerd[1949]: time="2025-09-09T06:05:49.152213257Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.629778789s"
Sep 9 06:05:49.152254 containerd[1949]: time="2025-09-09T06:05:49.152229879Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Sep 9 06:05:51.511254 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 06:05:51.511419 systemd[1]: kubelet.service: Consumed 124ms CPU time, 121M memory peak.
Sep 9 06:05:51.512599 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 06:05:51.528343 systemd[1]: Reload requested from client PID 2788 ('systemctl') (unit session-11.scope)...
Sep 9 06:05:51.528350 systemd[1]: Reloading...
Sep 9 06:05:51.563725 zram_generator::config[2832]: No configuration found.
Sep 9 06:05:51.711428 systemd[1]: Reloading finished in 182 ms.
Sep 9 06:05:51.750499 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 9 06:05:51.750723 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 9 06:05:51.751266 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 06:05:51.756007 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 06:05:52.048823 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 06:05:52.050818 (kubelet)[2898]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 06:05:52.072957 kubelet[2898]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 06:05:52.072957 kubelet[2898]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 06:05:52.072957 kubelet[2898]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 06:05:52.073196 kubelet[2898]: I0909 06:05:52.073004 2898 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 06:05:52.264813 kubelet[2898]: I0909 06:05:52.264772 2898 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 06:05:52.264813 kubelet[2898]: I0909 06:05:52.264782 2898 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 06:05:52.264958 kubelet[2898]: I0909 06:05:52.264930 2898 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 06:05:52.285520 kubelet[2898]: I0909 06:05:52.285512 2898 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 06:05:52.285771 kubelet[2898]: E0909 06:05:52.285564 2898 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.90.255:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.90.255:6443: connect: connection refused" logger="UnhandledError"
Sep 9 06:05:52.291790 kubelet[2898]: I0909 06:05:52.291729 2898 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 06:05:52.299910 kubelet[2898]: I0909 06:05:52.299837 2898 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 06:05:52.300011 kubelet[2898]: I0909 06:05:52.299969 2898 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 06:05:52.300106 kubelet[2898]: I0909 06:05:52.299983 2898 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452.0.0-n-7ab43648c0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 06:05:52.300615 kubelet[2898]: I0909 06:05:52.300579 2898 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 06:05:52.300615 kubelet[2898]: I0909 06:05:52.300587 2898 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 06:05:52.300659 kubelet[2898]: I0909 06:05:52.300653 2898 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 06:05:52.303813 kubelet[2898]: I0909 06:05:52.303770 2898 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 06:05:52.303813 kubelet[2898]: I0909 06:05:52.303787 2898 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 06:05:52.303813 kubelet[2898]: I0909 06:05:52.303799 2898 kubelet.go:352] "Adding apiserver pod source"
Sep 9 06:05:52.303813 kubelet[2898]: I0909 06:05:52.303806 2898 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 06:05:52.306384 kubelet[2898]: I0909 06:05:52.306326 2898 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 06:05:52.306714 kubelet[2898]: I0909 06:05:52.306658 2898 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 06:05:52.306761 kubelet[2898]: W0909 06:05:52.306717 2898 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 06:05:52.308519 kubelet[2898]: W0909 06:05:52.308478 2898 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.90.255:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452.0.0-n-7ab43648c0&limit=500&resourceVersion=0": dial tcp 139.178.90.255:6443: connect: connection refused
Sep 9 06:05:52.308580 kubelet[2898]: W0909 06:05:52.308510 2898 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.90.255:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.90.255:6443: connect: connection refused
Sep 9 06:05:52.308580 kubelet[2898]: E0909 06:05:52.308538 2898 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.90.255:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452.0.0-n-7ab43648c0&limit=500&resourceVersion=0\": dial tcp 139.178.90.255:6443: connect: connection refused" logger="UnhandledError"
Sep 9 06:05:52.308580 kubelet[2898]: E0909 06:05:52.308553 2898 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.90.255:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.90.255:6443: connect: connection refused" logger="UnhandledError"
Sep 9 06:05:52.309307 kubelet[2898]: I0909 06:05:52.309299 2898 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 06:05:52.309353 kubelet[2898]: I0909 06:05:52.309337 2898 server.go:1287] "Started kubelet"
Sep 9 06:05:52.309400 kubelet[2898]: I0909 06:05:52.309378 2898 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 06:05:52.309422 kubelet[2898]: I0909 06:05:52.309392 2898 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 06:05:52.309574 kubelet[2898]: I0909 06:05:52.309562 2898 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 06:05:52.313194 kubelet[2898]: E0909 06:05:52.310859 2898 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 06:05:52.313344 kubelet[2898]: I0909 06:05:52.313333 2898 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 06:05:52.313387 kubelet[2898]: I0909 06:05:52.313352 2898 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 06:05:52.313387 kubelet[2898]: I0909 06:05:52.313367 2898 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 06:05:52.313387 kubelet[2898]: I0909 06:05:52.313380 2898 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 06:05:52.313457 kubelet[2898]: E0909 06:05:52.313387 2898 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452.0.0-n-7ab43648c0\" not found"
Sep 9 06:05:52.313457 kubelet[2898]: I0909 06:05:52.313431 2898 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 06:05:52.313499 kubelet[2898]: I0909 06:05:52.313460 2898 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 06:05:52.313604 kubelet[2898]: E0909 06:05:52.313585 2898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.90.255:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-7ab43648c0?timeout=10s\": dial tcp 139.178.90.255:6443: connect: connection refused" interval="200ms"
Sep 9 06:05:52.313636 kubelet[2898]: W0909 06:05:52.313592 2898 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.90.255:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.90.255:6443: connect: connection refused
Sep 9 06:05:52.313636 kubelet[2898]: E0909 06:05:52.313627 2898 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.90.255:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.90.255:6443: connect: connection refused" logger="UnhandledError"
Sep 9 06:05:52.313717 kubelet[2898]: I0909 06:05:52.313707 2898 factory.go:221] Registration of the systemd container factory successfully
Sep 9 06:05:52.313756 kubelet[2898]: I0909 06:05:52.313743 2898 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 06:05:52.314095 kubelet[2898]: I0909 06:05:52.314085 2898 factory.go:221] Registration of the containerd container factory successfully
Sep 9 06:05:52.315260 kubelet[2898]: E0909 06:05:52.314238 2898 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.90.255:6443/api/v1/namespaces/default/events\": dial tcp 139.178.90.255:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4452.0.0-n-7ab43648c0.18638820646521fb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4452.0.0-n-7ab43648c0,UID:ci-4452.0.0-n-7ab43648c0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4452.0.0-n-7ab43648c0,},FirstTimestamp:2025-09-09 06:05:52.309305851 +0000 UTC m=+0.256791044,LastTimestamp:2025-09-09 06:05:52.309305851 +0000 UTC m=+0.256791044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4452.0.0-n-7ab43648c0,}"
Sep 9 06:05:52.321653 kubelet[2898]: I0909 06:05:52.321635 2898 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 06:05:52.321653 kubelet[2898]: I0909 06:05:52.321646 2898 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 06:05:52.321653 kubelet[2898]: I0909 06:05:52.321656 2898 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 06:05:52.322580 kubelet[2898]: I0909 06:05:52.322571 2898 policy_none.go:49] "None policy: Start"
Sep 9 06:05:52.322580 kubelet[2898]: I0909 06:05:52.322580 2898 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 06:05:52.322625 kubelet[2898]: I0909 06:05:52.322586 2898 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 06:05:52.322625 kubelet[2898]: I0909 06:05:52.322581 2898 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 06:05:52.323314 kubelet[2898]: I0909 06:05:52.323277 2898 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 06:05:52.323314 kubelet[2898]: I0909 06:05:52.323290 2898 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 06:05:52.323314 kubelet[2898]: I0909 06:05:52.323301 2898 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 06:05:52.323314 kubelet[2898]: I0909 06:05:52.323306 2898 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 9 06:05:52.323388 kubelet[2898]: E0909 06:05:52.323329 2898 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 06:05:52.323634 kubelet[2898]: W0909 06:05:52.323610 2898 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.90.255:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.90.255:6443: connect: connection refused
Sep 9 06:05:52.323661 kubelet[2898]: E0909 06:05:52.323643 2898 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.90.255:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.90.255:6443: connect: connection refused" logger="UnhandledError"
Sep 9 06:05:52.325290 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Sep 9 06:05:52.340307 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Sep 9 06:05:52.342054 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Sep 9 06:05:52.351343 kubelet[2898]: I0909 06:05:52.351294 2898 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 06:05:52.351416 kubelet[2898]: I0909 06:05:52.351405 2898 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 06:05:52.351464 kubelet[2898]: I0909 06:05:52.351414 2898 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 06:05:52.351550 kubelet[2898]: I0909 06:05:52.351537 2898 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 06:05:52.351925 kubelet[2898]: E0909 06:05:52.351909 2898 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 9 06:05:52.351970 kubelet[2898]: E0909 06:05:52.351937 2898 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4452.0.0-n-7ab43648c0\" not found"
Sep 9 06:05:52.433811 systemd[1]: Created slice kubepods-burstable-pod048cfb8f6621b4d88ac73fe70b73f053.slice - libcontainer container kubepods-burstable-pod048cfb8f6621b4d88ac73fe70b73f053.slice.
Sep 9 06:05:52.454285 kubelet[2898]: I0909 06:05:52.454229 2898 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.455026 kubelet[2898]: E0909 06:05:52.454951 2898 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.90.255:6443/api/v1/nodes\": dial tcp 139.178.90.255:6443: connect: connection refused" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.456065 kubelet[2898]: E0909 06:05:52.456005 2898 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-7ab43648c0\" not found" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.464470 systemd[1]: Created slice kubepods-burstable-pod6970889dabddb934d8ae72d04137bfde.slice - libcontainer container kubepods-burstable-pod6970889dabddb934d8ae72d04137bfde.slice.
Sep 9 06:05:52.477999 kubelet[2898]: E0909 06:05:52.477899 2898 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-7ab43648c0\" not found" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.485773 systemd[1]: Created slice kubepods-burstable-pod8c751ca14c76a347ad89994351fb061b.slice - libcontainer container kubepods-burstable-pod8c751ca14c76a347ad89994351fb061b.slice.
Sep 9 06:05:52.490245 kubelet[2898]: E0909 06:05:52.490159 2898 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-7ab43648c0\" not found" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.515151 kubelet[2898]: E0909 06:05:52.515030 2898 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.90.255:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452.0.0-n-7ab43648c0?timeout=10s\": dial tcp 139.178.90.255:6443: connect: connection refused" interval="400ms"
Sep 9 06:05:52.614790 kubelet[2898]: I0909 06:05:52.614650 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-flexvolume-dir\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.614790 kubelet[2898]: I0909 06:05:52.614770 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-kubeconfig\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.615164 kubelet[2898]: I0909 06:05:52.614831 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.615164 kubelet[2898]: I0909 06:05:52.614902 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8c751ca14c76a347ad89994351fb061b-kubeconfig\") pod \"kube-scheduler-ci-4452.0.0-n-7ab43648c0\" (UID: \"8c751ca14c76a347ad89994351fb061b\") " pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.615164 kubelet[2898]: I0909 06:05:52.614951 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/048cfb8f6621b4d88ac73fe70b73f053-ca-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" (UID: \"048cfb8f6621b4d88ac73fe70b73f053\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.615164 kubelet[2898]: I0909 06:05:52.615045 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/048cfb8f6621b4d88ac73fe70b73f053-k8s-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" (UID: \"048cfb8f6621b4d88ac73fe70b73f053\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.615164 kubelet[2898]: I0909 06:05:52.615132 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/048cfb8f6621b4d88ac73fe70b73f053-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" (UID: \"048cfb8f6621b4d88ac73fe70b73f053\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.615548 kubelet[2898]: I0909 06:05:52.615193 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-ca-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.615548 kubelet[2898]: I0909 06:05:52.615273 2898 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-k8s-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.659697 kubelet[2898]: I0909 06:05:52.659588 2898 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.660422 kubelet[2898]: E0909 06:05:52.660314 2898 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.90.255:6443/api/v1/nodes\": dial tcp 139.178.90.255:6443: connect: connection refused" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:52.758364 containerd[1949]: time="2025-09-09T06:05:52.758234447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452.0.0-n-7ab43648c0,Uid:048cfb8f6621b4d88ac73fe70b73f053,Namespace:kube-system,Attempt:0,}"
Sep 9 06:05:52.767461 containerd[1949]: time="2025-09-09T06:05:52.767442470Z" level=info msg="connecting to shim b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c" address="unix:///run/containerd/s/560338ed9753464d1d3955613656ae164c4979971a6621da9a44da114d729626" namespace=k8s.io protocol=ttrpc version=3
Sep 9 06:05:52.778956 containerd[1949]: time="2025-09-09T06:05:52.778932605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452.0.0-n-7ab43648c0,Uid:6970889dabddb934d8ae72d04137bfde,Namespace:kube-system,Attempt:0,}"
Sep 9 06:05:52.786378 containerd[1949]: time="2025-09-09T06:05:52.786330787Z" level=info msg="connecting to shim fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f"
address="unix:///run/containerd/s/e69b37851b657f144baa8ac3cee602e5c4f76aaae3358d062916a0f71fb434bc" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:05:52.786843 systemd[1]: Started cri-containerd-b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c.scope - libcontainer container b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c. Sep 9 06:05:52.791886 containerd[1949]: time="2025-09-09T06:05:52.791861519Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452.0.0-n-7ab43648c0,Uid:8c751ca14c76a347ad89994351fb061b,Namespace:kube-system,Attempt:0,}" Sep 9 06:05:52.794462 systemd[1]: Started cri-containerd-fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f.scope - libcontainer container fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f. Sep 9 06:05:52.799668 containerd[1949]: time="2025-09-09T06:05:52.799615622Z" level=info msg="connecting to shim dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2" address="unix:///run/containerd/s/3700ccb55e2ca7e2e7e36b1ad501afd82f39bd7cc38669addc84278d4e4788f2" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:05:52.807955 systemd[1]: Started cri-containerd-dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2.scope - libcontainer container dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2. 
Sep 9 06:05:52.815428 containerd[1949]: time="2025-09-09T06:05:52.815408715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452.0.0-n-7ab43648c0,Uid:048cfb8f6621b4d88ac73fe70b73f053,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c\"" Sep 9 06:05:52.816910 containerd[1949]: time="2025-09-09T06:05:52.816893889Z" level=info msg="CreateContainer within sandbox \"b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 06:05:52.820373 containerd[1949]: time="2025-09-09T06:05:52.820354210Z" level=info msg="Container 33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:05:52.821252 containerd[1949]: time="2025-09-09T06:05:52.821236493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452.0.0-n-7ab43648c0,Uid:6970889dabddb934d8ae72d04137bfde,Namespace:kube-system,Attempt:0,} returns sandbox id \"fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f\"" Sep 9 06:05:52.822353 containerd[1949]: time="2025-09-09T06:05:52.822339904Z" level=info msg="CreateContainer within sandbox \"fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 06:05:52.823001 containerd[1949]: time="2025-09-09T06:05:52.822964021Z" level=info msg="CreateContainer within sandbox \"b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8\"" Sep 9 06:05:52.823185 containerd[1949]: time="2025-09-09T06:05:52.823175940Z" level=info msg="StartContainer for \"33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8\"" Sep 9 06:05:52.823789 containerd[1949]: 
time="2025-09-09T06:05:52.823746850Z" level=info msg="connecting to shim 33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8" address="unix:///run/containerd/s/560338ed9753464d1d3955613656ae164c4979971a6621da9a44da114d729626" protocol=ttrpc version=3 Sep 9 06:05:52.825491 containerd[1949]: time="2025-09-09T06:05:52.825475564Z" level=info msg="Container 92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:05:52.829710 containerd[1949]: time="2025-09-09T06:05:52.829035623Z" level=info msg="CreateContainer within sandbox \"fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9\"" Sep 9 06:05:52.830053 containerd[1949]: time="2025-09-09T06:05:52.830038600Z" level=info msg="StartContainer for \"92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9\"" Sep 9 06:05:52.830712 containerd[1949]: time="2025-09-09T06:05:52.830697095Z" level=info msg="connecting to shim 92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9" address="unix:///run/containerd/s/e69b37851b657f144baa8ac3cee602e5c4f76aaae3358d062916a0f71fb434bc" protocol=ttrpc version=3 Sep 9 06:05:52.837846 systemd[1]: Started cri-containerd-33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8.scope - libcontainer container 33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8. 
Sep 9 06:05:52.839031 containerd[1949]: time="2025-09-09T06:05:52.838986835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452.0.0-n-7ab43648c0,Uid:8c751ca14c76a347ad89994351fb061b,Namespace:kube-system,Attempt:0,} returns sandbox id \"dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2\"" Sep 9 06:05:52.839872 systemd[1]: Started cri-containerd-92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9.scope - libcontainer container 92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9. Sep 9 06:05:52.840194 containerd[1949]: time="2025-09-09T06:05:52.840179538Z" level=info msg="CreateContainer within sandbox \"dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 06:05:52.843006 containerd[1949]: time="2025-09-09T06:05:52.842967625Z" level=info msg="Container f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:05:52.846065 containerd[1949]: time="2025-09-09T06:05:52.846051104Z" level=info msg="CreateContainer within sandbox \"dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9\"" Sep 9 06:05:52.846326 containerd[1949]: time="2025-09-09T06:05:52.846315148Z" level=info msg="StartContainer for \"f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9\"" Sep 9 06:05:52.846845 containerd[1949]: time="2025-09-09T06:05:52.846834277Z" level=info msg="connecting to shim f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9" address="unix:///run/containerd/s/3700ccb55e2ca7e2e7e36b1ad501afd82f39bd7cc38669addc84278d4e4788f2" protocol=ttrpc version=3 Sep 9 06:05:52.853505 systemd[1]: Started cri-containerd-f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9.scope - libcontainer 
container f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9. Sep 9 06:05:52.866801 containerd[1949]: time="2025-09-09T06:05:52.866706331Z" level=info msg="StartContainer for \"33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8\" returns successfully" Sep 9 06:05:52.869203 containerd[1949]: time="2025-09-09T06:05:52.869187198Z" level=info msg="StartContainer for \"92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9\" returns successfully" Sep 9 06:05:52.883162 containerd[1949]: time="2025-09-09T06:05:52.883137810Z" level=info msg="StartContainer for \"f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9\" returns successfully" Sep 9 06:05:53.062026 kubelet[2898]: I0909 06:05:53.061983 2898 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.326939 kubelet[2898]: E0909 06:05:53.326885 2898 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-7ab43648c0\" not found" node="ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.327167 kubelet[2898]: E0909 06:05:53.327110 2898 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-7ab43648c0\" not found" node="ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.327756 kubelet[2898]: E0909 06:05:53.327746 2898 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452.0.0-n-7ab43648c0\" not found" node="ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.537048 kubelet[2898]: E0909 06:05:53.537013 2898 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4452.0.0-n-7ab43648c0\" not found" node="ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.644187 kubelet[2898]: I0909 06:05:53.644092 2898 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.713640 kubelet[2898]: 
I0909 06:05:53.713521 2898 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.723577 kubelet[2898]: E0909 06:05:53.723503 2898 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.723577 kubelet[2898]: I0909 06:05:53.723564 2898 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.728179 kubelet[2898]: E0909 06:05:53.728120 2898 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.728179 kubelet[2898]: I0909 06:05:53.728176 2898 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:53.732509 kubelet[2898]: E0909 06:05:53.732419 2898 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4452.0.0-n-7ab43648c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:54.305040 kubelet[2898]: I0909 06:05:54.304987 2898 apiserver.go:52] "Watching apiserver" Sep 9 06:05:54.314590 kubelet[2898]: I0909 06:05:54.314552 2898 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 06:05:54.329219 kubelet[2898]: I0909 06:05:54.329178 2898 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:54.329899 kubelet[2898]: I0909 06:05:54.329364 2898 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:54.332766 kubelet[2898]: E0909 06:05:54.332700 2898 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4452.0.0-n-7ab43648c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:54.333148 kubelet[2898]: E0909 06:05:54.333099 2898 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:55.997116 systemd[1]: Reload requested from client PID 3214 ('systemctl') (unit session-11.scope)... Sep 9 06:05:55.997123 systemd[1]: Reloading... Sep 9 06:05:56.038681 zram_generator::config[3259]: No configuration found. Sep 9 06:05:56.200841 systemd[1]: Reloading finished in 203 ms. Sep 9 06:05:56.225250 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 06:05:56.236593 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 06:05:56.237082 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 06:05:56.237181 systemd[1]: kubelet.service: Consumed 766ms CPU time, 135.5M memory peak. Sep 9 06:05:56.240527 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 06:05:56.516989 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 06:05:56.519382 (kubelet)[3324]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 06:05:56.540338 kubelet[3324]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 06:05:56.540338 kubelet[3324]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 06:05:56.540338 kubelet[3324]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 06:05:56.540652 kubelet[3324]: I0909 06:05:56.540384 3324 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 06:05:56.544298 kubelet[3324]: I0909 06:05:56.544265 3324 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 06:05:56.544298 kubelet[3324]: I0909 06:05:56.544293 3324 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 06:05:56.544648 kubelet[3324]: I0909 06:05:56.544640 3324 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 06:05:56.545359 kubelet[3324]: I0909 06:05:56.545321 3324 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 06:05:56.546517 kubelet[3324]: I0909 06:05:56.546509 3324 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 06:05:56.548366 kubelet[3324]: I0909 06:05:56.548355 3324 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 06:05:56.555146 kubelet[3324]: I0909 06:05:56.555131 3324 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 06:05:56.555378 kubelet[3324]: I0909 06:05:56.555331 3324 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 06:05:56.556305 kubelet[3324]: I0909 06:05:56.555376 3324 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452.0.0-n-7ab43648c0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 06:05:56.556396 kubelet[3324]: I0909 06:05:56.556311 3324 topology_manager.go:138] "Creating topology manager with 
none policy" Sep 9 06:05:56.556468 kubelet[3324]: I0909 06:05:56.556459 3324 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 06:05:56.556505 kubelet[3324]: I0909 06:05:56.556501 3324 state_mem.go:36] "Initialized new in-memory state store" Sep 9 06:05:56.556642 kubelet[3324]: I0909 06:05:56.556638 3324 kubelet.go:446] "Attempting to sync node with API server" Sep 9 06:05:56.556663 kubelet[3324]: I0909 06:05:56.556649 3324 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 06:05:56.556709 kubelet[3324]: I0909 06:05:56.556663 3324 kubelet.go:352] "Adding apiserver pod source" Sep 9 06:05:56.556709 kubelet[3324]: I0909 06:05:56.556674 3324 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 06:05:56.557314 kubelet[3324]: I0909 06:05:56.557306 3324 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 06:05:56.557561 kubelet[3324]: I0909 06:05:56.557556 3324 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 06:05:56.557905 kubelet[3324]: I0909 06:05:56.557877 3324 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 06:05:56.557950 kubelet[3324]: I0909 06:05:56.557910 3324 server.go:1287] "Started kubelet" Sep 9 06:05:56.558010 kubelet[3324]: I0909 06:05:56.557965 3324 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 06:05:56.558010 kubelet[3324]: I0909 06:05:56.557977 3324 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 06:05:56.558167 kubelet[3324]: I0909 06:05:56.558158 3324 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 06:05:56.558824 kubelet[3324]: E0909 06:05:56.558816 3324 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 06:05:56.558859 kubelet[3324]: I0909 06:05:56.558849 3324 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 06:05:56.558900 kubelet[3324]: I0909 06:05:56.558889 3324 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 06:05:56.558929 kubelet[3324]: I0909 06:05:56.558922 3324 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 06:05:56.558960 kubelet[3324]: I0909 06:05:56.558939 3324 server.go:479] "Adding debug handlers to kubelet server" Sep 9 06:05:56.558960 kubelet[3324]: I0909 06:05:56.558952 3324 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 06:05:56.559100 kubelet[3324]: E0909 06:05:56.559073 3324 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452.0.0-n-7ab43648c0\" not found" Sep 9 06:05:56.559139 kubelet[3324]: I0909 06:05:56.559105 3324 reconciler.go:26] "Reconciler: start to sync state" Sep 9 06:05:56.560983 kubelet[3324]: I0909 06:05:56.560951 3324 factory.go:221] Registration of the systemd container factory successfully Sep 9 06:05:56.561223 kubelet[3324]: I0909 06:05:56.561202 3324 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 06:05:56.563174 kubelet[3324]: I0909 06:05:56.563163 3324 factory.go:221] Registration of the containerd container factory successfully Sep 9 06:05:56.565741 kubelet[3324]: I0909 06:05:56.565675 3324 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 06:05:56.566393 kubelet[3324]: I0909 06:05:56.566382 3324 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 06:05:56.566439 kubelet[3324]: I0909 06:05:56.566397 3324 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 06:05:56.566439 kubelet[3324]: I0909 06:05:56.566410 3324 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 06:05:56.566439 kubelet[3324]: I0909 06:05:56.566416 3324 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 06:05:56.566526 kubelet[3324]: E0909 06:05:56.566443 3324 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 06:05:56.578008 kubelet[3324]: I0909 06:05:56.577979 3324 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 06:05:56.578008 kubelet[3324]: I0909 06:05:56.578004 3324 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 06:05:56.578008 kubelet[3324]: I0909 06:05:56.578015 3324 state_mem.go:36] "Initialized new in-memory state store" Sep 9 06:05:56.578119 kubelet[3324]: I0909 06:05:56.578104 3324 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 06:05:56.578119 kubelet[3324]: I0909 06:05:56.578110 3324 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 06:05:56.578151 kubelet[3324]: I0909 06:05:56.578120 3324 policy_none.go:49] "None policy: Start" Sep 9 06:05:56.578151 kubelet[3324]: I0909 06:05:56.578125 3324 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 06:05:56.578151 kubelet[3324]: I0909 06:05:56.578130 3324 state_mem.go:35] "Initializing new in-memory state store" Sep 9 06:05:56.578192 kubelet[3324]: I0909 06:05:56.578185 3324 state_mem.go:75] "Updated machine memory state" Sep 9 06:05:56.580431 kubelet[3324]: I0909 06:05:56.580423 3324 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 06:05:56.580529 kubelet[3324]: I0909 06:05:56.580524 
3324 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 06:05:56.580564 kubelet[3324]: I0909 06:05:56.580546 3324 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 06:05:56.580645 kubelet[3324]: I0909 06:05:56.580640 3324 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 06:05:56.581125 kubelet[3324]: E0909 06:05:56.581116 3324 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 06:05:56.668146 kubelet[3324]: I0909 06:05:56.668040 3324 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:56.668146 kubelet[3324]: I0909 06:05:56.668149 3324 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:56.668513 kubelet[3324]: I0909 06:05:56.668245 3324 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:56.675804 kubelet[3324]: W0909 06:05:56.675737 3324 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 06:05:56.676036 kubelet[3324]: W0909 06:05:56.675842 3324 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 06:05:56.676036 kubelet[3324]: W0909 06:05:56.675982 3324 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Sep 9 06:05:56.687232 kubelet[3324]: I0909 06:05:56.687188 3324 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452.0.0-n-7ab43648c0" Sep 9 06:05:56.697018 kubelet[3324]: I0909 
06:05:56.696971 3324 kubelet_node_status.go:124] "Node was previously registered" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.697194 kubelet[3324]: I0909 06:05:56.697106 3324 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.759986 kubelet[3324]: I0909 06:05:56.759850 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/048cfb8f6621b4d88ac73fe70b73f053-ca-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" (UID: \"048cfb8f6621b4d88ac73fe70b73f053\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.759986 kubelet[3324]: I0909 06:05:56.759945 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/048cfb8f6621b4d88ac73fe70b73f053-k8s-certs\") pod \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" (UID: \"048cfb8f6621b4d88ac73fe70b73f053\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.760360 kubelet[3324]: I0909 06:05:56.760010 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/048cfb8f6621b4d88ac73fe70b73f053-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" (UID: \"048cfb8f6621b4d88ac73fe70b73f053\") " pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.861333 kubelet[3324]: I0909 06:05:56.861261 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-flexvolume-dir\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.861601 kubelet[3324]: I0909 06:05:56.861365 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8c751ca14c76a347ad89994351fb061b-kubeconfig\") pod \"kube-scheduler-ci-4452.0.0-n-7ab43648c0\" (UID: \"8c751ca14c76a347ad89994351fb061b\") " pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.861601 kubelet[3324]: I0909 06:05:56.861455 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-k8s-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.861601 kubelet[3324]: I0909 06:05:56.861529 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-kubeconfig\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.861995 kubelet[3324]: I0909 06:05:56.861615 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:56.861995 kubelet[3324]: I0909 06:05:56.861850 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6970889dabddb934d8ae72d04137bfde-ca-certs\") pod \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" (UID: \"6970889dabddb934d8ae72d04137bfde\") " pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:57.557568 kubelet[3324]: I0909 06:05:57.557459 3324 apiserver.go:52] "Watching apiserver"
Sep 9 06:05:57.573166 kubelet[3324]: I0909 06:05:57.573136 3324 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:57.573452 kubelet[3324]: I0909 06:05:57.573136 3324 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:57.573452 kubelet[3324]: I0909 06:05:57.573284 3324 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:57.577547 kubelet[3324]: W0909 06:05:57.577527 3324 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 9 06:05:57.577547 kubelet[3324]: W0909 06:05:57.577528 3324 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 9 06:05:57.577683 kubelet[3324]: E0909 06:05:57.577576 3324 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452.0.0-n-7ab43648c0\" already exists" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:57.577683 kubelet[3324]: E0909 06:05:57.577599 3324 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4452.0.0-n-7ab43648c0\" already exists" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:57.577730 kubelet[3324]: W0909 06:05:57.577724 3324 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
Sep 9 06:05:57.577756 kubelet[3324]: E0909 06:05:57.577749 3324 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4452.0.0-n-7ab43648c0\" already exists" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0"
Sep 9 06:05:57.599095 kubelet[3324]: I0909 06:05:57.599044 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4452.0.0-n-7ab43648c0" podStartSLOduration=1.599030121 podStartE2EDuration="1.599030121s" podCreationTimestamp="2025-09-09 06:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 06:05:57.599005001 +0000 UTC m=+1.077650067" watchObservedRunningTime="2025-09-09 06:05:57.599030121 +0000 UTC m=+1.077675180"
Sep 9 06:05:57.613506 kubelet[3324]: I0909 06:05:57.613468 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4452.0.0-n-7ab43648c0" podStartSLOduration=1.613451574 podStartE2EDuration="1.613451574s" podCreationTimestamp="2025-09-09 06:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 06:05:57.607364602 +0000 UTC m=+1.086009667" watchObservedRunningTime="2025-09-09 06:05:57.613451574 +0000 UTC m=+1.092096633"
Sep 9 06:05:57.618551 kubelet[3324]: I0909 06:05:57.618528 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4452.0.0-n-7ab43648c0" podStartSLOduration=1.618518702 podStartE2EDuration="1.618518702s" podCreationTimestamp="2025-09-09 06:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 06:05:57.613560665 +0000 UTC m=+1.092205725" watchObservedRunningTime="2025-09-09 06:05:57.618518702 +0000 UTC m=+1.097163762"
Sep 9 06:05:57.660095 kubelet[3324]: I0909 06:05:57.660046 3324 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 9 06:06:01.559128 kubelet[3324]: I0909 06:06:01.559029 3324 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 9 06:06:01.559982 containerd[1949]: time="2025-09-09T06:06:01.559724898Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 9 06:06:01.560571 kubelet[3324]: I0909 06:06:01.560141 3324 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 9 06:06:02.284984 systemd[1]: Created slice kubepods-besteffort-pod67d4528a_8c40_42f3_a5e8_eeaffb53617b.slice - libcontainer container kubepods-besteffort-pod67d4528a_8c40_42f3_a5e8_eeaffb53617b.slice.
Sep 9 06:06:02.299702 kubelet[3324]: I0909 06:06:02.299646 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/67d4528a-8c40-42f3-a5e8-eeaffb53617b-lib-modules\") pod \"kube-proxy-jmwtr\" (UID: \"67d4528a-8c40-42f3-a5e8-eeaffb53617b\") " pod="kube-system/kube-proxy-jmwtr"
Sep 9 06:06:02.299897 kubelet[3324]: I0909 06:06:02.299715 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkpw5\" (UniqueName: \"kubernetes.io/projected/67d4528a-8c40-42f3-a5e8-eeaffb53617b-kube-api-access-dkpw5\") pod \"kube-proxy-jmwtr\" (UID: \"67d4528a-8c40-42f3-a5e8-eeaffb53617b\") " pod="kube-system/kube-proxy-jmwtr"
Sep 9 06:06:02.299897 kubelet[3324]: I0909 06:06:02.299750 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/67d4528a-8c40-42f3-a5e8-eeaffb53617b-kube-proxy\") pod \"kube-proxy-jmwtr\" (UID: \"67d4528a-8c40-42f3-a5e8-eeaffb53617b\") " pod="kube-system/kube-proxy-jmwtr"
Sep 9 06:06:02.299897 kubelet[3324]: I0909 06:06:02.299774 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/67d4528a-8c40-42f3-a5e8-eeaffb53617b-xtables-lock\") pod \"kube-proxy-jmwtr\" (UID: \"67d4528a-8c40-42f3-a5e8-eeaffb53617b\") " pod="kube-system/kube-proxy-jmwtr"
Sep 9 06:06:02.413088 kubelet[3324]: E0909 06:06:02.412992 3324 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found
Sep 9 06:06:02.413088 kubelet[3324]: E0909 06:06:02.413053 3324 projected.go:194] Error preparing data for projected volume kube-api-access-dkpw5 for pod kube-system/kube-proxy-jmwtr: configmap "kube-root-ca.crt" not found
Sep 9 06:06:02.413445 kubelet[3324]: E0909 06:06:02.413178 3324 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/67d4528a-8c40-42f3-a5e8-eeaffb53617b-kube-api-access-dkpw5 podName:67d4528a-8c40-42f3-a5e8-eeaffb53617b nodeName:}" failed. No retries permitted until 2025-09-09 06:06:02.91313038 +0000 UTC m=+6.391775505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dkpw5" (UniqueName: "kubernetes.io/projected/67d4528a-8c40-42f3-a5e8-eeaffb53617b-kube-api-access-dkpw5") pod "kube-proxy-jmwtr" (UID: "67d4528a-8c40-42f3-a5e8-eeaffb53617b") : configmap "kube-root-ca.crt" not found
Sep 9 06:06:02.629086 systemd[1]: Created slice kubepods-besteffort-pod0aaa330e_2862_4b0b_8004_2c379ce5b230.slice - libcontainer container kubepods-besteffort-pod0aaa330e_2862_4b0b_8004_2c379ce5b230.slice.
Sep 9 06:06:02.702323 kubelet[3324]: I0909 06:06:02.702228 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jsfb\" (UniqueName: \"kubernetes.io/projected/0aaa330e-2862-4b0b-8004-2c379ce5b230-kube-api-access-6jsfb\") pod \"tigera-operator-755d956888-kgfh2\" (UID: \"0aaa330e-2862-4b0b-8004-2c379ce5b230\") " pod="tigera-operator/tigera-operator-755d956888-kgfh2"
Sep 9 06:06:02.702323 kubelet[3324]: I0909 06:06:02.702333 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0aaa330e-2862-4b0b-8004-2c379ce5b230-var-lib-calico\") pod \"tigera-operator-755d956888-kgfh2\" (UID: \"0aaa330e-2862-4b0b-8004-2c379ce5b230\") " pod="tigera-operator/tigera-operator-755d956888-kgfh2"
Sep 9 06:06:02.935696 containerd[1949]: time="2025-09-09T06:06:02.935458167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-kgfh2,Uid:0aaa330e-2862-4b0b-8004-2c379ce5b230,Namespace:tigera-operator,Attempt:0,}"
Sep 9 06:06:02.944984 containerd[1949]: time="2025-09-09T06:06:02.944955119Z" level=info msg="connecting to shim 783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73" address="unix:///run/containerd/s/f85227e75bd6e9961e4451dadb2c3aff28d6ca3c9ccb9b5906f7713eb4770f7c" namespace=k8s.io protocol=ttrpc version=3
Sep 9 06:06:02.967861 systemd[1]: Started cri-containerd-783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73.scope - libcontainer container 783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73.
Sep 9 06:06:03.003105 containerd[1949]: time="2025-09-09T06:06:03.003082028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-kgfh2,Uid:0aaa330e-2862-4b0b-8004-2c379ce5b230,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73\""
Sep 9 06:06:03.003872 containerd[1949]: time="2025-09-09T06:06:03.003860735Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 06:06:03.205294 containerd[1949]: time="2025-09-09T06:06:03.205065488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jmwtr,Uid:67d4528a-8c40-42f3-a5e8-eeaffb53617b,Namespace:kube-system,Attempt:0,}"
Sep 9 06:06:03.213373 containerd[1949]: time="2025-09-09T06:06:03.213334650Z" level=info msg="connecting to shim f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1" address="unix:///run/containerd/s/538259bca24aad1ec194af095df601e44422bfbff9c87d56843b3045cbabae36" namespace=k8s.io protocol=ttrpc version=3
Sep 9 06:06:03.233041 systemd[1]: Started cri-containerd-f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1.scope - libcontainer container f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1.
Sep 9 06:06:03.252101 containerd[1949]: time="2025-09-09T06:06:03.252079489Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jmwtr,Uid:67d4528a-8c40-42f3-a5e8-eeaffb53617b,Namespace:kube-system,Attempt:0,} returns sandbox id \"f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1\""
Sep 9 06:06:03.253283 containerd[1949]: time="2025-09-09T06:06:03.253270847Z" level=info msg="CreateContainer within sandbox \"f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 06:06:03.257318 containerd[1949]: time="2025-09-09T06:06:03.257278441Z" level=info msg="Container 1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946: CDI devices from CRI Config.CDIDevices: []"
Sep 9 06:06:03.260833 containerd[1949]: time="2025-09-09T06:06:03.260790506Z" level=info msg="CreateContainer within sandbox \"f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946\""
Sep 9 06:06:03.261203 containerd[1949]: time="2025-09-09T06:06:03.261152509Z" level=info msg="StartContainer for \"1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946\""
Sep 9 06:06:03.262083 containerd[1949]: time="2025-09-09T06:06:03.262026636Z" level=info msg="connecting to shim 1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946" address="unix:///run/containerd/s/538259bca24aad1ec194af095df601e44422bfbff9c87d56843b3045cbabae36" protocol=ttrpc version=3
Sep 9 06:06:03.283060 systemd[1]: Started cri-containerd-1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946.scope - libcontainer container 1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946.
Sep 9 06:06:03.314316 containerd[1949]: time="2025-09-09T06:06:03.314292644Z" level=info msg="StartContainer for \"1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946\" returns successfully"
Sep 9 06:06:03.609285 kubelet[3324]: I0909 06:06:03.609123 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jmwtr" podStartSLOduration=1.609073636 podStartE2EDuration="1.609073636s" podCreationTimestamp="2025-09-09 06:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 06:06:03.608963864 +0000 UTC m=+7.087609015" watchObservedRunningTime="2025-09-09 06:06:03.609073636 +0000 UTC m=+7.087718783"
Sep 9 06:06:04.439229 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3076780327.mount: Deactivated successfully.
Sep 9 06:06:04.686913 containerd[1949]: time="2025-09-09T06:06:04.686858170Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:06:04.687139 containerd[1949]: time="2025-09-09T06:06:04.686971052Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 9 06:06:04.687380 containerd[1949]: time="2025-09-09T06:06:04.687365820Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:06:04.688649 containerd[1949]: time="2025-09-09T06:06:04.688621165Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:06:04.689418 containerd[1949]: time="2025-09-09T06:06:04.689332456Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.685452154s"
Sep 9 06:06:04.689418 containerd[1949]: time="2025-09-09T06:06:04.689359587Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 9 06:06:04.690544 containerd[1949]: time="2025-09-09T06:06:04.690531305Z" level=info msg="CreateContainer within sandbox \"783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 06:06:04.693438 containerd[1949]: time="2025-09-09T06:06:04.693394531Z" level=info msg="Container e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330: CDI devices from CRI Config.CDIDevices: []"
Sep 9 06:06:04.695694 containerd[1949]: time="2025-09-09T06:06:04.695649556Z" level=info msg="CreateContainer within sandbox \"783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330\""
Sep 9 06:06:04.695902 containerd[1949]: time="2025-09-09T06:06:04.695860496Z" level=info msg="StartContainer for \"e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330\""
Sep 9 06:06:04.696253 containerd[1949]: time="2025-09-09T06:06:04.696211129Z" level=info msg="connecting to shim e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330" address="unix:///run/containerd/s/f85227e75bd6e9961e4451dadb2c3aff28d6ca3c9ccb9b5906f7713eb4770f7c" protocol=ttrpc version=3
Sep 9 06:06:04.713946 systemd[1]: Started cri-containerd-e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330.scope - libcontainer container e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330.
Sep 9 06:06:04.726220 containerd[1949]: time="2025-09-09T06:06:04.726165255Z" level=info msg="StartContainer for \"e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330\" returns successfully"
Sep 9 06:06:05.613448 kubelet[3324]: I0909 06:06:05.613391 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-kgfh2" podStartSLOduration=1.9270836660000001 podStartE2EDuration="3.613367539s" podCreationTimestamp="2025-09-09 06:06:02 +0000 UTC" firstStartedPulling="2025-09-09 06:06:03.003666542 +0000 UTC m=+6.482311602" lastFinishedPulling="2025-09-09 06:06:04.689950411 +0000 UTC m=+8.168595475" observedRunningTime="2025-09-09 06:06:05.613273453 +0000 UTC m=+9.091918516" watchObservedRunningTime="2025-09-09 06:06:05.613367539 +0000 UTC m=+9.092012596"
Sep 9 06:06:07.872101 update_engine[1943]: I20250909 06:06:07.871943 1943 update_attempter.cc:509] Updating boot flags...
Sep 9 06:06:09.054176 sudo[2261]: pam_unix(sudo:session): session closed for user root
Sep 9 06:06:09.055143 sshd[2260]: Connection closed by 139.178.89.65 port 34366
Sep 9 06:06:09.055350 sshd-session[2257]: pam_unix(sshd:session): session closed for user core
Sep 9 06:06:09.058017 systemd[1]: sshd@8-139.178.90.255:22-139.178.89.65:34366.service: Deactivated successfully.
Sep 9 06:06:09.059143 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 06:06:09.059260 systemd[1]: session-11.scope: Consumed 3.903s CPU time, 235.6M memory peak.
Sep 9 06:06:09.060067 systemd-logind[1938]: Session 11 logged out. Waiting for processes to exit.
Sep 9 06:06:09.060722 systemd-logind[1938]: Removed session 11.
Sep 9 06:06:11.130262 systemd[1]: Created slice kubepods-besteffort-pod148ff7ed_a6ad_439b_94a7_44be7f029476.slice - libcontainer container kubepods-besteffort-pod148ff7ed_a6ad_439b_94a7_44be7f029476.slice.
Sep 9 06:06:11.159341 kubelet[3324]: I0909 06:06:11.159243 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148ff7ed-a6ad-439b-94a7-44be7f029476-tigera-ca-bundle\") pod \"calico-typha-686d9c9bc4-8tjcs\" (UID: \"148ff7ed-a6ad-439b-94a7-44be7f029476\") " pod="calico-system/calico-typha-686d9c9bc4-8tjcs"
Sep 9 06:06:11.160386 kubelet[3324]: I0909 06:06:11.159375 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td5xt\" (UniqueName: \"kubernetes.io/projected/148ff7ed-a6ad-439b-94a7-44be7f029476-kube-api-access-td5xt\") pod \"calico-typha-686d9c9bc4-8tjcs\" (UID: \"148ff7ed-a6ad-439b-94a7-44be7f029476\") " pod="calico-system/calico-typha-686d9c9bc4-8tjcs"
Sep 9 06:06:11.160386 kubelet[3324]: I0909 06:06:11.159489 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/148ff7ed-a6ad-439b-94a7-44be7f029476-typha-certs\") pod \"calico-typha-686d9c9bc4-8tjcs\" (UID: \"148ff7ed-a6ad-439b-94a7-44be7f029476\") " pod="calico-system/calico-typha-686d9c9bc4-8tjcs"
Sep 9 06:06:11.434190 containerd[1949]: time="2025-09-09T06:06:11.433955033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-686d9c9bc4-8tjcs,Uid:148ff7ed-a6ad-439b-94a7-44be7f029476,Namespace:calico-system,Attempt:0,}"
Sep 9 06:06:11.441661 containerd[1949]: time="2025-09-09T06:06:11.441637938Z" level=info msg="connecting to shim f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d" address="unix:///run/containerd/s/7793fa33e5939deda924e08098dd987b29856b434ccec868ab17e7279e42a8d8" namespace=k8s.io protocol=ttrpc version=3
Sep 9 06:06:11.473269 systemd[1]: Started cri-containerd-f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d.scope - libcontainer container f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d.
Sep 9 06:06:11.544540 systemd[1]: Created slice kubepods-besteffort-pod0f59acca_8ece_4a8e_9704_7d1c0d7010a0.slice - libcontainer container kubepods-besteffort-pod0f59acca_8ece_4a8e_9704_7d1c0d7010a0.slice.
Sep 9 06:06:11.553503 containerd[1949]: time="2025-09-09T06:06:11.553481407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-686d9c9bc4-8tjcs,Uid:148ff7ed-a6ad-439b-94a7-44be7f029476,Namespace:calico-system,Attempt:0,} returns sandbox id \"f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d\""
Sep 9 06:06:11.554112 containerd[1949]: time="2025-09-09T06:06:11.554101341Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 9 06:06:11.562354 kubelet[3324]: I0909 06:06:11.562336 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-flexvol-driver-host\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562432 kubelet[3324]: I0909 06:06:11.562360 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-lib-modules\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562432 kubelet[3324]: I0909 06:06:11.562371 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-tigera-ca-bundle\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562432 kubelet[3324]: I0909 06:06:11.562381 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-cni-net-dir\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562432 kubelet[3324]: I0909 06:06:11.562391 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-cni-log-dir\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562432 kubelet[3324]: I0909 06:06:11.562404 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-node-certs\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562530 kubelet[3324]: I0909 06:06:11.562418 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-var-run-calico\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562530 kubelet[3324]: I0909 06:06:11.562432 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7ddp\" (UniqueName: \"kubernetes.io/projected/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-kube-api-access-z7ddp\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562530 kubelet[3324]: I0909 06:06:11.562443 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-xtables-lock\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562530 kubelet[3324]: I0909 06:06:11.562452 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-cni-bin-dir\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562530 kubelet[3324]: I0909 06:06:11.562461 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-policysync\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.562632 kubelet[3324]: I0909 06:06:11.562469 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0f59acca-8ece-4a8e-9704-7d1c0d7010a0-var-lib-calico\") pod \"calico-node-2lddc\" (UID: \"0f59acca-8ece-4a8e-9704-7d1c0d7010a0\") " pod="calico-system/calico-node-2lddc"
Sep 9 06:06:11.665739 kubelet[3324]: E0909 06:06:11.665645 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.665739 kubelet[3324]: W0909 06:06:11.665729 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.666097 kubelet[3324]: E0909 06:06:11.665804 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.670822 kubelet[3324]: E0909 06:06:11.670729 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.670822 kubelet[3324]: W0909 06:06:11.670772 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.670822 kubelet[3324]: E0909 06:06:11.670809 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.683292 kubelet[3324]: E0909 06:06:11.683206 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.683292 kubelet[3324]: W0909 06:06:11.683246 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.683292 kubelet[3324]: E0909 06:06:11.683284 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.787963 kubelet[3324]: E0909 06:06:11.787820 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lwj49" podUID="70ad17b8-206d-4ff7-a93b-019a4878fd81"
Sep 9 06:06:11.847465 containerd[1949]: time="2025-09-09T06:06:11.847375538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2lddc,Uid:0f59acca-8ece-4a8e-9704-7d1c0d7010a0,Namespace:calico-system,Attempt:0,}"
Sep 9 06:06:11.855267 containerd[1949]: time="2025-09-09T06:06:11.855236142Z" level=info msg="connecting to shim e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1" address="unix:///run/containerd/s/2b84bc2ff438664778015817c75d76061253c0e5ffac077831f3e2d42e92f39e" namespace=k8s.io protocol=ttrpc version=3
Sep 9 06:06:11.858863 kubelet[3324]: E0909 06:06:11.858831 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.858863 kubelet[3324]: W0909 06:06:11.858848 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859061 kubelet[3324]: E0909 06:06:11.858874 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859061 kubelet[3324]: E0909 06:06:11.858980 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859061 kubelet[3324]: W0909 06:06:11.858985 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859061 kubelet[3324]: E0909 06:06:11.858991 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859173 kubelet[3324]: E0909 06:06:11.859126 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859173 kubelet[3324]: W0909 06:06:11.859131 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859173 kubelet[3324]: E0909 06:06:11.859136 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859257 kubelet[3324]: E0909 06:06:11.859254 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859291 kubelet[3324]: W0909 06:06:11.859258 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859291 kubelet[3324]: E0909 06:06:11.859264 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859362 kubelet[3324]: E0909 06:06:11.859354 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859362 kubelet[3324]: W0909 06:06:11.859359 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859362 kubelet[3324]: E0909 06:06:11.859363 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859450 kubelet[3324]: E0909 06:06:11.859427 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859450 kubelet[3324]: W0909 06:06:11.859431 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859450 kubelet[3324]: E0909 06:06:11.859440 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859526 kubelet[3324]: E0909 06:06:11.859500 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859526 kubelet[3324]: W0909 06:06:11.859504 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859526 kubelet[3324]: E0909 06:06:11.859508 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859604 kubelet[3324]: E0909 06:06:11.859572 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859604 kubelet[3324]: W0909 06:06:11.859577 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859604 kubelet[3324]: E0909 06:06:11.859581 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859689 kubelet[3324]: E0909 06:06:11.859645 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859689 kubelet[3324]: W0909 06:06:11.859650 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859689 kubelet[3324]: E0909 06:06:11.859654 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859778 kubelet[3324]: E0909 06:06:11.859720 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859778 kubelet[3324]: W0909 06:06:11.859725 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859778 kubelet[3324]: E0909 06:06:11.859729 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 9 06:06:11.859855 kubelet[3324]: E0909 06:06:11.859786 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 06:06:11.859855 kubelet[3324]: W0909 06:06:11.859791 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 06:06:11.859855 kubelet[3324]: E0909 06:06:11.859795 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 9 06:06:11.859855 kubelet[3324]: E0909 06:06:11.859854 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.859961 kubelet[3324]: W0909 06:06:11.859858 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.859961 kubelet[3324]: E0909 06:06:11.859862 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.859961 kubelet[3324]: E0909 06:06:11.859924 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.859961 kubelet[3324]: W0909 06:06:11.859928 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.859961 kubelet[3324]: E0909 06:06:11.859933 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.860090 kubelet[3324]: E0909 06:06:11.860001 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.860090 kubelet[3324]: W0909 06:06:11.860006 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.860090 kubelet[3324]: E0909 06:06:11.860010 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.860090 kubelet[3324]: E0909 06:06:11.860080 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.860090 kubelet[3324]: W0909 06:06:11.860084 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.860090 kubelet[3324]: E0909 06:06:11.860088 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.860245 kubelet[3324]: E0909 06:06:11.860191 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.860245 kubelet[3324]: W0909 06:06:11.860195 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.860245 kubelet[3324]: E0909 06:06:11.860200 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.860322 kubelet[3324]: E0909 06:06:11.860292 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.860322 kubelet[3324]: W0909 06:06:11.860298 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.860322 kubelet[3324]: E0909 06:06:11.860306 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.860407 kubelet[3324]: E0909 06:06:11.860394 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.860407 kubelet[3324]: W0909 06:06:11.860399 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.860407 kubelet[3324]: E0909 06:06:11.860405 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.860520 kubelet[3324]: E0909 06:06:11.860493 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.860520 kubelet[3324]: W0909 06:06:11.860499 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.860520 kubelet[3324]: E0909 06:06:11.860506 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.860604 kubelet[3324]: E0909 06:06:11.860589 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.860604 kubelet[3324]: W0909 06:06:11.860594 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.860604 kubelet[3324]: E0909 06:06:11.860601 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.863833 kubelet[3324]: E0909 06:06:11.863818 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.863833 kubelet[3324]: W0909 06:06:11.863829 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.863921 kubelet[3324]: E0909 06:06:11.863841 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.863921 kubelet[3324]: I0909 06:06:11.863859 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/70ad17b8-206d-4ff7-a93b-019a4878fd81-varrun\") pod \"csi-node-driver-lwj49\" (UID: \"70ad17b8-206d-4ff7-a93b-019a4878fd81\") " pod="calico-system/csi-node-driver-lwj49" Sep 9 06:06:11.863957 kubelet[3324]: E0909 06:06:11.863946 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.863957 kubelet[3324]: W0909 06:06:11.863951 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.863989 kubelet[3324]: E0909 06:06:11.863958 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.863989 kubelet[3324]: I0909 06:06:11.863967 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smp2l\" (UniqueName: \"kubernetes.io/projected/70ad17b8-206d-4ff7-a93b-019a4878fd81-kube-api-access-smp2l\") pod \"csi-node-driver-lwj49\" (UID: \"70ad17b8-206d-4ff7-a93b-019a4878fd81\") " pod="calico-system/csi-node-driver-lwj49" Sep 9 06:06:11.864052 kubelet[3324]: E0909 06:06:11.864045 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864052 kubelet[3324]: W0909 06:06:11.864051 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864087 kubelet[3324]: E0909 06:06:11.864056 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.864087 kubelet[3324]: I0909 06:06:11.864064 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/70ad17b8-206d-4ff7-a93b-019a4878fd81-kubelet-dir\") pod \"csi-node-driver-lwj49\" (UID: \"70ad17b8-206d-4ff7-a93b-019a4878fd81\") " pod="calico-system/csi-node-driver-lwj49" Sep 9 06:06:11.864164 kubelet[3324]: E0909 06:06:11.864157 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864186 kubelet[3324]: W0909 06:06:11.864164 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864186 kubelet[3324]: E0909 06:06:11.864172 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.864243 kubelet[3324]: E0909 06:06:11.864238 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864262 kubelet[3324]: W0909 06:06:11.864243 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864262 kubelet[3324]: E0909 06:06:11.864248 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.864340 kubelet[3324]: E0909 06:06:11.864334 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864360 kubelet[3324]: W0909 06:06:11.864341 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864360 kubelet[3324]: E0909 06:06:11.864348 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.864422 kubelet[3324]: E0909 06:06:11.864416 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864438 kubelet[3324]: W0909 06:06:11.864422 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864438 kubelet[3324]: E0909 06:06:11.864428 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.864504 kubelet[3324]: E0909 06:06:11.864498 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864504 kubelet[3324]: W0909 06:06:11.864503 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864556 kubelet[3324]: E0909 06:06:11.864510 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.864556 kubelet[3324]: I0909 06:06:11.864522 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/70ad17b8-206d-4ff7-a93b-019a4878fd81-socket-dir\") pod \"csi-node-driver-lwj49\" (UID: \"70ad17b8-206d-4ff7-a93b-019a4878fd81\") " pod="calico-system/csi-node-driver-lwj49" Sep 9 06:06:11.864599 kubelet[3324]: E0909 06:06:11.864592 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864616 kubelet[3324]: W0909 06:06:11.864598 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864616 kubelet[3324]: E0909 06:06:11.864604 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.864616 kubelet[3324]: I0909 06:06:11.864612 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/70ad17b8-206d-4ff7-a93b-019a4878fd81-registration-dir\") pod \"csi-node-driver-lwj49\" (UID: \"70ad17b8-206d-4ff7-a93b-019a4878fd81\") " pod="calico-system/csi-node-driver-lwj49" Sep 9 06:06:11.864707 kubelet[3324]: E0909 06:06:11.864702 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864734 kubelet[3324]: W0909 06:06:11.864707 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864734 kubelet[3324]: E0909 06:06:11.864714 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.864810 kubelet[3324]: E0909 06:06:11.864802 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864810 kubelet[3324]: W0909 06:06:11.864808 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864855 kubelet[3324]: E0909 06:06:11.864816 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.864898 kubelet[3324]: E0909 06:06:11.864893 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864918 kubelet[3324]: W0909 06:06:11.864898 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.864918 kubelet[3324]: E0909 06:06:11.864905 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.864971 kubelet[3324]: E0909 06:06:11.864966 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.864971 kubelet[3324]: W0909 06:06:11.864971 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.865009 kubelet[3324]: E0909 06:06:11.864976 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.865055 kubelet[3324]: E0909 06:06:11.865049 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.865055 kubelet[3324]: W0909 06:06:11.865053 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.865113 kubelet[3324]: E0909 06:06:11.865058 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.865142 kubelet[3324]: E0909 06:06:11.865122 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.865142 kubelet[3324]: W0909 06:06:11.865126 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.865142 kubelet[3324]: E0909 06:06:11.865130 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.873848 systemd[1]: Started cri-containerd-e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1.scope - libcontainer container e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1. 
Sep 9 06:06:11.884363 containerd[1949]: time="2025-09-09T06:06:11.884311664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2lddc,Uid:0f59acca-8ece-4a8e-9704-7d1c0d7010a0,Namespace:calico-system,Attempt:0,} returns sandbox id \"e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1\"" Sep 9 06:06:11.966241 kubelet[3324]: E0909 06:06:11.966186 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.966241 kubelet[3324]: W0909 06:06:11.966235 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.966725 kubelet[3324]: E0909 06:06:11.966293 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.967015 kubelet[3324]: E0909 06:06:11.966955 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.967015 kubelet[3324]: W0909 06:06:11.966992 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.967306 kubelet[3324]: E0909 06:06:11.967026 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.967595 kubelet[3324]: E0909 06:06:11.967549 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.967595 kubelet[3324]: W0909 06:06:11.967593 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.968094 kubelet[3324]: E0909 06:06:11.967642 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.968301 kubelet[3324]: E0909 06:06:11.968181 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.968301 kubelet[3324]: W0909 06:06:11.968225 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.968301 kubelet[3324]: E0909 06:06:11.968283 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.968790 kubelet[3324]: E0909 06:06:11.968750 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.968790 kubelet[3324]: W0909 06:06:11.968780 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.969138 kubelet[3324]: E0909 06:06:11.968819 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.969300 kubelet[3324]: E0909 06:06:11.969278 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.969479 kubelet[3324]: W0909 06:06:11.969316 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.969479 kubelet[3324]: E0909 06:06:11.969368 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.969864 kubelet[3324]: E0909 06:06:11.969835 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.970035 kubelet[3324]: W0909 06:06:11.969868 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.970035 kubelet[3324]: E0909 06:06:11.969970 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.970392 kubelet[3324]: E0909 06:06:11.970345 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.970392 kubelet[3324]: W0909 06:06:11.970381 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.970761 kubelet[3324]: E0909 06:06:11.970455 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.970966 kubelet[3324]: E0909 06:06:11.970910 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.970966 kubelet[3324]: W0909 06:06:11.970943 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.971262 kubelet[3324]: E0909 06:06:11.971018 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.971438 kubelet[3324]: E0909 06:06:11.971387 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.971438 kubelet[3324]: W0909 06:06:11.971419 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.971776 kubelet[3324]: E0909 06:06:11.971482 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.972045 kubelet[3324]: E0909 06:06:11.972002 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.972045 kubelet[3324]: W0909 06:06:11.972035 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.972353 kubelet[3324]: E0909 06:06:11.972130 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.972558 kubelet[3324]: E0909 06:06:11.972516 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.972558 kubelet[3324]: W0909 06:06:11.972550 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.972935 kubelet[3324]: E0909 06:06:11.972625 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.973115 kubelet[3324]: E0909 06:06:11.973060 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.973115 kubelet[3324]: W0909 06:06:11.973094 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.973394 kubelet[3324]: E0909 06:06:11.973181 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.973585 kubelet[3324]: E0909 06:06:11.973537 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.973793 kubelet[3324]: W0909 06:06:11.973583 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.973793 kubelet[3324]: E0909 06:06:11.973654 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:11.979531 kubelet[3324]: E0909 06:06:11.979490 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.979531 kubelet[3324]: W0909 06:06:11.979525 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.979828 kubelet[3324]: E0909 06:06:11.979569 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:11.997441 kubelet[3324]: E0909 06:06:11.997350 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:11.997441 kubelet[3324]: W0909 06:06:11.997388 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:11.997441 kubelet[3324]: E0909 06:06:11.997424 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:13.567803 kubelet[3324]: E0909 06:06:13.567697 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lwj49" podUID="70ad17b8-206d-4ff7-a93b-019a4878fd81" Sep 9 06:06:13.745135 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount148114883.mount: Deactivated successfully. 
Sep 9 06:06:14.319043 containerd[1949]: time="2025-09-09T06:06:14.318987837Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:14.319268 containerd[1949]: time="2025-09-09T06:06:14.319167900Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 9 06:06:14.319567 containerd[1949]: time="2025-09-09T06:06:14.319527553Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:14.320310 containerd[1949]: time="2025-09-09T06:06:14.320269429Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:14.320648 containerd[1949]: time="2025-09-09T06:06:14.320612529Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.766494923s" Sep 9 06:06:14.320648 containerd[1949]: time="2025-09-09T06:06:14.320626987Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 9 06:06:14.321105 containerd[1949]: time="2025-09-09T06:06:14.321065161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 06:06:14.324197 containerd[1949]: time="2025-09-09T06:06:14.324180831Z" level=info msg="CreateContainer within sandbox \"f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 06:06:14.326975 containerd[1949]: time="2025-09-09T06:06:14.326939377Z" level=info msg="Container ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:14.329687 containerd[1949]: time="2025-09-09T06:06:14.329644688Z" level=info msg="CreateContainer within sandbox \"f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17\"" Sep 9 06:06:14.329889 containerd[1949]: time="2025-09-09T06:06:14.329876163Z" level=info msg="StartContainer for \"ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17\"" Sep 9 06:06:14.330418 containerd[1949]: time="2025-09-09T06:06:14.330400511Z" level=info msg="connecting to shim ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17" address="unix:///run/containerd/s/7793fa33e5939deda924e08098dd987b29856b434ccec868ab17e7279e42a8d8" protocol=ttrpc version=3 Sep 9 06:06:14.344002 systemd[1]: Started cri-containerd-ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17.scope - libcontainer container ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17. 
Sep 9 06:06:14.370525 containerd[1949]: time="2025-09-09T06:06:14.370500084Z" level=info msg="StartContainer for \"ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17\" returns successfully" Sep 9 06:06:14.645945 kubelet[3324]: I0909 06:06:14.645814 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-686d9c9bc4-8tjcs" podStartSLOduration=0.878738834 podStartE2EDuration="3.645763576s" podCreationTimestamp="2025-09-09 06:06:11 +0000 UTC" firstStartedPulling="2025-09-09 06:06:11.553992199 +0000 UTC m=+15.032637259" lastFinishedPulling="2025-09-09 06:06:14.321016945 +0000 UTC m=+17.799662001" observedRunningTime="2025-09-09 06:06:14.645405645 +0000 UTC m=+18.124050796" watchObservedRunningTime="2025-09-09 06:06:14.645763576 +0000 UTC m=+18.124408693" Sep 9 06:06:14.680448 kubelet[3324]: E0909 06:06:14.680354 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:14.680448 kubelet[3324]: W0909 06:06:14.680401 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:14.680448 kubelet[3324]: E0909 06:06:14.680448 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.567388 kubelet[3324]: E0909 06:06:15.567300 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lwj49" podUID="70ad17b8-206d-4ff7-a93b-019a4878fd81" Sep 9 06:06:15.628702 kubelet[3324]: I0909 06:06:15.628630 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 06:06:15.694068 kubelet[3324]: E0909 06:06:15.693994 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.694068 kubelet[3324]: W0909 06:06:15.694036 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.694068 kubelet[3324]: E0909 06:06:15.694079 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.695236 kubelet[3324]: E0909 06:06:15.694521 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.695236 kubelet[3324]: W0909 06:06:15.694552 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.695236 kubelet[3324]: E0909 06:06:15.694584 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.695236 kubelet[3324]: E0909 06:06:15.695028 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.695236 kubelet[3324]: W0909 06:06:15.695054 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.695236 kubelet[3324]: E0909 06:06:15.695085 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.695917 kubelet[3324]: E0909 06:06:15.695584 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.695917 kubelet[3324]: W0909 06:06:15.695610 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.695917 kubelet[3324]: E0909 06:06:15.695637 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.696245 kubelet[3324]: E0909 06:06:15.696081 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.696245 kubelet[3324]: W0909 06:06:15.696108 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.696245 kubelet[3324]: E0909 06:06:15.696135 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.696510 kubelet[3324]: E0909 06:06:15.696461 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.696510 kubelet[3324]: W0909 06:06:15.696484 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.696723 kubelet[3324]: E0909 06:06:15.696508 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.696890 kubelet[3324]: E0909 06:06:15.696848 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.696890 kubelet[3324]: W0909 06:06:15.696875 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.697114 kubelet[3324]: E0909 06:06:15.696900 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.697330 kubelet[3324]: E0909 06:06:15.697274 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.697330 kubelet[3324]: W0909 06:06:15.697300 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.697330 kubelet[3324]: E0909 06:06:15.697322 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.697740 kubelet[3324]: E0909 06:06:15.697705 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.697740 kubelet[3324]: W0909 06:06:15.697732 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.697987 kubelet[3324]: E0909 06:06:15.697757 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.698124 kubelet[3324]: E0909 06:06:15.698087 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.698124 kubelet[3324]: W0909 06:06:15.698113 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.698349 kubelet[3324]: E0909 06:06:15.698140 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.698484 kubelet[3324]: E0909 06:06:15.698457 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.698593 kubelet[3324]: W0909 06:06:15.698485 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.698593 kubelet[3324]: E0909 06:06:15.698513 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.698872 kubelet[3324]: E0909 06:06:15.698845 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.698872 kubelet[3324]: W0909 06:06:15.698869 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.699073 kubelet[3324]: E0909 06:06:15.698893 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.699286 kubelet[3324]: E0909 06:06:15.699257 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.699392 kubelet[3324]: W0909 06:06:15.699283 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.699392 kubelet[3324]: E0909 06:06:15.699308 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.699688 kubelet[3324]: E0909 06:06:15.699642 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.699688 kubelet[3324]: W0909 06:06:15.699666 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.699912 kubelet[3324]: E0909 06:06:15.699715 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.700064 kubelet[3324]: E0909 06:06:15.700037 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.700064 kubelet[3324]: W0909 06:06:15.700061 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.700264 kubelet[3324]: E0909 06:06:15.700085 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.706833 kubelet[3324]: E0909 06:06:15.706751 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.706833 kubelet[3324]: W0909 06:06:15.706790 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.706833 kubelet[3324]: E0909 06:06:15.706828 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.707432 kubelet[3324]: E0909 06:06:15.707367 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.707432 kubelet[3324]: W0909 06:06:15.707404 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.707692 kubelet[3324]: E0909 06:06:15.707446 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.708138 kubelet[3324]: E0909 06:06:15.708072 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.708138 kubelet[3324]: W0909 06:06:15.708117 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.708387 kubelet[3324]: E0909 06:06:15.708163 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.708632 kubelet[3324]: E0909 06:06:15.708602 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.708814 kubelet[3324]: W0909 06:06:15.708631 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.708814 kubelet[3324]: E0909 06:06:15.708705 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.709230 kubelet[3324]: E0909 06:06:15.709163 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.709230 kubelet[3324]: W0909 06:06:15.709200 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.709461 kubelet[3324]: E0909 06:06:15.709271 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.709798 kubelet[3324]: E0909 06:06:15.709722 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.709798 kubelet[3324]: W0909 06:06:15.709754 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.710095 kubelet[3324]: E0909 06:06:15.709822 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.710265 kubelet[3324]: E0909 06:06:15.710214 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.710265 kubelet[3324]: W0909 06:06:15.710247 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.710505 kubelet[3324]: E0909 06:06:15.710321 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.710856 kubelet[3324]: E0909 06:06:15.710773 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.710856 kubelet[3324]: W0909 06:06:15.710799 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.710856 kubelet[3324]: E0909 06:06:15.710839 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.711401 kubelet[3324]: E0909 06:06:15.711324 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.711401 kubelet[3324]: W0909 06:06:15.711356 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.711401 kubelet[3324]: E0909 06:06:15.711394 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.711927 kubelet[3324]: E0909 06:06:15.711865 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.711927 kubelet[3324]: W0909 06:06:15.711901 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.712165 kubelet[3324]: E0909 06:06:15.711939 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.712457 kubelet[3324]: E0909 06:06:15.712407 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.712457 kubelet[3324]: W0909 06:06:15.712433 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.712717 kubelet[3324]: E0909 06:06:15.712465 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.712889 kubelet[3324]: E0909 06:06:15.712860 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.712988 kubelet[3324]: W0909 06:06:15.712893 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.713082 kubelet[3324]: E0909 06:06:15.712998 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.713365 kubelet[3324]: E0909 06:06:15.713338 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.713484 kubelet[3324]: W0909 06:06:15.713368 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.713484 kubelet[3324]: E0909 06:06:15.713411 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.713890 kubelet[3324]: E0909 06:06:15.713858 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.714005 kubelet[3324]: W0909 06:06:15.713891 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.714005 kubelet[3324]: E0909 06:06:15.713933 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.714452 kubelet[3324]: E0909 06:06:15.714407 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.714452 kubelet[3324]: W0909 06:06:15.714444 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.714770 kubelet[3324]: E0909 06:06:15.714500 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.715196 kubelet[3324]: E0909 06:06:15.715151 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.715196 kubelet[3324]: W0909 06:06:15.715188 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.715429 kubelet[3324]: E0909 06:06:15.715233 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:15.715835 kubelet[3324]: E0909 06:06:15.715781 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.715835 kubelet[3324]: W0909 06:06:15.715810 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.716100 kubelet[3324]: E0909 06:06:15.715846 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 06:06:15.716435 kubelet[3324]: E0909 06:06:15.716392 3324 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 06:06:15.716435 kubelet[3324]: W0909 06:06:15.716432 3324 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 06:06:15.716691 kubelet[3324]: E0909 06:06:15.716465 3324 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 06:06:16.032433 containerd[1949]: time="2025-09-09T06:06:16.032384550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:16.032643 containerd[1949]: time="2025-09-09T06:06:16.032527341Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 06:06:16.032953 containerd[1949]: time="2025-09-09T06:06:16.032912922Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:16.033765 containerd[1949]: time="2025-09-09T06:06:16.033721522Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:16.034141 containerd[1949]: time="2025-09-09T06:06:16.034104839Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.713024331s" Sep 9 06:06:16.034141 containerd[1949]: time="2025-09-09T06:06:16.034121446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 06:06:16.035192 containerd[1949]: time="2025-09-09T06:06:16.035179713Z" level=info msg="CreateContainer within sandbox \"e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 06:06:16.038575 containerd[1949]: time="2025-09-09T06:06:16.038535018Z" level=info msg="Container a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:16.042002 containerd[1949]: time="2025-09-09T06:06:16.041962345Z" level=info msg="CreateContainer within sandbox \"e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b\"" Sep 9 06:06:16.042201 containerd[1949]: time="2025-09-09T06:06:16.042142978Z" level=info msg="StartContainer for \"a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b\"" Sep 9 06:06:16.043005 containerd[1949]: time="2025-09-09T06:06:16.042957175Z" level=info msg="connecting to shim a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b" address="unix:///run/containerd/s/2b84bc2ff438664778015817c75d76061253c0e5ffac077831f3e2d42e92f39e" protocol=ttrpc version=3 Sep 9 06:06:16.055917 systemd[1]: Started cri-containerd-a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b.scope - libcontainer container a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b. 
Sep 9 06:06:16.075503 containerd[1949]: time="2025-09-09T06:06:16.075477998Z" level=info msg="StartContainer for \"a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b\" returns successfully" Sep 9 06:06:16.079225 systemd[1]: cri-containerd-a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b.scope: Deactivated successfully. Sep 9 06:06:16.080352 containerd[1949]: time="2025-09-09T06:06:16.080335194Z" level=info msg="received exit event container_id:\"a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b\" id:\"a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b\" pid:4179 exited_at:{seconds:1757397976 nanos:80129153}" Sep 9 06:06:16.080479 containerd[1949]: time="2025-09-09T06:06:16.080422716Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b\" id:\"a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b\" pid:4179 exited_at:{seconds:1757397976 nanos:80129153}" Sep 9 06:06:16.092105 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b-rootfs.mount: Deactivated successfully. 
Sep 9 06:06:17.567531 kubelet[3324]: E0909 06:06:17.567441 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lwj49" podUID="70ad17b8-206d-4ff7-a93b-019a4878fd81"
Sep 9 06:06:17.650070 containerd[1949]: time="2025-09-09T06:06:17.649988647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 9 06:06:19.567060 kubelet[3324]: E0909 06:06:19.567026 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lwj49" podUID="70ad17b8-206d-4ff7-a93b-019a4878fd81"
Sep 9 06:06:20.753430 containerd[1949]: time="2025-09-09T06:06:20.753383995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:06:20.753642 containerd[1949]: time="2025-09-09T06:06:20.753615675Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613"
Sep 9 06:06:20.754003 containerd[1949]: time="2025-09-09T06:06:20.753967265Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:06:20.754841 containerd[1949]: time="2025-09-09T06:06:20.754801368Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 06:06:20.755217 containerd[1949]: time="2025-09-09T06:06:20.755176582Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.105118866s"
Sep 9 06:06:20.755217 containerd[1949]: time="2025-09-09T06:06:20.755192031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\""
Sep 9 06:06:20.756188 containerd[1949]: time="2025-09-09T06:06:20.756150751Z" level=info msg="CreateContainer within sandbox \"e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 9 06:06:20.759244 containerd[1949]: time="2025-09-09T06:06:20.759229797Z" level=info msg="Container 9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224: CDI devices from CRI Config.CDIDevices: []"
Sep 9 06:06:20.762725 containerd[1949]: time="2025-09-09T06:06:20.762713702Z" level=info msg="CreateContainer within sandbox \"e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224\""
Sep 9 06:06:20.762956 containerd[1949]: time="2025-09-09T06:06:20.762941203Z" level=info msg="StartContainer for \"9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224\""
Sep 9 06:06:20.763688 containerd[1949]: time="2025-09-09T06:06:20.763672397Z" level=info msg="connecting to shim 9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224" address="unix:///run/containerd/s/2b84bc2ff438664778015817c75d76061253c0e5ffac077831f3e2d42e92f39e" protocol=ttrpc version=3
Sep 9 06:06:20.783016 systemd[1]: Started cri-containerd-9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224.scope - libcontainer container 9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224.
Sep 9 06:06:20.802135 containerd[1949]: time="2025-09-09T06:06:20.802108419Z" level=info msg="StartContainer for \"9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224\" returns successfully"
Sep 9 06:06:21.363863 systemd[1]: cri-containerd-9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224.scope: Deactivated successfully.
Sep 9 06:06:21.364052 systemd[1]: cri-containerd-9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224.scope: Consumed 363ms CPU time, 194.2M memory peak, 171.3M written to disk.
Sep 9 06:06:21.364797 containerd[1949]: time="2025-09-09T06:06:21.364777805Z" level=info msg="received exit event container_id:\"9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224\" id:\"9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224\" pid:4241 exited_at:{seconds:1757397981 nanos:364654489}"
Sep 9 06:06:21.364843 containerd[1949]: time="2025-09-09T06:06:21.364825444Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224\" id:\"9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224\" pid:4241 exited_at:{seconds:1757397981 nanos:364654489}"
Sep 9 06:06:21.374544 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224-rootfs.mount: Deactivated successfully.
Sep 9 06:06:21.376251 kubelet[3324]: I0909 06:06:21.376239 3324 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Sep 9 06:06:21.392387 systemd[1]: Created slice kubepods-burstable-pod4e5bf45a_d9cc_4d9e_ab3f_a2ef869f4d55.slice - libcontainer container kubepods-burstable-pod4e5bf45a_d9cc_4d9e_ab3f_a2ef869f4d55.slice.
Sep 9 06:06:21.396316 systemd[1]: Created slice kubepods-burstable-pod8d7e7612_c3bb_4af7_b7b1_26306df4f0f2.slice - libcontainer container kubepods-burstable-pod8d7e7612_c3bb_4af7_b7b1_26306df4f0f2.slice.
Sep 9 06:06:21.399805 systemd[1]: Created slice kubepods-besteffort-pod485b14b7_7626_411d_9b65_4f8a429182ab.slice - libcontainer container kubepods-besteffort-pod485b14b7_7626_411d_9b65_4f8a429182ab.slice.
Sep 9 06:06:21.403160 systemd[1]: Created slice kubepods-besteffort-pod8724bcd6_4402_4119_80d5_dd4ddd8684aa.slice - libcontainer container kubepods-besteffort-pod8724bcd6_4402_4119_80d5_dd4ddd8684aa.slice.
Sep 9 06:06:21.406477 systemd[1]: Created slice kubepods-besteffort-podff94f516_177d_42b6_94ab_d8e1a6cb501b.slice - libcontainer container kubepods-besteffort-podff94f516_177d_42b6_94ab_d8e1a6cb501b.slice.
Sep 9 06:06:21.409999 systemd[1]: Created slice kubepods-besteffort-pod4ae1f795_3830_4fb7_9c99_070d871677cc.slice - libcontainer container kubepods-besteffort-pod4ae1f795_3830_4fb7_9c99_070d871677cc.slice.
Sep 9 06:06:21.412848 systemd[1]: Created slice kubepods-besteffort-poda11861c6_53bd_4f46_a3fd_f04b4a0ee127.slice - libcontainer container kubepods-besteffort-poda11861c6_53bd_4f46_a3fd_f04b4a0ee127.slice.
Sep 9 06:06:21.453088 kubelet[3324]: I0909 06:06:21.452973 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ae1f795-3830-4fb7-9c99-070d871677cc-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-m4bj8\" (UID: \"4ae1f795-3830-4fb7-9c99-070d871677cc\") " pod="calico-system/goldmane-54d579b49d-m4bj8"
Sep 9 06:06:21.453088 kubelet[3324]: I0909 06:06:21.453065 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm2sd\" (UniqueName: \"kubernetes.io/projected/8d7e7612-c3bb-4af7-b7b1-26306df4f0f2-kube-api-access-wm2sd\") pod \"coredns-668d6bf9bc-z8vxh\" (UID: \"8d7e7612-c3bb-4af7-b7b1-26306df4f0f2\") " pod="kube-system/coredns-668d6bf9bc-z8vxh"
Sep 9 06:06:21.453622 kubelet[3324]: I0909 06:06:21.453116 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kltk\" (UniqueName: \"kubernetes.io/projected/8724bcd6-4402-4119-80d5-dd4ddd8684aa-kube-api-access-6kltk\") pod \"calico-apiserver-5454b67db4-jzppx\" (UID: \"8724bcd6-4402-4119-80d5-dd4ddd8684aa\") " pod="calico-apiserver/calico-apiserver-5454b67db4-jzppx"
Sep 9 06:06:21.453622 kubelet[3324]: I0909 06:06:21.453167 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55-config-volume\") pod \"coredns-668d6bf9bc-m5t7p\" (UID: \"4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55\") " pod="kube-system/coredns-668d6bf9bc-m5t7p"
Sep 9 06:06:21.453622 kubelet[3324]: I0909 06:06:21.453218 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hhqg\" (UniqueName: \"kubernetes.io/projected/4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55-kube-api-access-9hhqg\") pod \"coredns-668d6bf9bc-m5t7p\" (UID: \"4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55\") " pod="kube-system/coredns-668d6bf9bc-m5t7p"
Sep 9 06:06:21.453622 kubelet[3324]: I0909 06:06:21.453265 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8724bcd6-4402-4119-80d5-dd4ddd8684aa-calico-apiserver-certs\") pod \"calico-apiserver-5454b67db4-jzppx\" (UID: \"8724bcd6-4402-4119-80d5-dd4ddd8684aa\") " pod="calico-apiserver/calico-apiserver-5454b67db4-jzppx"
Sep 9 06:06:21.453622 kubelet[3324]: I0909 06:06:21.453306 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae1f795-3830-4fb7-9c99-070d871677cc-config\") pod \"goldmane-54d579b49d-m4bj8\" (UID: \"4ae1f795-3830-4fb7-9c99-070d871677cc\") " pod="calico-system/goldmane-54d579b49d-m4bj8"
Sep 9 06:06:21.454454 kubelet[3324]: I0909 06:06:21.453347 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/4ae1f795-3830-4fb7-9c99-070d871677cc-goldmane-key-pair\") pod \"goldmane-54d579b49d-m4bj8\" (UID: \"4ae1f795-3830-4fb7-9c99-070d871677cc\") " pod="calico-system/goldmane-54d579b49d-m4bj8"
Sep 9 06:06:21.454454 kubelet[3324]: I0909 06:06:21.453394 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-backend-key-pair\") pod \"whisker-d787d59c-d8jbq\" (UID: \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\") " pod="calico-system/whisker-d787d59c-d8jbq"
Sep 9 06:06:21.454454 kubelet[3324]: I0909 06:06:21.453506 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-ca-bundle\") pod \"whisker-d787d59c-d8jbq\" (UID: \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\") " pod="calico-system/whisker-d787d59c-d8jbq"
Sep 9 06:06:21.454454 kubelet[3324]: I0909 06:06:21.453642 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/485b14b7-7626-411d-9b65-4f8a429182ab-calico-apiserver-certs\") pod \"calico-apiserver-5454b67db4-qhwhh\" (UID: \"485b14b7-7626-411d-9b65-4f8a429182ab\") " pod="calico-apiserver/calico-apiserver-5454b67db4-qhwhh"
Sep 9 06:06:21.454454 kubelet[3324]: I0909 06:06:21.453769 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vdl\" (UniqueName: \"kubernetes.io/projected/4ae1f795-3830-4fb7-9c99-070d871677cc-kube-api-access-n8vdl\") pod \"goldmane-54d579b49d-m4bj8\" (UID: \"4ae1f795-3830-4fb7-9c99-070d871677cc\") " pod="calico-system/goldmane-54d579b49d-m4bj8"
Sep 9 06:06:21.454999 kubelet[3324]: I0909 06:06:21.453951 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8d7e7612-c3bb-4af7-b7b1-26306df4f0f2-config-volume\") pod \"coredns-668d6bf9bc-z8vxh\" (UID: \"8d7e7612-c3bb-4af7-b7b1-26306df4f0f2\") " pod="kube-system/coredns-668d6bf9bc-z8vxh"
Sep 9 06:06:21.454999 kubelet[3324]: I0909 06:06:21.454141 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff94f516-177d-42b6-94ab-d8e1a6cb501b-tigera-ca-bundle\") pod \"calico-kube-controllers-55f466fc98-st25q\" (UID: \"ff94f516-177d-42b6-94ab-d8e1a6cb501b\") " pod="calico-system/calico-kube-controllers-55f466fc98-st25q"
Sep 9 06:06:21.454999 kubelet[3324]: I0909 06:06:21.454235 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdmb\" (UniqueName: \"kubernetes.io/projected/ff94f516-177d-42b6-94ab-d8e1a6cb501b-kube-api-access-szdmb\") pod \"calico-kube-controllers-55f466fc98-st25q\" (UID: \"ff94f516-177d-42b6-94ab-d8e1a6cb501b\") " pod="calico-system/calico-kube-controllers-55f466fc98-st25q"
Sep 9 06:06:21.454999 kubelet[3324]: I0909 06:06:21.454343 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnfzq\" (UniqueName: \"kubernetes.io/projected/485b14b7-7626-411d-9b65-4f8a429182ab-kube-api-access-gnfzq\") pod \"calico-apiserver-5454b67db4-qhwhh\" (UID: \"485b14b7-7626-411d-9b65-4f8a429182ab\") " pod="calico-apiserver/calico-apiserver-5454b67db4-qhwhh"
Sep 9 06:06:21.454999 kubelet[3324]: I0909 06:06:21.454436 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mxc6\" (UniqueName: \"kubernetes.io/projected/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-kube-api-access-8mxc6\") pod \"whisker-d787d59c-d8jbq\" (UID: \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\") " pod="calico-system/whisker-d787d59c-d8jbq"
Sep 9 06:06:21.579068 systemd[1]: Created slice kubepods-besteffort-pod70ad17b8_206d_4ff7_a93b_019a4878fd81.slice - libcontainer container kubepods-besteffort-pod70ad17b8_206d_4ff7_a93b_019a4878fd81.slice.
Sep 9 06:06:21.603389 containerd[1949]: time="2025-09-09T06:06:21.603310554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lwj49,Uid:70ad17b8-206d-4ff7-a93b-019a4878fd81,Namespace:calico-system,Attempt:0,}"
Sep 9 06:06:21.695915 containerd[1949]: time="2025-09-09T06:06:21.695696534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m5t7p,Uid:4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55,Namespace:kube-system,Attempt:0,}"
Sep 9 06:06:21.698063 containerd[1949]: time="2025-09-09T06:06:21.698021794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z8vxh,Uid:8d7e7612-c3bb-4af7-b7b1-26306df4f0f2,Namespace:kube-system,Attempt:0,}"
Sep 9 06:06:21.701449 containerd[1949]: time="2025-09-09T06:06:21.701392492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-qhwhh,Uid:485b14b7-7626-411d-9b65-4f8a429182ab,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 06:06:21.704879 containerd[1949]: time="2025-09-09T06:06:21.704825558Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-jzppx,Uid:8724bcd6-4402-4119-80d5-dd4ddd8684aa,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 06:06:21.708270 containerd[1949]: time="2025-09-09T06:06:21.708258467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f466fc98-st25q,Uid:ff94f516-177d-42b6-94ab-d8e1a6cb501b,Namespace:calico-system,Attempt:0,}"
Sep 9 06:06:21.711735 containerd[1949]: time="2025-09-09T06:06:21.711721961Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m4bj8,Uid:4ae1f795-3830-4fb7-9c99-070d871677cc,Namespace:calico-system,Attempt:0,}"
Sep 9 06:06:21.715154 containerd[1949]: time="2025-09-09T06:06:21.715142390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d787d59c-d8jbq,Uid:a11861c6-53bd-4f46-a3fd-f04b4a0ee127,Namespace:calico-system,Attempt:0,}"
Sep 9 06:06:21.768329 containerd[1949]: time="2025-09-09T06:06:21.768257722Z" level=error msg="Failed to destroy network for sandbox \"1fb4f107f899df93ccd3e04ebb13ea562717c8792590de9c937ec180bc40df65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.768764 containerd[1949]: time="2025-09-09T06:06:21.768616936Z" level=error msg="Failed to destroy network for sandbox \"5419c319930dde9e0d13b5b33380395184c4c4c93eda997fe336a6de397fe0e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.769149 containerd[1949]: time="2025-09-09T06:06:21.769053214Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-jzppx,Uid:8724bcd6-4402-4119-80d5-dd4ddd8684aa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fb4f107f899df93ccd3e04ebb13ea562717c8792590de9c937ec180bc40df65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.769372 containerd[1949]: time="2025-09-09T06:06:21.769352467Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m4bj8,Uid:4ae1f795-3830-4fb7-9c99-070d871677cc,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5419c319930dde9e0d13b5b33380395184c4c4c93eda997fe336a6de397fe0e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.769460 kubelet[3324]: E0909 06:06:21.769413 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fb4f107f899df93ccd3e04ebb13ea562717c8792590de9c937ec180bc40df65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.769518 kubelet[3324]: E0909 06:06:21.769496 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fb4f107f899df93ccd3e04ebb13ea562717c8792590de9c937ec180bc40df65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5454b67db4-jzppx"
Sep 9 06:06:21.769551 kubelet[3324]: E0909 06:06:21.769444 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5419c319930dde9e0d13b5b33380395184c4c4c93eda997fe336a6de397fe0e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.769551 kubelet[3324]: E0909 06:06:21.769543 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5419c319930dde9e0d13b5b33380395184c4c4c93eda997fe336a6de397fe0e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-m4bj8"
Sep 9 06:06:21.769615 kubelet[3324]: E0909 06:06:21.769558 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5419c319930dde9e0d13b5b33380395184c4c4c93eda997fe336a6de397fe0e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-m4bj8"
Sep 9 06:06:21.769615 kubelet[3324]: E0909 06:06:21.769516 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1fb4f107f899df93ccd3e04ebb13ea562717c8792590de9c937ec180bc40df65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5454b67db4-jzppx"
Sep 9 06:06:21.769615 kubelet[3324]: E0909 06:06:21.769588 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-m4bj8_calico-system(4ae1f795-3830-4fb7-9c99-070d871677cc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-m4bj8_calico-system(4ae1f795-3830-4fb7-9c99-070d871677cc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5419c319930dde9e0d13b5b33380395184c4c4c93eda997fe336a6de397fe0e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-m4bj8" podUID="4ae1f795-3830-4fb7-9c99-070d871677cc"
Sep 9 06:06:21.769732 kubelet[3324]: E0909 06:06:21.769615 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5454b67db4-jzppx_calico-apiserver(8724bcd6-4402-4119-80d5-dd4ddd8684aa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5454b67db4-jzppx_calico-apiserver(8724bcd6-4402-4119-80d5-dd4ddd8684aa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1fb4f107f899df93ccd3e04ebb13ea562717c8792590de9c937ec180bc40df65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5454b67db4-jzppx" podUID="8724bcd6-4402-4119-80d5-dd4ddd8684aa"
Sep 9 06:06:21.769784 containerd[1949]: time="2025-09-09T06:06:21.769661755Z" level=error msg="Failed to destroy network for sandbox \"9994127458e9a00d7c8183496b24c46945521cbc44ec242580a9b873d6c41362\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.769816 containerd[1949]: time="2025-09-09T06:06:21.769771211Z" level=error msg="Failed to destroy network for sandbox \"35052f4f99682cad374c5a2aee27a26878b044d48493536fd890b325b2726265\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.769929 containerd[1949]: time="2025-09-09T06:06:21.769912118Z" level=error msg="Failed to destroy network for sandbox \"0015f4c0452a7bc0fb7e933f49df90eaff04c49c65596d5f97e33f58738b3227\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770076 containerd[1949]: time="2025-09-09T06:06:21.770050943Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d787d59c-d8jbq,Uid:a11861c6-53bd-4f46-a3fd-f04b4a0ee127,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9994127458e9a00d7c8183496b24c46945521cbc44ec242580a9b873d6c41362\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770151 kubelet[3324]: E0909 06:06:21.770134 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9994127458e9a00d7c8183496b24c46945521cbc44ec242580a9b873d6c41362\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770188 kubelet[3324]: E0909 06:06:21.770161 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9994127458e9a00d7c8183496b24c46945521cbc44ec242580a9b873d6c41362\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d787d59c-d8jbq"
Sep 9 06:06:21.770188 kubelet[3324]: E0909 06:06:21.770176 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9994127458e9a00d7c8183496b24c46945521cbc44ec242580a9b873d6c41362\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d787d59c-d8jbq"
Sep 9 06:06:21.770252 kubelet[3324]: E0909 06:06:21.770194 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-d787d59c-d8jbq_calico-system(a11861c6-53bd-4f46-a3fd-f04b4a0ee127)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-d787d59c-d8jbq_calico-system(a11861c6-53bd-4f46-a3fd-f04b4a0ee127)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9994127458e9a00d7c8183496b24c46945521cbc44ec242580a9b873d6c41362\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d787d59c-d8jbq" podUID="a11861c6-53bd-4f46-a3fd-f04b4a0ee127"
Sep 9 06:06:21.770324 containerd[1949]: time="2025-09-09T06:06:21.770310768Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-qhwhh,Uid:485b14b7-7626-411d-9b65-4f8a429182ab,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"35052f4f99682cad374c5a2aee27a26878b044d48493536fd890b325b2726265\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770388 kubelet[3324]: E0909 06:06:21.770375 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35052f4f99682cad374c5a2aee27a26878b044d48493536fd890b325b2726265\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770421 kubelet[3324]: E0909 06:06:21.770395 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35052f4f99682cad374c5a2aee27a26878b044d48493536fd890b325b2726265\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5454b67db4-qhwhh"
Sep 9 06:06:21.770421 kubelet[3324]: E0909 06:06:21.770407 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35052f4f99682cad374c5a2aee27a26878b044d48493536fd890b325b2726265\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5454b67db4-qhwhh"
Sep 9 06:06:21.770480 kubelet[3324]: E0909 06:06:21.770427 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5454b67db4-qhwhh_calico-apiserver(485b14b7-7626-411d-9b65-4f8a429182ab)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5454b67db4-qhwhh_calico-apiserver(485b14b7-7626-411d-9b65-4f8a429182ab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"35052f4f99682cad374c5a2aee27a26878b044d48493536fd890b325b2726265\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5454b67db4-qhwhh" podUID="485b14b7-7626-411d-9b65-4f8a429182ab"
Sep 9 06:06:21.770580 containerd[1949]: time="2025-09-09T06:06:21.770567162Z" level=error msg="Failed to destroy network for sandbox \"ecb24dac0056919149ceefa8aad7610cd6084eb4b558e7a8dfd949c72f46aa46\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770637 containerd[1949]: time="2025-09-09T06:06:21.770612371Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lwj49,Uid:70ad17b8-206d-4ff7-a93b-019a4878fd81,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0015f4c0452a7bc0fb7e933f49df90eaff04c49c65596d5f97e33f58738b3227\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770694 containerd[1949]: time="2025-09-09T06:06:21.770661250Z" level=error msg="Failed to destroy network for sandbox \"fc8efe36a418e3ec603c0bc9a08d2057807fe79bbd389946351784047f6247bc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770724 kubelet[3324]: E0909 06:06:21.770708 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0015f4c0452a7bc0fb7e933f49df90eaff04c49c65596d5f97e33f58738b3227\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770744 kubelet[3324]: E0909 06:06:21.770727 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0015f4c0452a7bc0fb7e933f49df90eaff04c49c65596d5f97e33f58738b3227\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lwj49"
Sep 9 06:06:21.770766 kubelet[3324]: E0909 06:06:21.770742 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0015f4c0452a7bc0fb7e933f49df90eaff04c49c65596d5f97e33f58738b3227\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lwj49"
Sep 9 06:06:21.770788 kubelet[3324]: E0909 06:06:21.770763 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-lwj49_calico-system(70ad17b8-206d-4ff7-a93b-019a4878fd81)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-lwj49_calico-system(70ad17b8-206d-4ff7-a93b-019a4878fd81)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0015f4c0452a7bc0fb7e933f49df90eaff04c49c65596d5f97e33f58738b3227\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-lwj49" podUID="70ad17b8-206d-4ff7-a93b-019a4878fd81"
Sep 9 06:06:21.770918 containerd[1949]: time="2025-09-09T06:06:21.770904656Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m5t7p,Uid:4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecb24dac0056919149ceefa8aad7610cd6084eb4b558e7a8dfd949c72f46aa46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.770971 kubelet[3324]: E0909 06:06:21.770960 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecb24dac0056919149ceefa8aad7610cd6084eb4b558e7a8dfd949c72f46aa46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.771031 kubelet[3324]: E0909 06:06:21.770990 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecb24dac0056919149ceefa8aad7610cd6084eb4b558e7a8dfd949c72f46aa46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-m5t7p"
Sep 9 06:06:21.771031 kubelet[3324]: E0909 06:06:21.771006 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecb24dac0056919149ceefa8aad7610cd6084eb4b558e7a8dfd949c72f46aa46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-m5t7p"
Sep 9 06:06:21.771070 kubelet[3324]: E0909 06:06:21.771029 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-m5t7p_kube-system(4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-m5t7p_kube-system(4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ecb24dac0056919149ceefa8aad7610cd6084eb4b558e7a8dfd949c72f46aa46\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-m5t7p" podUID="4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55"
Sep 9 06:06:21.771169 containerd[1949]: time="2025-09-09T06:06:21.771155006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z8vxh,Uid:8d7e7612-c3bb-4af7-b7b1-26306df4f0f2,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc8efe36a418e3ec603c0bc9a08d2057807fe79bbd389946351784047f6247bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.771222 kubelet[3324]: E0909 06:06:21.771212 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc8efe36a418e3ec603c0bc9a08d2057807fe79bbd389946351784047f6247bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 06:06:21.771246 kubelet[3324]: E0909 06:06:21.771227 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc8efe36a418e3ec603c0bc9a08d2057807fe79bbd389946351784047f6247bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-z8vxh"
Sep 9 06:06:21.771246 kubelet[3324]: E0909 06:06:21.771236 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fc8efe36a418e3ec603c0bc9a08d2057807fe79bbd389946351784047f6247bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-z8vxh"
Sep 9 06:06:21.771303 kubelet[3324]: E0909 06:06:21.771258 3324 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-z8vxh_kube-system(8d7e7612-c3bb-4af7-b7b1-26306df4f0f2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-z8vxh_kube-system(8d7e7612-c3bb-4af7-b7b1-26306df4f0f2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fc8efe36a418e3ec603c0bc9a08d2057807fe79bbd389946351784047f6247bc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-z8vxh" podUID="8d7e7612-c3bb-4af7-b7b1-26306df4f0f2" Sep 9 06:06:21.771819 containerd[1949]: time="2025-09-09T06:06:21.771805607Z" level=error msg="Failed to destroy network for sandbox \"5d3ac7e5946debc94a90dc6d1a249e4d168969ec20fa3d0c8f4ed505b34d4f4b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 06:06:21.772145 containerd[1949]: time="2025-09-09T06:06:21.772129539Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f466fc98-st25q,Uid:ff94f516-177d-42b6-94ab-d8e1a6cb501b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d3ac7e5946debc94a90dc6d1a249e4d168969ec20fa3d0c8f4ed505b34d4f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 06:06:21.772201 kubelet[3324]: E0909 06:06:21.772189 3324 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d3ac7e5946debc94a90dc6d1a249e4d168969ec20fa3d0c8f4ed505b34d4f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 06:06:21.772229 kubelet[3324]: E0909 06:06:21.772204 3324 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d3ac7e5946debc94a90dc6d1a249e4d168969ec20fa3d0c8f4ed505b34d4f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55f466fc98-st25q" Sep 9 06:06:21.772229 kubelet[3324]: E0909 06:06:21.772213 3324 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d3ac7e5946debc94a90dc6d1a249e4d168969ec20fa3d0c8f4ed505b34d4f4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-55f466fc98-st25q" Sep 9 06:06:21.772271 kubelet[3324]: E0909 06:06:21.772230 3324 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-55f466fc98-st25q_calico-system(ff94f516-177d-42b6-94ab-d8e1a6cb501b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-55f466fc98-st25q_calico-system(ff94f516-177d-42b6-94ab-d8e1a6cb501b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d3ac7e5946debc94a90dc6d1a249e4d168969ec20fa3d0c8f4ed505b34d4f4b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-55f466fc98-st25q" podUID="ff94f516-177d-42b6-94ab-d8e1a6cb501b" Sep 9 06:06:21.779474 
systemd[1]: run-netns-cni\x2dcc309c6f\x2d4f19\x2d4f80\x2dc677\x2d11417cdd9bce.mount: Deactivated successfully. Sep 9 06:06:21.779545 systemd[1]: run-netns-cni\x2dd2565e46\x2d15cd\x2dd2ab\x2d3bdf\x2d9dbf9467c0e9.mount: Deactivated successfully. Sep 9 06:06:21.779611 systemd[1]: run-netns-cni\x2d90e4fbc2\x2daa46\x2d1872\x2dc300\x2d49403e0f203d.mount: Deactivated successfully. Sep 9 06:06:21.779659 systemd[1]: run-netns-cni\x2dca90be76\x2d4eed\x2def92\x2d49c0\x2de041babc4d31.mount: Deactivated successfully. Sep 9 06:06:21.779734 systemd[1]: run-netns-cni\x2debcde656\x2dc3dc\x2dacc2\x2dff09\x2da57e22df0b9f.mount: Deactivated successfully. Sep 9 06:06:21.779780 systemd[1]: run-netns-cni\x2d35848ae6\x2dc15f\x2deb02\x2de001\x2d7492f94ae237.mount: Deactivated successfully. Sep 9 06:06:21.779825 systemd[1]: run-netns-cni\x2dc81cb8b0\x2df674\x2d64ad\x2df259\x2daae7fe15ce23.mount: Deactivated successfully. Sep 9 06:06:21.779869 systemd[1]: run-netns-cni\x2d7034e3c5\x2dbdc8\x2d5df0\x2dee7c\x2dab395892c570.mount: Deactivated successfully. Sep 9 06:06:22.676441 containerd[1949]: time="2025-09-09T06:06:22.676363441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 06:06:27.806897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3882212955.mount: Deactivated successfully. 
Sep 9 06:06:27.815768 containerd[1949]: time="2025-09-09T06:06:27.815748698Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:27.815973 containerd[1949]: time="2025-09-09T06:06:27.815957009Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 06:06:27.816283 containerd[1949]: time="2025-09-09T06:06:27.816272725Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:27.817039 containerd[1949]: time="2025-09-09T06:06:27.817026068Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:27.817370 containerd[1949]: time="2025-09-09T06:06:27.817326418Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.140900639s" Sep 9 06:06:27.817370 containerd[1949]: time="2025-09-09T06:06:27.817341150Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 06:06:27.820746 containerd[1949]: time="2025-09-09T06:06:27.820725549Z" level=info msg="CreateContainer within sandbox \"e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 06:06:27.824548 containerd[1949]: time="2025-09-09T06:06:27.824503766Z" level=info msg="Container 
c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:27.835664 containerd[1949]: time="2025-09-09T06:06:27.835620991Z" level=info msg="CreateContainer within sandbox \"e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\"" Sep 9 06:06:27.835924 containerd[1949]: time="2025-09-09T06:06:27.835891214Z" level=info msg="StartContainer for \"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\"" Sep 9 06:06:27.836659 containerd[1949]: time="2025-09-09T06:06:27.836618938Z" level=info msg="connecting to shim c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a" address="unix:///run/containerd/s/2b84bc2ff438664778015817c75d76061253c0e5ffac077831f3e2d42e92f39e" protocol=ttrpc version=3 Sep 9 06:06:27.851947 systemd[1]: Started cri-containerd-c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a.scope - libcontainer container c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a. Sep 9 06:06:27.875390 containerd[1949]: time="2025-09-09T06:06:27.875335080Z" level=info msg="StartContainer for \"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" returns successfully" Sep 9 06:06:27.937441 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 06:06:27.937503 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 06:06:28.002766 kubelet[3324]: I0909 06:06:28.002743 3324 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-ca-bundle\") pod \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\" (UID: \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\") " Sep 9 06:06:28.002766 kubelet[3324]: I0909 06:06:28.002777 3324 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mxc6\" (UniqueName: \"kubernetes.io/projected/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-kube-api-access-8mxc6\") pod \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\" (UID: \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\") " Sep 9 06:06:28.003121 kubelet[3324]: I0909 06:06:28.002792 3324 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-backend-key-pair\") pod \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\" (UID: \"a11861c6-53bd-4f46-a3fd-f04b4a0ee127\") " Sep 9 06:06:28.003121 kubelet[3324]: I0909 06:06:28.003031 3324 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "a11861c6-53bd-4f46-a3fd-f04b4a0ee127" (UID: "a11861c6-53bd-4f46-a3fd-f04b4a0ee127"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 06:06:28.004377 kubelet[3324]: I0909 06:06:28.004363 3324 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "a11861c6-53bd-4f46-a3fd-f04b4a0ee127" (UID: "a11861c6-53bd-4f46-a3fd-f04b4a0ee127"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 06:06:28.004377 kubelet[3324]: I0909 06:06:28.004364 3324 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-kube-api-access-8mxc6" (OuterVolumeSpecName: "kube-api-access-8mxc6") pod "a11861c6-53bd-4f46-a3fd-f04b4a0ee127" (UID: "a11861c6-53bd-4f46-a3fd-f04b4a0ee127"). InnerVolumeSpecName "kube-api-access-8mxc6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 06:06:28.103968 kubelet[3324]: I0909 06:06:28.103859 3324 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-backend-key-pair\") on node \"ci-4452.0.0-n-7ab43648c0\" DevicePath \"\"" Sep 9 06:06:28.103968 kubelet[3324]: I0909 06:06:28.103931 3324 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8mxc6\" (UniqueName: \"kubernetes.io/projected/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-kube-api-access-8mxc6\") on node \"ci-4452.0.0-n-7ab43648c0\" DevicePath \"\"" Sep 9 06:06:28.103968 kubelet[3324]: I0909 06:06:28.103960 3324 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a11861c6-53bd-4f46-a3fd-f04b4a0ee127-whisker-ca-bundle\") on node \"ci-4452.0.0-n-7ab43648c0\" DevicePath \"\"" Sep 9 06:06:28.582218 systemd[1]: Removed slice kubepods-besteffort-poda11861c6_53bd_4f46_a3fd_f04b4a0ee127.slice - libcontainer container kubepods-besteffort-poda11861c6_53bd_4f46_a3fd_f04b4a0ee127.slice. 
Sep 9 06:06:28.718792 kubelet[3324]: I0909 06:06:28.718746 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-2lddc" podStartSLOduration=1.785856726 podStartE2EDuration="17.718732618s" podCreationTimestamp="2025-09-09 06:06:11 +0000 UTC" firstStartedPulling="2025-09-09 06:06:11.88481669 +0000 UTC m=+15.363461750" lastFinishedPulling="2025-09-09 06:06:27.817692582 +0000 UTC m=+31.296337642" observedRunningTime="2025-09-09 06:06:28.718439427 +0000 UTC m=+32.197084488" watchObservedRunningTime="2025-09-09 06:06:28.718732618 +0000 UTC m=+32.197377675" Sep 9 06:06:28.761018 systemd[1]: Created slice kubepods-besteffort-pod4dac51d9_2351_4292_98d0_cc8b28ec5c12.slice - libcontainer container kubepods-besteffort-pod4dac51d9_2351_4292_98d0_cc8b28ec5c12.slice. Sep 9 06:06:28.810024 kubelet[3324]: I0909 06:06:28.809944 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86455\" (UniqueName: \"kubernetes.io/projected/4dac51d9-2351-4292-98d0-cc8b28ec5c12-kube-api-access-86455\") pod \"whisker-66d87767b4-dr68m\" (UID: \"4dac51d9-2351-4292-98d0-cc8b28ec5c12\") " pod="calico-system/whisker-66d87767b4-dr68m" Sep 9 06:06:28.810305 kubelet[3324]: I0909 06:06:28.810106 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4dac51d9-2351-4292-98d0-cc8b28ec5c12-whisker-backend-key-pair\") pod \"whisker-66d87767b4-dr68m\" (UID: \"4dac51d9-2351-4292-98d0-cc8b28ec5c12\") " pod="calico-system/whisker-66d87767b4-dr68m" Sep 9 06:06:28.810305 kubelet[3324]: I0909 06:06:28.810174 3324 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4dac51d9-2351-4292-98d0-cc8b28ec5c12-whisker-ca-bundle\") pod \"whisker-66d87767b4-dr68m\" (UID: 
\"4dac51d9-2351-4292-98d0-cc8b28ec5c12\") " pod="calico-system/whisker-66d87767b4-dr68m" Sep 9 06:06:28.814240 systemd[1]: var-lib-kubelet-pods-a11861c6\x2d53bd\x2d4f46\x2da3fd\x2df04b4a0ee127-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8mxc6.mount: Deactivated successfully. Sep 9 06:06:28.814516 systemd[1]: var-lib-kubelet-pods-a11861c6\x2d53bd\x2d4f46\x2da3fd\x2df04b4a0ee127-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 06:06:29.067517 containerd[1949]: time="2025-09-09T06:06:29.067483941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-66d87767b4-dr68m,Uid:4dac51d9-2351-4292-98d0-cc8b28ec5c12,Namespace:calico-system,Attempt:0,}" Sep 9 06:06:29.149602 systemd-networkd[1870]: cali266f721a6a9: Link UP Sep 9 06:06:29.149767 systemd-networkd[1870]: cali266f721a6a9: Gained carrier Sep 9 06:06:29.156327 containerd[1949]: 2025-09-09 06:06:29.089 [INFO][4803] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:29.156327 containerd[1949]: 2025-09-09 06:06:29.101 [INFO][4803] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0 whisker-66d87767b4- calico-system 4dac51d9-2351-4292-98d0-cc8b28ec5c12 847 0 2025-09-09 06:06:28 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:66d87767b4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 whisker-66d87767b4-dr68m eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali266f721a6a9 [] [] }} ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-" Sep 9 06:06:29.156327 containerd[1949]: 
2025-09-09 06:06:29.101 [INFO][4803] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" Sep 9 06:06:29.156327 containerd[1949]: 2025-09-09 06:06:29.117 [INFO][4865] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" HandleID="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Workload="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.117 [INFO][4865] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" HandleID="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Workload="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026f8c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"whisker-66d87767b4-dr68m", "timestamp":"2025-09-09 06:06:29.117366604 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.117 [INFO][4865] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.117 [INFO][4865] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.117 [INFO][4865] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.122 [INFO][4865] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.124 [INFO][4865] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.128 [INFO][4865] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.129 [INFO][4865] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156480 containerd[1949]: 2025-09-09 06:06:29.131 [INFO][4865] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156633 containerd[1949]: 2025-09-09 06:06:29.132 [INFO][4865] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156633 containerd[1949]: 2025-09-09 06:06:29.133 [INFO][4865] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7 Sep 9 06:06:29.156633 containerd[1949]: 2025-09-09 06:06:29.137 [INFO][4865] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 handle="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156633 containerd[1949]: 2025-09-09 06:06:29.143 [INFO][4865] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.193/26] block=192.168.55.192/26 handle="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156633 containerd[1949]: 2025-09-09 06:06:29.143 [INFO][4865] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.193/26] handle="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:29.156633 containerd[1949]: 2025-09-09 06:06:29.143 [INFO][4865] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 06:06:29.156633 containerd[1949]: 2025-09-09 06:06:29.143 [INFO][4865] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.193/26] IPv6=[] ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" HandleID="k8s-pod-network.e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Workload="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" Sep 9 06:06:29.156766 containerd[1949]: 2025-09-09 06:06:29.145 [INFO][4803] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0", GenerateName:"whisker-66d87767b4-", Namespace:"calico-system", SelfLink:"", UID:"4dac51d9-2351-4292-98d0-cc8b28ec5c12", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"66d87767b4", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"whisker-66d87767b4-dr68m", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.55.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali266f721a6a9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:29.156766 containerd[1949]: 2025-09-09 06:06:29.145 [INFO][4803] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.193/32] ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" Sep 9 06:06:29.156820 containerd[1949]: 2025-09-09 06:06:29.145 [INFO][4803] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali266f721a6a9 ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" Sep 9 06:06:29.156820 containerd[1949]: 2025-09-09 06:06:29.150 [INFO][4803] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" Sep 9 06:06:29.156850 containerd[1949]: 2025-09-09 06:06:29.150 [INFO][4803] cni-plugin/k8s.go 446: 
Added Mac, interface name, and active container ID to endpoint ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0", GenerateName:"whisker-66d87767b4-", Namespace:"calico-system", SelfLink:"", UID:"4dac51d9-2351-4292-98d0-cc8b28ec5c12", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"66d87767b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7", Pod:"whisker-66d87767b4-dr68m", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.55.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali266f721a6a9", MAC:"5a:42:db:e8:9c:bd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:29.156885 containerd[1949]: 2025-09-09 06:06:29.155 [INFO][4803] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" 
Namespace="calico-system" Pod="whisker-66d87767b4-dr68m" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-whisker--66d87767b4--dr68m-eth0" Sep 9 06:06:29.165461 containerd[1949]: time="2025-09-09T06:06:29.165421156Z" level=info msg="connecting to shim e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7" address="unix:///run/containerd/s/5a9fc1ec31c783c2a1e33818f6bf0accda80b3e92b9e926d4fd7a55feb721be8" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:29.182283 systemd[1]: Started cri-containerd-e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7.scope - libcontainer container e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7. Sep 9 06:06:29.254083 containerd[1949]: time="2025-09-09T06:06:29.254059952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-66d87767b4-dr68m,Uid:4dac51d9-2351-4292-98d0-cc8b28ec5c12,Namespace:calico-system,Attempt:0,} returns sandbox id \"e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7\"" Sep 9 06:06:29.254706 containerd[1949]: time="2025-09-09T06:06:29.254694544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 06:06:29.759398 containerd[1949]: time="2025-09-09T06:06:29.759328611Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"7e9990df4520422e749053412608138ad030ab38a83b4aeae4c3bebf8ec266c7\" pid:4950 exit_status:1 exited_at:{seconds:1757397989 nanos:759120227}" Sep 9 06:06:30.311974 systemd-networkd[1870]: cali266f721a6a9: Gained IPv6LL Sep 9 06:06:30.574510 kubelet[3324]: I0909 06:06:30.574283 3324 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11861c6-53bd-4f46-a3fd-f04b4a0ee127" path="/var/lib/kubelet/pods/a11861c6-53bd-4f46-a3fd-f04b4a0ee127/volumes" Sep 9 06:06:30.748128 containerd[1949]: time="2025-09-09T06:06:30.748103638Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"4eecb4be0ce1e547b088bbe2c5c0032118a3614a4dbd47406d4119be2acafb5c\" pid:5039 exit_status:1 exited_at:{seconds:1757397990 nanos:747881320}" Sep 9 06:06:31.338994 containerd[1949]: time="2025-09-09T06:06:31.338939147Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:31.339106 containerd[1949]: time="2025-09-09T06:06:31.339094067Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 06:06:31.339486 containerd[1949]: time="2025-09-09T06:06:31.339444518Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:31.340345 containerd[1949]: time="2025-09-09T06:06:31.340310543Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:31.340771 containerd[1949]: time="2025-09-09T06:06:31.340753329Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 2.086042306s" Sep 9 06:06:31.340794 containerd[1949]: time="2025-09-09T06:06:31.340775501Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 06:06:31.341923 containerd[1949]: time="2025-09-09T06:06:31.341912040Z" level=info msg="CreateContainer within sandbox 
\"e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 06:06:31.344259 containerd[1949]: time="2025-09-09T06:06:31.344246416Z" level=info msg="Container d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:31.347169 containerd[1949]: time="2025-09-09T06:06:31.347155913Z" level=info msg="CreateContainer within sandbox \"e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7\"" Sep 9 06:06:31.347431 containerd[1949]: time="2025-09-09T06:06:31.347392297Z" level=info msg="StartContainer for \"d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7\"" Sep 9 06:06:31.348051 containerd[1949]: time="2025-09-09T06:06:31.348010494Z" level=info msg="connecting to shim d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7" address="unix:///run/containerd/s/5a9fc1ec31c783c2a1e33818f6bf0accda80b3e92b9e926d4fd7a55feb721be8" protocol=ttrpc version=3 Sep 9 06:06:31.367008 systemd[1]: Started cri-containerd-d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7.scope - libcontainer container d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7. 
Sep 9 06:06:31.397837 containerd[1949]: time="2025-09-09T06:06:31.397814850Z" level=info msg="StartContainer for \"d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7\" returns successfully" Sep 9 06:06:31.398406 containerd[1949]: time="2025-09-09T06:06:31.398389752Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 06:06:32.569171 containerd[1949]: time="2025-09-09T06:06:32.569056273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z8vxh,Uid:8d7e7612-c3bb-4af7-b7b1-26306df4f0f2,Namespace:kube-system,Attempt:0,}" Sep 9 06:06:32.569883 containerd[1949]: time="2025-09-09T06:06:32.569059011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m4bj8,Uid:4ae1f795-3830-4fb7-9c99-070d871677cc,Namespace:calico-system,Attempt:0,}" Sep 9 06:06:32.569883 containerd[1949]: time="2025-09-09T06:06:32.569272957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-qhwhh,Uid:485b14b7-7626-411d-9b65-4f8a429182ab,Namespace:calico-apiserver,Attempt:0,}" Sep 9 06:06:32.626260 systemd-networkd[1870]: califfc7d307867: Link UP Sep 9 06:06:32.626599 systemd-networkd[1870]: califfc7d307867: Gained carrier Sep 9 06:06:32.632665 containerd[1949]: 2025-09-09 06:06:32.584 [INFO][5208] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:32.632665 containerd[1949]: 2025-09-09 06:06:32.591 [INFO][5208] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0 goldmane-54d579b49d- calico-system 4ae1f795-3830-4fb7-9c99-070d871677cc 781 0 2025-09-09 06:06:11 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 
goldmane-54d579b49d-m4bj8 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califfc7d307867 [] [] }} ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-" Sep 9 06:06:32.632665 containerd[1949]: 2025-09-09 06:06:32.591 [INFO][5208] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" Sep 9 06:06:32.632665 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5276] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" HandleID="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Workload="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5276] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" HandleID="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Workload="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00026f6c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"goldmane-54d579b49d-m4bj8", "timestamp":"2025-09-09 06:06:32.605281838 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 
06:06:32.605 [INFO][5276] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5276] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5276] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.609 [INFO][5276] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.611 [INFO][5276] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.614 [INFO][5276] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.616 [INFO][5276] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632821 containerd[1949]: 2025-09-09 06:06:32.617 [INFO][5276] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632961 containerd[1949]: 2025-09-09 06:06:32.617 [INFO][5276] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632961 containerd[1949]: 2025-09-09 06:06:32.618 [INFO][5276] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a Sep 9 06:06:32.632961 containerd[1949]: 2025-09-09 06:06:32.621 [INFO][5276] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 
handle="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632961 containerd[1949]: 2025-09-09 06:06:32.624 [INFO][5276] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.194/26] block=192.168.55.192/26 handle="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632961 containerd[1949]: 2025-09-09 06:06:32.624 [INFO][5276] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.194/26] handle="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.632961 containerd[1949]: 2025-09-09 06:06:32.624 [INFO][5276] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 06:06:32.632961 containerd[1949]: 2025-09-09 06:06:32.624 [INFO][5276] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.194/26] IPv6=[] ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" HandleID="k8s-pod-network.31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Workload="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" Sep 9 06:06:32.633059 containerd[1949]: 2025-09-09 06:06:32.625 [INFO][5208] cni-plugin/k8s.go 418: Populated endpoint ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"4ae1f795-3830-4fb7-9c99-070d871677cc", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 11, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"goldmane-54d579b49d-m4bj8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.55.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califfc7d307867", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:32.633097 containerd[1949]: 2025-09-09 06:06:32.625 [INFO][5208] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.194/32] ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" Sep 9 06:06:32.633097 containerd[1949]: 2025-09-09 06:06:32.625 [INFO][5208] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califfc7d307867 ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" Sep 9 06:06:32.633097 containerd[1949]: 2025-09-09 06:06:32.626 [INFO][5208] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" 
Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" Sep 9 06:06:32.633146 containerd[1949]: 2025-09-09 06:06:32.627 [INFO][5208] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"4ae1f795-3830-4fb7-9c99-070d871677cc", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a", Pod:"goldmane-54d579b49d-m4bj8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.55.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califfc7d307867", MAC:"d6:27:cf:9c:8e:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), 
QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:32.633181 containerd[1949]: 2025-09-09 06:06:32.631 [INFO][5208] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" Namespace="calico-system" Pod="goldmane-54d579b49d-m4bj8" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-goldmane--54d579b49d--m4bj8-eth0" Sep 9 06:06:32.640671 containerd[1949]: time="2025-09-09T06:06:32.640643949Z" level=info msg="connecting to shim 31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a" address="unix:///run/containerd/s/2e4c8f93211d0d30381781391ea2be19438f2d2aa25cc4d2dc06e6996dfeb22d" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:32.666946 systemd[1]: Started cri-containerd-31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a.scope - libcontainer container 31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a. Sep 9 06:06:32.698707 containerd[1949]: time="2025-09-09T06:06:32.698646506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-m4bj8,Uid:4ae1f795-3830-4fb7-9c99-070d871677cc,Namespace:calico-system,Attempt:0,} returns sandbox id \"31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a\"" Sep 9 06:06:32.724382 systemd-networkd[1870]: calicbbb48b48c2: Link UP Sep 9 06:06:32.724533 systemd-networkd[1870]: calicbbb48b48c2: Gained carrier Sep 9 06:06:32.730022 containerd[1949]: 2025-09-09 06:06:32.584 [INFO][5203] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:32.730022 containerd[1949]: 2025-09-09 06:06:32.591 [INFO][5203] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0 coredns-668d6bf9bc- kube-system 8d7e7612-c3bb-4af7-b7b1-26306df4f0f2 782 0 2025-09-09 06:06:02 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 coredns-668d6bf9bc-z8vxh eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicbbb48b48c2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-" Sep 9 06:06:32.730022 containerd[1949]: 2025-09-09 06:06:32.591 [INFO][5203] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" Sep 9 06:06:32.730022 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5273] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" HandleID="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Workload="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5273] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" HandleID="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Workload="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f770), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"coredns-668d6bf9bc-z8vxh", "timestamp":"2025-09-09 06:06:32.605293394 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5273] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.624 [INFO][5273] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.624 [INFO][5273] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.709 [INFO][5273] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.711 [INFO][5273] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.715 [INFO][5273] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.716 [INFO][5273] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730204 containerd[1949]: 2025-09-09 06:06:32.717 [INFO][5273] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730356 containerd[1949]: 2025-09-09 06:06:32.717 [INFO][5273] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730356 containerd[1949]: 2025-09-09 06:06:32.718 [INFO][5273] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0 Sep 9 06:06:32.730356 containerd[1949]: 2025-09-09 06:06:32.720 [INFO][5273] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 handle="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730356 containerd[1949]: 2025-09-09 06:06:32.722 [INFO][5273] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.195/26] block=192.168.55.192/26 handle="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730356 containerd[1949]: 2025-09-09 06:06:32.722 [INFO][5273] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.195/26] handle="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.730356 containerd[1949]: 2025-09-09 06:06:32.722 [INFO][5273] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 06:06:32.730356 containerd[1949]: 2025-09-09 06:06:32.722 [INFO][5273] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.195/26] IPv6=[] ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" HandleID="k8s-pod-network.e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Workload="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" Sep 9 06:06:32.730461 containerd[1949]: 2025-09-09 06:06:32.723 [INFO][5203] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"8d7e7612-c3bb-4af7-b7b1-26306df4f0f2", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"coredns-668d6bf9bc-z8vxh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calicbbb48b48c2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:32.730461 containerd[1949]: 2025-09-09 06:06:32.723 [INFO][5203] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.195/32] ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" Sep 9 06:06:32.730461 containerd[1949]: 2025-09-09 06:06:32.723 [INFO][5203] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicbbb48b48c2 ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" Sep 9 06:06:32.730461 containerd[1949]: 2025-09-09 06:06:32.724 [INFO][5203] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" Sep 9 06:06:32.730461 containerd[1949]: 2025-09-09 06:06:32.724 [INFO][5203] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" 
WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"8d7e7612-c3bb-4af7-b7b1-26306df4f0f2", ResourceVersion:"782", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0", Pod:"coredns-668d6bf9bc-z8vxh", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicbbb48b48c2", MAC:"1e:e0:0a:5c:86:8d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:32.730461 containerd[1949]: 
2025-09-09 06:06:32.729 [INFO][5203] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" Namespace="kube-system" Pod="coredns-668d6bf9bc-z8vxh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--z8vxh-eth0" Sep 9 06:06:32.737874 containerd[1949]: time="2025-09-09T06:06:32.737850736Z" level=info msg="connecting to shim e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0" address="unix:///run/containerd/s/405b4da304b77a14a73192c0eabc9ae762549da615ec26a940abcbfa347daa4f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:32.755910 systemd[1]: Started cri-containerd-e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0.scope - libcontainer container e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0. Sep 9 06:06:32.786584 containerd[1949]: time="2025-09-09T06:06:32.786534738Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-z8vxh,Uid:8d7e7612-c3bb-4af7-b7b1-26306df4f0f2,Namespace:kube-system,Attempt:0,} returns sandbox id \"e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0\"" Sep 9 06:06:32.787654 containerd[1949]: time="2025-09-09T06:06:32.787639716Z" level=info msg="CreateContainer within sandbox \"e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 06:06:32.790848 containerd[1949]: time="2025-09-09T06:06:32.790807796Z" level=info msg="Container ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:32.793313 containerd[1949]: time="2025-09-09T06:06:32.793271012Z" level=info msg="CreateContainer within sandbox \"e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1\"" Sep 9 
06:06:32.793443 containerd[1949]: time="2025-09-09T06:06:32.793432489Z" level=info msg="StartContainer for \"ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1\"" Sep 9 06:06:32.793859 containerd[1949]: time="2025-09-09T06:06:32.793818091Z" level=info msg="connecting to shim ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1" address="unix:///run/containerd/s/405b4da304b77a14a73192c0eabc9ae762549da615ec26a940abcbfa347daa4f" protocol=ttrpc version=3 Sep 9 06:06:32.805912 systemd[1]: Started cri-containerd-ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1.scope - libcontainer container ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1. Sep 9 06:06:32.819116 containerd[1949]: time="2025-09-09T06:06:32.819092908Z" level=info msg="StartContainer for \"ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1\" returns successfully" Sep 9 06:06:32.827328 systemd-networkd[1870]: calice4d7147e75: Link UP Sep 9 06:06:32.827450 systemd-networkd[1870]: calice4d7147e75: Gained carrier Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.584 [INFO][5210] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.591 [INFO][5210] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0 calico-apiserver-5454b67db4- calico-apiserver 485b14b7-7626-411d-9b65-4f8a429182ab 777 0 2025-09-09 06:06:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5454b67db4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 calico-apiserver-5454b67db4-qhwhh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calice4d7147e75 
[] [] }} ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.591 [INFO][5210] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5277] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" HandleID="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5277] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" HandleID="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e76b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"calico-apiserver-5454b67db4-qhwhh", "timestamp":"2025-09-09 06:06:32.605340837 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.605 [INFO][5277] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.722 [INFO][5277] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.722 [INFO][5277] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.809 [INFO][5277] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.812 [INFO][5277] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.816 [INFO][5277] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.817 [INFO][5277] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.819 [INFO][5277] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.819 [INFO][5277] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.820 [INFO][5277] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.822 [INFO][5277] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 
handle="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.825 [INFO][5277] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.196/26] block=192.168.55.192/26 handle="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.825 [INFO][5277] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.196/26] handle="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.825 [INFO][5277] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 06:06:32.833751 containerd[1949]: 2025-09-09 06:06:32.825 [INFO][5277] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.196/26] IPv6=[] ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" HandleID="k8s-pod-network.b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" Sep 9 06:06:32.834319 containerd[1949]: 2025-09-09 06:06:32.826 [INFO][5210] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0", GenerateName:"calico-apiserver-5454b67db4-", Namespace:"calico-apiserver", SelfLink:"", UID:"485b14b7-7626-411d-9b65-4f8a429182ab", ResourceVersion:"777", Generation:0, 
CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5454b67db4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"calico-apiserver-5454b67db4-qhwhh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calice4d7147e75", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:32.834319 containerd[1949]: 2025-09-09 06:06:32.826 [INFO][5210] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.196/32] ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" Sep 9 06:06:32.834319 containerd[1949]: 2025-09-09 06:06:32.826 [INFO][5210] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calice4d7147e75 ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" Sep 9 06:06:32.834319 containerd[1949]: 2025-09-09 06:06:32.827 [INFO][5210] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" Sep 9 06:06:32.834319 containerd[1949]: 2025-09-09 06:06:32.827 [INFO][5210] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0", GenerateName:"calico-apiserver-5454b67db4-", Namespace:"calico-apiserver", SelfLink:"", UID:"485b14b7-7626-411d-9b65-4f8a429182ab", ResourceVersion:"777", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5454b67db4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed", Pod:"calico-apiserver-5454b67db4-qhwhh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.196/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calice4d7147e75", MAC:"fa:f6:0e:82:44:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:32.834319 containerd[1949]: 2025-09-09 06:06:32.832 [INFO][5210] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-qhwhh" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--qhwhh-eth0" Sep 9 06:06:32.841718 containerd[1949]: time="2025-09-09T06:06:32.841689716Z" level=info msg="connecting to shim b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed" address="unix:///run/containerd/s/7960969818144d4dad1061f245ab760fd8904f2bf5c0680aab3b29bb935c6310" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:32.871849 systemd[1]: Started cri-containerd-b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed.scope - libcontainer container b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed. 
Sep 9 06:06:32.898953 containerd[1949]: time="2025-09-09T06:06:32.898928316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-qhwhh,Uid:485b14b7-7626-411d-9b65-4f8a429182ab,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed\"" Sep 9 06:06:33.567849 containerd[1949]: time="2025-09-09T06:06:33.567828669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-jzppx,Uid:8724bcd6-4402-4119-80d5-dd4ddd8684aa,Namespace:calico-apiserver,Attempt:0,}" Sep 9 06:06:33.617996 systemd-networkd[1870]: cali7139c9a6ff8: Link UP Sep 9 06:06:33.618139 systemd-networkd[1870]: cali7139c9a6ff8: Gained carrier Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.579 [INFO][5582] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.585 [INFO][5582] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0 calico-apiserver-5454b67db4- calico-apiserver 8724bcd6-4402-4119-80d5-dd4ddd8684aa 779 0 2025-09-09 06:06:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5454b67db4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 calico-apiserver-5454b67db4-jzppx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7139c9a6ff8 [] [] }} ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.585 
[INFO][5582] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.599 [INFO][5605] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" HandleID="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.599 [INFO][5605] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" HandleID="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e7730), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"calico-apiserver-5454b67db4-jzppx", "timestamp":"2025-09-09 06:06:33.59913721 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.599 [INFO][5605] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.599 [INFO][5605] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.599 [INFO][5605] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.603 [INFO][5605] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.605 [INFO][5605] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.607 [INFO][5605] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.608 [INFO][5605] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.609 [INFO][5605] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.609 [INFO][5605] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.610 [INFO][5605] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.612 [INFO][5605] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 handle="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.616 [INFO][5605] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.197/26] block=192.168.55.192/26 handle="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.616 [INFO][5605] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.197/26] handle="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.616 [INFO][5605] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 06:06:33.624534 containerd[1949]: 2025-09-09 06:06:33.616 [INFO][5605] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.197/26] IPv6=[] ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" HandleID="k8s-pod-network.4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" Sep 9 06:06:33.625373 containerd[1949]: 2025-09-09 06:06:33.617 [INFO][5582] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0", GenerateName:"calico-apiserver-5454b67db4-", Namespace:"calico-apiserver", SelfLink:"", UID:"8724bcd6-4402-4119-80d5-dd4ddd8684aa", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"5454b67db4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"calico-apiserver-5454b67db4-jzppx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7139c9a6ff8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:33.625373 containerd[1949]: 2025-09-09 06:06:33.617 [INFO][5582] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.197/32] ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" Sep 9 06:06:33.625373 containerd[1949]: 2025-09-09 06:06:33.617 [INFO][5582] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7139c9a6ff8 ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" Sep 9 06:06:33.625373 containerd[1949]: 2025-09-09 06:06:33.618 [INFO][5582] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" 
WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" Sep 9 06:06:33.625373 containerd[1949]: 2025-09-09 06:06:33.618 [INFO][5582] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0", GenerateName:"calico-apiserver-5454b67db4-", Namespace:"calico-apiserver", SelfLink:"", UID:"8724bcd6-4402-4119-80d5-dd4ddd8684aa", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5454b67db4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db", Pod:"calico-apiserver-5454b67db4-jzppx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7139c9a6ff8", MAC:"fa:9e:1a:f1:57:f1", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:33.625373 containerd[1949]: 2025-09-09 06:06:33.623 [INFO][5582] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" Namespace="calico-apiserver" Pod="calico-apiserver-5454b67db4-jzppx" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--apiserver--5454b67db4--jzppx-eth0" Sep 9 06:06:33.632420 containerd[1949]: time="2025-09-09T06:06:33.632388661Z" level=info msg="connecting to shim 4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db" address="unix:///run/containerd/s/a4b199f2588f570fe792e7e4682f87946b895573b9c2ace0626172c1ebc2d522" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:33.653846 systemd[1]: Started cri-containerd-4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db.scope - libcontainer container 4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db. 
Sep 9 06:06:33.680353 containerd[1949]: time="2025-09-09T06:06:33.680330815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5454b67db4-jzppx,Uid:8724bcd6-4402-4119-80d5-dd4ddd8684aa,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db\"" Sep 9 06:06:33.718945 kubelet[3324]: I0909 06:06:33.718884 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-z8vxh" podStartSLOduration=31.718865298 podStartE2EDuration="31.718865298s" podCreationTimestamp="2025-09-09 06:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 06:06:33.718179889 +0000 UTC m=+37.196824950" watchObservedRunningTime="2025-09-09 06:06:33.718865298 +0000 UTC m=+37.197510356" Sep 9 06:06:33.789420 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1750133364.mount: Deactivated successfully. 
Sep 9 06:06:33.793913 containerd[1949]: time="2025-09-09T06:06:33.793893714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:33.794098 containerd[1949]: time="2025-09-09T06:06:33.794081157Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 06:06:33.794467 containerd[1949]: time="2025-09-09T06:06:33.794453872Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:33.795369 containerd[1949]: time="2025-09-09T06:06:33.795357243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:33.795798 containerd[1949]: time="2025-09-09T06:06:33.795780046Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.397372841s" Sep 9 06:06:33.795798 containerd[1949]: time="2025-09-09T06:06:33.795794510Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 06:06:33.796250 containerd[1949]: time="2025-09-09T06:06:33.796240910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 06:06:33.796816 containerd[1949]: time="2025-09-09T06:06:33.796805588Z" level=info msg="CreateContainer within sandbox 
\"e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 06:06:33.799284 containerd[1949]: time="2025-09-09T06:06:33.799270885Z" level=info msg="Container 742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:33.802240 containerd[1949]: time="2025-09-09T06:06:33.802226217Z" level=info msg="CreateContainer within sandbox \"e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06\"" Sep 9 06:06:33.802482 containerd[1949]: time="2025-09-09T06:06:33.802469851Z" level=info msg="StartContainer for \"742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06\"" Sep 9 06:06:33.802997 containerd[1949]: time="2025-09-09T06:06:33.802983188Z" level=info msg="connecting to shim 742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06" address="unix:///run/containerd/s/5a9fc1ec31c783c2a1e33818f6bf0accda80b3e92b9e926d4fd7a55feb721be8" protocol=ttrpc version=3 Sep 9 06:06:33.823802 systemd[1]: Started cri-containerd-742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06.scope - libcontainer container 742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06. 
Sep 9 06:06:33.852522 containerd[1949]: time="2025-09-09T06:06:33.852497052Z" level=info msg="StartContainer for \"742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06\" returns successfully" Sep 9 06:06:34.024041 systemd-networkd[1870]: calice4d7147e75: Gained IPv6LL Sep 9 06:06:34.215927 systemd-networkd[1870]: califfc7d307867: Gained IPv6LL Sep 9 06:06:34.536044 systemd-networkd[1870]: calicbbb48b48c2: Gained IPv6LL Sep 9 06:06:34.720910 kubelet[3324]: I0909 06:06:34.720864 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-66d87767b4-dr68m" podStartSLOduration=2.179247705 podStartE2EDuration="6.720849766s" podCreationTimestamp="2025-09-09 06:06:28 +0000 UTC" firstStartedPulling="2025-09-09 06:06:29.25457697 +0000 UTC m=+32.733222030" lastFinishedPulling="2025-09-09 06:06:33.796179032 +0000 UTC m=+37.274824091" observedRunningTime="2025-09-09 06:06:34.720576116 +0000 UTC m=+38.199221177" watchObservedRunningTime="2025-09-09 06:06:34.720849766 +0000 UTC m=+38.199494825" Sep 9 06:06:35.112900 systemd-networkd[1870]: cali7139c9a6ff8: Gained IPv6LL Sep 9 06:06:35.568767 containerd[1949]: time="2025-09-09T06:06:35.568530425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m5t7p,Uid:4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55,Namespace:kube-system,Attempt:0,}" Sep 9 06:06:35.624200 systemd-networkd[1870]: cali477657555ba: Link UP Sep 9 06:06:35.624432 systemd-networkd[1870]: cali477657555ba: Gained carrier Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.580 [INFO][5817] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.587 [INFO][5817] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0 coredns-668d6bf9bc- kube-system 4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55 773 0 2025-09-09 06:06:02 
+0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 coredns-668d6bf9bc-m5t7p eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali477657555ba [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.587 [INFO][5817] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.600 [INFO][5839] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" HandleID="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Workload="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.600 [INFO][5839] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" HandleID="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Workload="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033e5b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"coredns-668d6bf9bc-m5t7p", "timestamp":"2025-09-09 06:06:35.600094942 +0000 UTC"}, 
Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.600 [INFO][5839] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.600 [INFO][5839] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.600 [INFO][5839] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.604 [INFO][5839] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.607 [INFO][5839] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.610 [INFO][5839] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.612 [INFO][5839] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.614 [INFO][5839] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.614 [INFO][5839] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.615 [INFO][5839] 
ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547 Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.618 [INFO][5839] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 handle="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.621 [INFO][5839] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.198/26] block=192.168.55.192/26 handle="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.621 [INFO][5839] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.198/26] handle="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.621 [INFO][5839] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 06:06:35.649901 containerd[1949]: 2025-09-09 06:06:35.621 [INFO][5839] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.198/26] IPv6=[] ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" HandleID="k8s-pod-network.a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Workload="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" Sep 9 06:06:35.651045 containerd[1949]: 2025-09-09 06:06:35.623 [INFO][5817] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"coredns-668d6bf9bc-m5t7p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali477657555ba", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:35.651045 containerd[1949]: 2025-09-09 06:06:35.623 [INFO][5817] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.198/32] ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" Sep 9 06:06:35.651045 containerd[1949]: 2025-09-09 06:06:35.623 [INFO][5817] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali477657555ba ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" Sep 9 06:06:35.651045 containerd[1949]: 2025-09-09 06:06:35.624 [INFO][5817] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" Sep 9 06:06:35.651045 containerd[1949]: 2025-09-09 06:06:35.624 [INFO][5817] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" 
WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547", Pod:"coredns-668d6bf9bc-m5t7p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali477657555ba", MAC:"92:37:68:e6:ac:4b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:35.651045 containerd[1949]: 
2025-09-09 06:06:35.647 [INFO][5817] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" Namespace="kube-system" Pod="coredns-668d6bf9bc-m5t7p" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-coredns--668d6bf9bc--m5t7p-eth0" Sep 9 06:06:35.660530 containerd[1949]: time="2025-09-09T06:06:35.660466304Z" level=info msg="connecting to shim a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547" address="unix:///run/containerd/s/41d7063b5c95f3b6d0c4814e5676f372b8d089b1bdd83c44c82d0d1d0c330357" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:35.682828 systemd[1]: Started cri-containerd-a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547.scope - libcontainer container a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547. Sep 9 06:06:35.719071 containerd[1949]: time="2025-09-09T06:06:35.719046176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m5t7p,Uid:4e5bf45a-d9cc-4d9e-ab3f-a2ef869f4d55,Namespace:kube-system,Attempt:0,} returns sandbox id \"a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547\"" Sep 9 06:06:35.719958 containerd[1949]: time="2025-09-09T06:06:35.719945183Z" level=info msg="CreateContainer within sandbox \"a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 06:06:35.723230 containerd[1949]: time="2025-09-09T06:06:35.723216961Z" level=info msg="Container 0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:35.725527 containerd[1949]: time="2025-09-09T06:06:35.725510580Z" level=info msg="CreateContainer within sandbox \"a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c\"" Sep 9 
06:06:35.725735 containerd[1949]: time="2025-09-09T06:06:35.725724690Z" level=info msg="StartContainer for \"0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c\"" Sep 9 06:06:35.726113 containerd[1949]: time="2025-09-09T06:06:35.726101063Z" level=info msg="connecting to shim 0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c" address="unix:///run/containerd/s/41d7063b5c95f3b6d0c4814e5676f372b8d089b1bdd83c44c82d0d1d0c330357" protocol=ttrpc version=3 Sep 9 06:06:35.742809 systemd[1]: Started cri-containerd-0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c.scope - libcontainer container 0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c. Sep 9 06:06:35.758467 containerd[1949]: time="2025-09-09T06:06:35.758419155Z" level=info msg="StartContainer for \"0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c\" returns successfully" Sep 9 06:06:36.568719 containerd[1949]: time="2025-09-09T06:06:36.568693147Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f466fc98-st25q,Uid:ff94f516-177d-42b6-94ab-d8e1a6cb501b,Namespace:calico-system,Attempt:0,}" Sep 9 06:06:36.568830 containerd[1949]: time="2025-09-09T06:06:36.568744603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lwj49,Uid:70ad17b8-206d-4ff7-a93b-019a4878fd81,Namespace:calico-system,Attempt:0,}" Sep 9 06:06:36.595154 containerd[1949]: time="2025-09-09T06:06:36.595102892Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:36.595248 containerd[1949]: time="2025-09-09T06:06:36.595230302Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 06:06:36.595613 containerd[1949]: time="2025-09-09T06:06:36.595600850Z" level=info msg="ImageCreate event 
name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:36.597277 containerd[1949]: time="2025-09-09T06:06:36.597261401Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:36.597542 containerd[1949]: time="2025-09-09T06:06:36.597526528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.801271238s" Sep 9 06:06:36.597584 containerd[1949]: time="2025-09-09T06:06:36.597545151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 06:06:36.598012 containerd[1949]: time="2025-09-09T06:06:36.597999704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 06:06:36.598737 containerd[1949]: time="2025-09-09T06:06:36.598721417Z" level=info msg="CreateContainer within sandbox \"31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 06:06:36.602171 containerd[1949]: time="2025-09-09T06:06:36.602145601Z" level=info msg="Container 33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:36.605019 containerd[1949]: time="2025-09-09T06:06:36.605004346Z" level=info msg="CreateContainer within sandbox \"31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns 
container id \"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\"" Sep 9 06:06:36.605303 containerd[1949]: time="2025-09-09T06:06:36.605288807Z" level=info msg="StartContainer for \"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\"" Sep 9 06:06:36.605869 containerd[1949]: time="2025-09-09T06:06:36.605851823Z" level=info msg="connecting to shim 33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f" address="unix:///run/containerd/s/2e4c8f93211d0d30381781391ea2be19438f2d2aa25cc4d2dc06e6996dfeb22d" protocol=ttrpc version=3 Sep 9 06:06:36.619663 systemd-networkd[1870]: calib82c1226374: Link UP Sep 9 06:06:36.619812 systemd-networkd[1870]: calib82c1226374: Gained carrier Sep 9 06:06:36.619819 systemd[1]: Started cri-containerd-33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f.scope - libcontainer container 33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f. Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.580 [INFO][6006] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.587 [INFO][6006] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0 calico-kube-controllers-55f466fc98- calico-system ff94f516-177d-42b6-94ab-d8e1a6cb501b 780 0 2025-09-09 06:06:11 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:55f466fc98 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 calico-kube-controllers-55f466fc98-st25q eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib82c1226374 [] [] }} 
ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.587 [INFO][6006] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.601 [INFO][6051] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" HandleID="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.602 [INFO][6051] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" HandleID="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f850), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"calico-kube-controllers-55f466fc98-st25q", "timestamp":"2025-09-09 06:06:36.601987539 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 
06:06:36.602 [INFO][6051] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.602 [INFO][6051] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.602 [INFO][6051] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.606 [INFO][6051] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.608 [INFO][6051] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.610 [INFO][6051] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.611 [INFO][6051] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.612 [INFO][6051] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.612 [INFO][6051] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.613 [INFO][6051] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18 Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.614 [INFO][6051] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 
handle="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.617 [INFO][6051] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.199/26] block=192.168.55.192/26 handle="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.617 [INFO][6051] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.199/26] handle="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.617 [INFO][6051] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 06:06:36.626851 containerd[1949]: 2025-09-09 06:06:36.618 [INFO][6051] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.199/26] IPv6=[] ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" HandleID="k8s-pod-network.bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Workload="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" Sep 9 06:06:36.627245 containerd[1949]: 2025-09-09 06:06:36.618 [INFO][6006] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0", GenerateName:"calico-kube-controllers-55f466fc98-", Namespace:"calico-system", SelfLink:"", UID:"ff94f516-177d-42b6-94ab-d8e1a6cb501b", 
ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55f466fc98", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"calico-kube-controllers-55f466fc98-st25q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.55.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib82c1226374", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:36.627245 containerd[1949]: 2025-09-09 06:06:36.618 [INFO][6006] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.199/32] ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" Sep 9 06:06:36.627245 containerd[1949]: 2025-09-09 06:06:36.619 [INFO][6006] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib82c1226374 ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" 
Sep 9 06:06:36.627245 containerd[1949]: 2025-09-09 06:06:36.620 [INFO][6006] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" Sep 9 06:06:36.627245 containerd[1949]: 2025-09-09 06:06:36.620 [INFO][6006] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0", GenerateName:"calico-kube-controllers-55f466fc98-", Namespace:"calico-system", SelfLink:"", UID:"ff94f516-177d-42b6-94ab-d8e1a6cb501b", ResourceVersion:"780", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"55f466fc98", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18", 
Pod:"calico-kube-controllers-55f466fc98-st25q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.55.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib82c1226374", MAC:"3e:6b:a5:a1:4a:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:36.627245 containerd[1949]: 2025-09-09 06:06:36.625 [INFO][6006] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" Namespace="calico-system" Pod="calico-kube-controllers-55f466fc98-st25q" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-calico--kube--controllers--55f466fc98--st25q-eth0" Sep 9 06:06:36.634429 containerd[1949]: time="2025-09-09T06:06:36.634402222Z" level=info msg="connecting to shim bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18" address="unix:///run/containerd/s/968f05eb84d2c96edeceba458aa21e1945fd99749455edaa9f04278e4367be07" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:36.661825 systemd[1]: Started cri-containerd-bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18.scope - libcontainer container bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18. 
Sep 9 06:06:36.663908 containerd[1949]: time="2025-09-09T06:06:36.663881683Z" level=info msg="StartContainer for \"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" returns successfully" Sep 9 06:06:36.688622 containerd[1949]: time="2025-09-09T06:06:36.688600679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-55f466fc98-st25q,Uid:ff94f516-177d-42b6-94ab-d8e1a6cb501b,Namespace:calico-system,Attempt:0,} returns sandbox id \"bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18\"" Sep 9 06:06:36.733562 systemd-networkd[1870]: cali0baf60a4ffc: Link UP Sep 9 06:06:36.734397 systemd-networkd[1870]: cali0baf60a4ffc: Gained carrier Sep 9 06:06:36.750706 kubelet[3324]: I0909 06:06:36.749888 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-m5t7p" podStartSLOduration=34.749848965 podStartE2EDuration="34.749848965s" podCreationTimestamp="2025-09-09 06:06:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 06:06:36.748948459 +0000 UTC m=+40.227593581" watchObservedRunningTime="2025-09-09 06:06:36.749848965 +0000 UTC m=+40.228494057" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.581 [INFO][6004] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.587 [INFO][6004] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0 csi-node-driver- calico-system 70ad17b8-206d-4ff7-a93b-019a4878fd81 667 0 2025-09-09 06:06:11 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4452.0.0-n-7ab43648c0 csi-node-driver-lwj49 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0baf60a4ffc [] [] }} ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.587 [INFO][6004] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.602 [INFO][6052] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" HandleID="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Workload="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.602 [INFO][6052] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" HandleID="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Workload="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f610), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452.0.0-n-7ab43648c0", "pod":"csi-node-driver-lwj49", "timestamp":"2025-09-09 06:06:36.60206365 +0000 UTC"}, Hostname:"ci-4452.0.0-n-7ab43648c0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.602 [INFO][6052] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.618 [INFO][6052] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.618 [INFO][6052] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452.0.0-n-7ab43648c0' Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.707 [INFO][6052] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.709 [INFO][6052] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.711 [INFO][6052] ipam/ipam.go 511: Trying affinity for 192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.712 [INFO][6052] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.713 [INFO][6052] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.192/26 host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.713 [INFO][6052] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.192/26 handle="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.714 [INFO][6052] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220 Sep 9 06:06:36.754789 containerd[1949]: 
2025-09-09 06:06:36.716 [INFO][6052] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.192/26 handle="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.722 [INFO][6052] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.200/26] block=192.168.55.192/26 handle="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.722 [INFO][6052] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.200/26] handle="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" host="ci-4452.0.0-n-7ab43648c0" Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.722 [INFO][6052] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 06:06:36.754789 containerd[1949]: 2025-09-09 06:06:36.722 [INFO][6052] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.200/26] IPv6=[] ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" HandleID="k8s-pod-network.1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Workload="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" Sep 9 06:06:36.756490 containerd[1949]: 2025-09-09 06:06:36.727 [INFO][6004] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"70ad17b8-206d-4ff7-a93b-019a4878fd81", 
ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"", Pod:"csi-node-driver-lwj49", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.55.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0baf60a4ffc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:36.756490 containerd[1949]: 2025-09-09 06:06:36.727 [INFO][6004] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.200/32] ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" Sep 9 06:06:36.756490 containerd[1949]: 2025-09-09 06:06:36.727 [INFO][6004] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0baf60a4ffc ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" Sep 9 06:06:36.756490 containerd[1949]: 2025-09-09 06:06:36.734 
[INFO][6004] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" Sep 9 06:06:36.756490 containerd[1949]: 2025-09-09 06:06:36.735 [INFO][6004] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"70ad17b8-206d-4ff7-a93b-019a4878fd81", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 6, 6, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452.0.0-n-7ab43648c0", ContainerID:"1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220", Pod:"csi-node-driver-lwj49", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.55.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0baf60a4ffc", MAC:"82:03:b8:f6:af:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 06:06:36.756490 containerd[1949]: 2025-09-09 06:06:36.751 [INFO][6004] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" Namespace="calico-system" Pod="csi-node-driver-lwj49" WorkloadEndpoint="ci--4452.0.0--n--7ab43648c0-k8s-csi--node--driver--lwj49-eth0" Sep 9 06:06:36.767361 containerd[1949]: time="2025-09-09T06:06:36.767304218Z" level=info msg="connecting to shim 1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220" address="unix:///run/containerd/s/e3233342d26cd80db6f2a6080b6dc1bb33eb180462aa1191d313e557d97b2725" namespace=k8s.io protocol=ttrpc version=3 Sep 9 06:06:36.770658 kubelet[3324]: I0909 06:06:36.770561 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-m4bj8" podStartSLOduration=21.871790064 podStartE2EDuration="25.770525215s" podCreationTimestamp="2025-09-09 06:06:11 +0000 UTC" firstStartedPulling="2025-09-09 06:06:32.699204527 +0000 UTC m=+36.177849586" lastFinishedPulling="2025-09-09 06:06:36.597939676 +0000 UTC m=+40.076584737" observedRunningTime="2025-09-09 06:06:36.762771626 +0000 UTC m=+40.241416686" watchObservedRunningTime="2025-09-09 06:06:36.770525215 +0000 UTC m=+40.249170273" Sep 9 06:06:36.792793 systemd[1]: Started cri-containerd-1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220.scope - libcontainer container 1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220. 
Sep 9 06:06:36.804051 containerd[1949]: time="2025-09-09T06:06:36.804004057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lwj49,Uid:70ad17b8-206d-4ff7-a93b-019a4878fd81,Namespace:calico-system,Attempt:0,} returns sandbox id \"1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220\"" Sep 9 06:06:36.840025 systemd-networkd[1870]: cali477657555ba: Gained IPv6LL Sep 9 06:06:37.739635 kubelet[3324]: I0909 06:06:37.739614 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 06:06:38.057113 kubelet[3324]: I0909 06:06:38.056963 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 06:06:38.530653 systemd-networkd[1870]: vxlan.calico: Link UP Sep 9 06:06:38.530657 systemd-networkd[1870]: vxlan.calico: Gained carrier Sep 9 06:06:38.567863 systemd-networkd[1870]: cali0baf60a4ffc: Gained IPv6LL Sep 9 06:06:38.568052 systemd-networkd[1870]: calib82c1226374: Gained IPv6LL Sep 9 06:06:39.321039 containerd[1949]: time="2025-09-09T06:06:39.321009977Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:39.321262 containerd[1949]: time="2025-09-09T06:06:39.321208892Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 06:06:39.321582 containerd[1949]: time="2025-09-09T06:06:39.321570912Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:39.322396 containerd[1949]: time="2025-09-09T06:06:39.322355295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:39.322789 containerd[1949]: 
time="2025-09-09T06:06:39.322748683Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.724735089s" Sep 9 06:06:39.322789 containerd[1949]: time="2025-09-09T06:06:39.322762253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 06:06:39.323269 containerd[1949]: time="2025-09-09T06:06:39.323229966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 06:06:39.323791 containerd[1949]: time="2025-09-09T06:06:39.323750047Z" level=info msg="CreateContainer within sandbox \"b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 06:06:39.326323 containerd[1949]: time="2025-09-09T06:06:39.326310579Z" level=info msg="Container daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:39.329093 containerd[1949]: time="2025-09-09T06:06:39.329053009Z" level=info msg="CreateContainer within sandbox \"b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49\"" Sep 9 06:06:39.329294 containerd[1949]: time="2025-09-09T06:06:39.329279553Z" level=info msg="StartContainer for \"daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49\"" Sep 9 06:06:39.329888 containerd[1949]: time="2025-09-09T06:06:39.329847764Z" level=info msg="connecting to shim daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49" 
address="unix:///run/containerd/s/7960969818144d4dad1061f245ab760fd8904f2bf5c0680aab3b29bb935c6310" protocol=ttrpc version=3 Sep 9 06:06:39.354992 systemd[1]: Started cri-containerd-daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49.scope - libcontainer container daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49. Sep 9 06:06:39.386483 containerd[1949]: time="2025-09-09T06:06:39.386434887Z" level=info msg="StartContainer for \"daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49\" returns successfully" Sep 9 06:06:39.592775 systemd-networkd[1870]: vxlan.calico: Gained IPv6LL Sep 9 06:06:39.773335 kubelet[3324]: I0909 06:06:39.773219 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5454b67db4-qhwhh" podStartSLOduration=24.349480619 podStartE2EDuration="30.773177244s" podCreationTimestamp="2025-09-09 06:06:09 +0000 UTC" firstStartedPulling="2025-09-09 06:06:32.899479553 +0000 UTC m=+36.378124613" lastFinishedPulling="2025-09-09 06:06:39.323176178 +0000 UTC m=+42.801821238" observedRunningTime="2025-09-09 06:06:39.771833388 +0000 UTC m=+43.250478532" watchObservedRunningTime="2025-09-09 06:06:39.773177244 +0000 UTC m=+43.251822351" Sep 9 06:06:39.797058 containerd[1949]: time="2025-09-09T06:06:39.797034465Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:39.797205 containerd[1949]: time="2025-09-09T06:06:39.797191008Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 06:06:39.798433 containerd[1949]: time="2025-09-09T06:06:39.798417772Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 475.172446ms" Sep 9 06:06:39.798476 containerd[1949]: time="2025-09-09T06:06:39.798434938Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 06:06:39.798914 containerd[1949]: time="2025-09-09T06:06:39.798900502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 06:06:39.799468 containerd[1949]: time="2025-09-09T06:06:39.799455673Z" level=info msg="CreateContainer within sandbox \"4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 06:06:39.802076 containerd[1949]: time="2025-09-09T06:06:39.802058408Z" level=info msg="Container 448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:39.804879 containerd[1949]: time="2025-09-09T06:06:39.804861495Z" level=info msg="CreateContainer within sandbox \"4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac\"" Sep 9 06:06:39.805132 containerd[1949]: time="2025-09-09T06:06:39.805122141Z" level=info msg="StartContainer for \"448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac\"" Sep 9 06:06:39.805846 containerd[1949]: time="2025-09-09T06:06:39.805830541Z" level=info msg="connecting to shim 448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac" address="unix:///run/containerd/s/a4b199f2588f570fe792e7e4682f87946b895573b9c2ace0626172c1ebc2d522" protocol=ttrpc version=3 Sep 9 06:06:39.822807 systemd[1]: Started cri-containerd-448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac.scope - libcontainer 
container 448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac. Sep 9 06:06:39.853852 containerd[1949]: time="2025-09-09T06:06:39.853826739Z" level=info msg="StartContainer for \"448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac\" returns successfully" Sep 9 06:06:40.762339 kubelet[3324]: I0909 06:06:40.762258 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 06:06:40.782451 kubelet[3324]: I0909 06:06:40.782289 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5454b67db4-jzppx" podStartSLOduration=25.664390914 podStartE2EDuration="31.782243025s" podCreationTimestamp="2025-09-09 06:06:09 +0000 UTC" firstStartedPulling="2025-09-09 06:06:33.680983582 +0000 UTC m=+37.159628642" lastFinishedPulling="2025-09-09 06:06:39.798835692 +0000 UTC m=+43.277480753" observedRunningTime="2025-09-09 06:06:40.781781658 +0000 UTC m=+44.260426778" watchObservedRunningTime="2025-09-09 06:06:40.782243025 +0000 UTC m=+44.260888138" Sep 9 06:06:42.586289 containerd[1949]: time="2025-09-09T06:06:42.586262439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:42.586523 containerd[1949]: time="2025-09-09T06:06:42.586493097Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 06:06:42.586872 containerd[1949]: time="2025-09-09T06:06:42.586859851Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:42.587735 containerd[1949]: time="2025-09-09T06:06:42.587719903Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:42.588152 containerd[1949]: time="2025-09-09T06:06:42.588138617Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 2.78922055s" Sep 9 06:06:42.588184 containerd[1949]: time="2025-09-09T06:06:42.588155597Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 06:06:42.588588 containerd[1949]: time="2025-09-09T06:06:42.588576544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 06:06:42.591600 containerd[1949]: time="2025-09-09T06:06:42.591579586Z" level=info msg="CreateContainer within sandbox \"bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 06:06:42.594324 containerd[1949]: time="2025-09-09T06:06:42.594283778Z" level=info msg="Container a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:42.596903 containerd[1949]: time="2025-09-09T06:06:42.596889481Z" level=info msg="CreateContainer within sandbox \"bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\"" Sep 9 06:06:42.597107 containerd[1949]: time="2025-09-09T06:06:42.597092982Z" level=info msg="StartContainer for \"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\"" Sep 9 06:06:42.597824 containerd[1949]: 
time="2025-09-09T06:06:42.597781842Z" level=info msg="connecting to shim a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2" address="unix:///run/containerd/s/968f05eb84d2c96edeceba458aa21e1945fd99749455edaa9f04278e4367be07" protocol=ttrpc version=3 Sep 9 06:06:42.622855 systemd[1]: Started cri-containerd-a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2.scope - libcontainer container a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2. Sep 9 06:06:42.657741 containerd[1949]: time="2025-09-09T06:06:42.657715394Z" level=info msg="StartContainer for \"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" returns successfully" Sep 9 06:06:42.795064 kubelet[3324]: I0909 06:06:42.794904 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-55f466fc98-st25q" podStartSLOduration=25.895374902 podStartE2EDuration="31.794848874s" podCreationTimestamp="2025-09-09 06:06:11 +0000 UTC" firstStartedPulling="2025-09-09 06:06:36.689055639 +0000 UTC m=+40.167700698" lastFinishedPulling="2025-09-09 06:06:42.58852961 +0000 UTC m=+46.067174670" observedRunningTime="2025-09-09 06:06:42.791994331 +0000 UTC m=+46.270639473" watchObservedRunningTime="2025-09-09 06:06:42.794848874 +0000 UTC m=+46.273493994" Sep 9 06:06:43.870388 containerd[1949]: time="2025-09-09T06:06:43.870364020Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"a03dc7640ad55a8f2f60bf0f4a23974e0e1caa371aeb3d43934fe3d6aaf16857\" pid:6676 exited_at:{seconds:1757398003 nanos:870208972}" Sep 9 06:06:44.296333 kubelet[3324]: I0909 06:06:44.296252 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 06:06:44.318475 containerd[1949]: time="2025-09-09T06:06:44.318452852Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 9 06:06:44.318728 containerd[1949]: time="2025-09-09T06:06:44.318666843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 06:06:44.319055 containerd[1949]: time="2025-09-09T06:06:44.319033454Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:44.319862 containerd[1949]: time="2025-09-09T06:06:44.319816211Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:44.320215 containerd[1949]: time="2025-09-09T06:06:44.320174854Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.731584407s" Sep 9 06:06:44.320215 containerd[1949]: time="2025-09-09T06:06:44.320189824Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 06:06:44.321151 containerd[1949]: time="2025-09-09T06:06:44.321108703Z" level=info msg="CreateContainer within sandbox \"1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 06:06:44.324634 containerd[1949]: time="2025-09-09T06:06:44.324622092Z" level=info msg="Container 3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:44.328068 containerd[1949]: time="2025-09-09T06:06:44.328053700Z" level=info 
msg="CreateContainer within sandbox \"1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c\"" Sep 9 06:06:44.328304 containerd[1949]: time="2025-09-09T06:06:44.328291641Z" level=info msg="StartContainer for \"3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c\"" Sep 9 06:06:44.329372 containerd[1949]: time="2025-09-09T06:06:44.329360708Z" level=info msg="connecting to shim 3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c" address="unix:///run/containerd/s/e3233342d26cd80db6f2a6080b6dc1bb33eb180462aa1191d313e557d97b2725" protocol=ttrpc version=3 Sep 9 06:06:44.344864 systemd[1]: Started cri-containerd-3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c.scope - libcontainer container 3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c. Sep 9 06:06:44.348420 containerd[1949]: time="2025-09-09T06:06:44.348396507Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"ae72bb40780d2e07d4d7fc9b4f3682580f924e761e97e3b18ad94895279ed7ca\" pid:6711 exit_status:1 exited_at:{seconds:1757398004 nanos:348117267}" Sep 9 06:06:44.376645 containerd[1949]: time="2025-09-09T06:06:44.376624708Z" level=info msg="StartContainer for \"3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c\" returns successfully" Sep 9 06:06:44.377278 containerd[1949]: time="2025-09-09T06:06:44.377263519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 06:06:44.402099 containerd[1949]: time="2025-09-09T06:06:44.402071097Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"f7ec64b2c0ba1c9352890a1153c194b29fa1f80d5c25f6b16adac75664f8bad1\" pid:6764 exit_status:1 
exited_at:{seconds:1757398004 nanos:401889892}" Sep 9 06:06:46.346449 containerd[1949]: time="2025-09-09T06:06:46.346422875Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:46.346781 containerd[1949]: time="2025-09-09T06:06:46.346660376Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 06:06:46.347273 containerd[1949]: time="2025-09-09T06:06:46.347251573Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:46.348945 containerd[1949]: time="2025-09-09T06:06:46.348930133Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 06:06:46.349324 containerd[1949]: time="2025-09-09T06:06:46.349309267Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.972024458s" Sep 9 06:06:46.349370 containerd[1949]: time="2025-09-09T06:06:46.349325693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 06:06:46.350432 containerd[1949]: time="2025-09-09T06:06:46.350420172Z" level=info msg="CreateContainer within sandbox \"1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220\" for 
container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 06:06:46.353433 containerd[1949]: time="2025-09-09T06:06:46.353420169Z" level=info msg="Container 3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd: CDI devices from CRI Config.CDIDevices: []" Sep 9 06:06:46.357154 containerd[1949]: time="2025-09-09T06:06:46.357112156Z" level=info msg="CreateContainer within sandbox \"1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd\"" Sep 9 06:06:46.357406 containerd[1949]: time="2025-09-09T06:06:46.357327822Z" level=info msg="StartContainer for \"3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd\"" Sep 9 06:06:46.358147 containerd[1949]: time="2025-09-09T06:06:46.358125402Z" level=info msg="connecting to shim 3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd" address="unix:///run/containerd/s/e3233342d26cd80db6f2a6080b6dc1bb33eb180462aa1191d313e557d97b2725" protocol=ttrpc version=3 Sep 9 06:06:46.382175 systemd[1]: Started cri-containerd-3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd.scope - libcontainer container 3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd. 
Sep 9 06:06:46.413383 containerd[1949]: time="2025-09-09T06:06:46.413357006Z" level=info msg="StartContainer for \"3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd\" returns successfully" Sep 9 06:06:46.613321 kubelet[3324]: I0909 06:06:46.613265 3324 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 06:06:46.614314 kubelet[3324]: I0909 06:06:46.613376 3324 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 06:06:46.821364 kubelet[3324]: I0909 06:06:46.821213 3324 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-lwj49" podStartSLOduration=26.275983905 podStartE2EDuration="35.821162224s" podCreationTimestamp="2025-09-09 06:06:11 +0000 UTC" firstStartedPulling="2025-09-09 06:06:36.804492302 +0000 UTC m=+40.283137362" lastFinishedPulling="2025-09-09 06:06:46.34967062 +0000 UTC m=+49.828315681" observedRunningTime="2025-09-09 06:06:46.821134971 +0000 UTC m=+50.299780154" watchObservedRunningTime="2025-09-09 06:06:46.821162224 +0000 UTC m=+50.299807385" Sep 9 06:06:56.436324 containerd[1949]: time="2025-09-09T06:06:56.436191191Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"c7abf1975941f2afeeff86c1f78273a3567ed1fc1a7feec9d541c26b123ac565\" pid:6861 exited_at:{seconds:1757398016 nanos:435927452}" Sep 9 06:07:00.760073 containerd[1949]: time="2025-09-09T06:07:00.760018177Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"7476df7eec278b6bb611c8864f4d464de8798ffd3aa20856f8e4e91ad112a32b\" pid:6894 exited_at:{seconds:1757398020 nanos:759753702}" Sep 9 06:07:12.193331 kubelet[3324]: I0909 
06:07:12.193210 3324 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 06:07:13.874839 containerd[1949]: time="2025-09-09T06:07:13.874809704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"2e1e510ffbf4e820757fd7ff9eec7c3339bae0950afde91d592d5df926ce2182\" pid:6934 exited_at:{seconds:1757398033 nanos:874644190}" Sep 9 06:07:14.408885 containerd[1949]: time="2025-09-09T06:07:14.408850623Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"d1013f5239f78c31bf2b591df257c7d0e5d93e3f766c1cc3617b83b650b2b078\" pid:6956 exited_at:{seconds:1757398034 nanos:408603094}" Sep 9 06:07:25.881690 containerd[1949]: time="2025-09-09T06:07:25.881651907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"9d71260204d2cd40cc1f6898405cae70c28b96e97c451f9b4e33937f7ef70f1d\" pid:6998 exited_at:{seconds:1757398045 nanos:881446854}" Sep 9 06:07:30.766195 containerd[1949]: time="2025-09-09T06:07:30.766170091Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"c3472ab4179d58898c8a4db4f770a6259ce3767a50be2c58d956565404478551\" pid:7032 exited_at:{seconds:1757398050 nanos:765992894}" Sep 9 06:07:43.855867 containerd[1949]: time="2025-09-09T06:07:43.855838233Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"84e9b16cdd3ad97f843d7cd0248ee901ddc1ba61fdbd2cb59f49025073265d6c\" pid:7073 exited_at:{seconds:1757398063 nanos:855679468}" Sep 9 06:07:44.419597 containerd[1949]: time="2025-09-09T06:07:44.419565685Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"af5f55185851094b23212826966fdeba7296fbd7743c23fb8e94be115c41531d\" pid:7095 exited_at:{seconds:1757398064 nanos:419317834}" Sep 9 06:07:56.479015 containerd[1949]: time="2025-09-09T06:07:56.478987096Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"338adc7c1fc96537c036fb3763f4141c86d024d1d5d38ff2318e53d338323cba\" pid:7131 exited_at:{seconds:1757398076 nanos:478853585}" Sep 9 06:08:00.758084 containerd[1949]: time="2025-09-09T06:08:00.758020664Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"f528d10ae22a134d29750bb93559a0a45ff5da5e97168bf96140c50ae0a8e73c\" pid:7164 exited_at:{seconds:1757398080 nanos:757796379}" Sep 9 06:08:13.831605 containerd[1949]: time="2025-09-09T06:08:13.831568496Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"c95713bbcaf5ea97a2fdef9bf7b38b4c352da25df07db1c7d47e592b98ed69e9\" pid:7208 exited_at:{seconds:1757398093 nanos:831413371}" Sep 9 06:08:14.416900 containerd[1949]: time="2025-09-09T06:08:14.416873144Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"bfdc3d840527adc5ff595b5b1dc7b76e12090b365ba4ef5064742ea198f277e8\" pid:7230 exited_at:{seconds:1757398094 nanos:416661582}" Sep 9 06:08:25.882497 containerd[1949]: time="2025-09-09T06:08:25.882467128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"3e07e79a0a28aff37a0b9f50053753137bf9c745b67cd93485bccd5e4888141d\" pid:7277 exited_at:{seconds:1757398105 nanos:882153973}" Sep 9 06:08:30.760650 containerd[1949]: time="2025-09-09T06:08:30.760618632Z" 
level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"4160242873652765dcba1092df9766366bf5c390a3f549e1d571dd30e312fcf7\" pid:7312 exited_at:{seconds:1757398110 nanos:760303594}" Sep 9 06:08:43.823085 containerd[1949]: time="2025-09-09T06:08:43.823046284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"984fc147903a11c3e9768d93cc52391bae5748b2df426d281f6ef6d55c22b64f\" pid:7349 exited_at:{seconds:1757398123 nanos:822894577}" Sep 9 06:08:44.419681 containerd[1949]: time="2025-09-09T06:08:44.419646284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"4f1ca50db277bf669d83b5919f7dab3efaf1cc9ba5c12d88b16913d2318f7cd5\" pid:7371 exited_at:{seconds:1757398124 nanos:419379087}" Sep 9 06:08:56.453469 containerd[1949]: time="2025-09-09T06:08:56.453434027Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"234b9de0dc34969696f11828c8b99a959c4a88bac7c5caa89ac4bbbda96ab40d\" pid:7406 exited_at:{seconds:1757398136 nanos:453267065}" Sep 9 06:09:00.787355 containerd[1949]: time="2025-09-09T06:09:00.787274607Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"bdac618d87cad0630cd45af786b4d46f336b09e04f073c00d07b7fbacc7be6a3\" pid:7429 exited_at:{seconds:1757398140 nanos:787032341}" Sep 9 06:09:13.823049 containerd[1949]: time="2025-09-09T06:09:13.823025496Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"22ddec5675eead473215588eebcaa02fff6cdf4427fa5abdad07a5bb24e19a68\" pid:7470 exited_at:{seconds:1757398153 nanos:822920565}" Sep 9 06:09:14.464471 
containerd[1949]: time="2025-09-09T06:09:14.464422563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"09223d8d0635248dea87bda20f35c1f34c712db4d3a0b58e7a4b92af0dec438d\" pid:7491 exited_at:{seconds:1757398154 nanos:464164693}" Sep 9 06:09:25.894050 containerd[1949]: time="2025-09-09T06:09:25.894018055Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"2a5a0a873e5a611b2a482c749c69ed8a1d26c92c33060f14578721a4dd4779b6\" pid:7533 exited_at:{seconds:1757398165 nanos:893826898}" Sep 9 06:09:30.801568 containerd[1949]: time="2025-09-09T06:09:30.801537774Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"f13e6efe45c74d6dae2fe268dec762ac645cde1730e792138a7bd682a24a8589\" pid:7565 exited_at:{seconds:1757398170 nanos:801299738}" Sep 9 06:09:43.853771 containerd[1949]: time="2025-09-09T06:09:43.853743405Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"f760f83dc6e72e64b4d68178b86f9c2e3d99bb62be41e945527adf6944cfdab6\" pid:7602 exited_at:{seconds:1757398183 nanos:853482173}" Sep 9 06:09:44.412604 containerd[1949]: time="2025-09-09T06:09:44.412545030Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"7e50d8e8b294cf27fb82586131f9fff6ca6e5f0b7f20f0cab2b8db8fb0a61165\" pid:7624 exited_at:{seconds:1757398184 nanos:412326805}" Sep 9 06:09:56.432695 containerd[1949]: time="2025-09-09T06:09:56.432634976Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"b5adaa801b39469951545bcf93e2ea3c61f15abd47035ab8acffb10164889416\" pid:7680 
exited_at:{seconds:1757398196 nanos:432539548}" Sep 9 06:10:00.759867 containerd[1949]: time="2025-09-09T06:10:00.759815897Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"aeb774c66bfd092eaf85e3c9c78251747de70d95843e2032bf51a78b9b131557\" pid:7704 exited_at:{seconds:1757398200 nanos:759621685}" Sep 9 06:10:13.818005 containerd[1949]: time="2025-09-09T06:10:13.817981253Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"d01f0c4edb8b40233083a2d08ed901326a1039369324b05df367b8643f76f6b9\" pid:7740 exited_at:{seconds:1757398213 nanos:817865815}" Sep 9 06:10:14.415928 containerd[1949]: time="2025-09-09T06:10:14.415894459Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"6cabd7cf95aec8390014971c83a6bedcac627deeb468101381d787723930e9ca\" pid:7762 exited_at:{seconds:1757398214 nanos:415607385}" Sep 9 06:10:25.930031 containerd[1949]: time="2025-09-09T06:10:25.930002582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"9a67a6ac77359c3d452b99640c259dc3066ea05ee66a4fa11156caead1b4d9a1\" pid:7797 exited_at:{seconds:1757398225 nanos:929790324}" Sep 9 06:10:30.770337 containerd[1949]: time="2025-09-09T06:10:30.770305306Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"6bc0745c535c3ba0c7284768e7cb87096c3aa488626796c86ed0d9a75d1a992f\" pid:7831 exited_at:{seconds:1757398230 nanos:769914671}" Sep 9 06:10:43.821851 containerd[1949]: time="2025-09-09T06:10:43.821819694Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" 
id:\"0cdf61f69256bace7176ebf0d9d994efbd0120decbf64ca681909479c0be73c3\" pid:7872 exited_at:{seconds:1757398243 nanos:821647347}" Sep 9 06:10:44.416147 containerd[1949]: time="2025-09-09T06:10:44.416121120Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"608781b478536ec800c2dd463fcfc1ae2807a3f91eb241cf24c081af90261187\" pid:7894 exited_at:{seconds:1757398244 nanos:415942679}" Sep 9 06:10:52.816745 containerd[1949]: time="2025-09-09T06:10:52.816543824Z" level=warning msg="container event discarded" container=b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c type=CONTAINER_CREATED_EVENT Sep 9 06:10:52.816745 containerd[1949]: time="2025-09-09T06:10:52.816702172Z" level=warning msg="container event discarded" container=b0a79de09ef7e5b8bde34628385ef8d9b2e42165d1427a6e27762d5c157b6a0c type=CONTAINER_STARTED_EVENT Sep 9 06:10:52.832218 containerd[1949]: time="2025-09-09T06:10:52.832060481Z" level=warning msg="container event discarded" container=fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f type=CONTAINER_CREATED_EVENT Sep 9 06:10:52.832577 containerd[1949]: time="2025-09-09T06:10:52.832207012Z" level=warning msg="container event discarded" container=fe93835923401de6170d90f59480e85edf54b6f3071bc5250fd9d226182a6c5f type=CONTAINER_STARTED_EVENT Sep 9 06:10:52.832577 containerd[1949]: time="2025-09-09T06:10:52.832271439Z" level=warning msg="container event discarded" container=33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8 type=CONTAINER_CREATED_EVENT Sep 9 06:10:52.832577 containerd[1949]: time="2025-09-09T06:10:52.832315831Z" level=warning msg="container event discarded" container=92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9 type=CONTAINER_CREATED_EVENT Sep 9 06:10:52.849848 containerd[1949]: time="2025-09-09T06:10:52.849752531Z" level=warning msg="container event discarded" 
container=dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2 type=CONTAINER_CREATED_EVENT Sep 9 06:10:52.849848 containerd[1949]: time="2025-09-09T06:10:52.849822644Z" level=warning msg="container event discarded" container=dce2a7f1d89f089e34d2dc006c8132c6582c41afd59bc03f0ab84753c6b1f1c2 type=CONTAINER_STARTED_EVENT Sep 9 06:10:52.849848 containerd[1949]: time="2025-09-09T06:10:52.849852838Z" level=warning msg="container event discarded" container=f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9 type=CONTAINER_CREATED_EVENT Sep 9 06:10:52.877437 containerd[1949]: time="2025-09-09T06:10:52.877293124Z" level=warning msg="container event discarded" container=33e0f14e6989fdb3b8afb69e25ece44f59112fdecbc8fa6b3cc5402de6c7c1e8 type=CONTAINER_STARTED_EVENT Sep 9 06:10:52.877437 containerd[1949]: time="2025-09-09T06:10:52.877374826Z" level=warning msg="container event discarded" container=92cad8226cd9bd8bb019d01920dcaa6b3642e4fc4cd83641036fb747b7a354e9 type=CONTAINER_STARTED_EVENT Sep 9 06:10:52.893948 containerd[1949]: time="2025-09-09T06:10:52.893821976Z" level=warning msg="container event discarded" container=f803af0cf509a455180687c66a512fd85b50e93a748ef41718b4784867f2a3c9 type=CONTAINER_STARTED_EVENT Sep 9 06:10:56.438183 containerd[1949]: time="2025-09-09T06:10:56.438160269Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"f20a52cde3aa40ee690890b739ac49976b844c40a9c5dfcf3c234f24f39b16ce\" pid:7925 exited_at:{seconds:1757398256 nanos:438035293}" Sep 9 06:11:00.814200 containerd[1949]: time="2025-09-09T06:11:00.814167054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"88503560f17f9cefcc87db9a091736c3e9856bedc5c0b6222d54c7e76234543c\" pid:7950 exited_at:{seconds:1757398260 nanos:813932459}" Sep 9 06:11:03.013560 containerd[1949]: 
time="2025-09-09T06:11:03.013353055Z" level=warning msg="container event discarded" container=783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73 type=CONTAINER_CREATED_EVENT Sep 9 06:11:03.013560 containerd[1949]: time="2025-09-09T06:11:03.013509479Z" level=warning msg="container event discarded" container=783ba7c780c5b86e0320980a2629a685424af7a9a734c41f0db16ae8821fac73 type=CONTAINER_STARTED_EVENT Sep 9 06:11:03.263218 containerd[1949]: time="2025-09-09T06:11:03.263064021Z" level=warning msg="container event discarded" container=f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1 type=CONTAINER_CREATED_EVENT Sep 9 06:11:03.263218 containerd[1949]: time="2025-09-09T06:11:03.263161363Z" level=warning msg="container event discarded" container=f98dda3dabc251500bb38f1defd1a111316b5a6df59161b9b1159db943550cc1 type=CONTAINER_STARTED_EVENT Sep 9 06:11:03.263218 containerd[1949]: time="2025-09-09T06:11:03.263195744Z" level=warning msg="container event discarded" container=1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946 type=CONTAINER_CREATED_EVENT Sep 9 06:11:03.324871 containerd[1949]: time="2025-09-09T06:11:03.324771187Z" level=warning msg="container event discarded" container=1b9ca5cc8c86a50a446bde08532dc5125b44417f263144539142e90330a8c946 type=CONTAINER_STARTED_EVENT Sep 9 06:11:04.705809 containerd[1949]: time="2025-09-09T06:11:04.705656020Z" level=warning msg="container event discarded" container=e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330 type=CONTAINER_CREATED_EVENT Sep 9 06:11:04.736333 containerd[1949]: time="2025-09-09T06:11:04.736175990Z" level=warning msg="container event discarded" container=e278b34b13f2c738c367f392bd580e1fa3ee021cb531375c60ae679e1eb20330 type=CONTAINER_STARTED_EVENT Sep 9 06:11:11.564405 containerd[1949]: time="2025-09-09T06:11:11.564238025Z" level=warning msg="container event discarded" container=f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d 
type=CONTAINER_CREATED_EVENT Sep 9 06:11:11.564405 containerd[1949]: time="2025-09-09T06:11:11.564327357Z" level=warning msg="container event discarded" container=f7c644b2039b06d604ec1a4d26f10bfa5a7f4590bdf689ed965d29800dd86a7d type=CONTAINER_STARTED_EVENT Sep 9 06:11:11.895023 containerd[1949]: time="2025-09-09T06:11:11.894852629Z" level=warning msg="container event discarded" container=e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1 type=CONTAINER_CREATED_EVENT Sep 9 06:11:11.895023 containerd[1949]: time="2025-09-09T06:11:11.894961358Z" level=warning msg="container event discarded" container=e3b2d9405f9e6366dd1aa725f0df2f3e6d1dd32b893a668e429323c07391d7a1 type=CONTAINER_STARTED_EVENT Sep 9 06:11:13.816490 containerd[1949]: time="2025-09-09T06:11:13.816229515Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"1b4e2e5b2f620881cd95e63eeee144be9b90c648dbc4348cbec1cb81599cf233\" pid:7988 exited_at:{seconds:1757398273 nanos:816088018}" Sep 9 06:11:14.340019 containerd[1949]: time="2025-09-09T06:11:14.339846527Z" level=warning msg="container event discarded" container=ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17 type=CONTAINER_CREATED_EVENT Sep 9 06:11:14.380651 containerd[1949]: time="2025-09-09T06:11:14.380618714Z" level=warning msg="container event discarded" container=ed9cca8ae3bb831591c809e5fb7d4e3fe9d5200da1fe0ca2d8fa4812a2bdcf17 type=CONTAINER_STARTED_EVENT Sep 9 06:11:14.405730 containerd[1949]: time="2025-09-09T06:11:14.405699895Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"f598eded9c3520d3acb5e691e455c9b41287c29e8c19c571215c4758be88a399\" pid:8009 exited_at:{seconds:1757398274 nanos:405435286}" Sep 9 06:11:16.052659 containerd[1949]: time="2025-09-09T06:11:16.052499395Z" level=warning msg="container event discarded" 
container=a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b type=CONTAINER_CREATED_EVENT Sep 9 06:11:16.085015 containerd[1949]: time="2025-09-09T06:11:16.084835511Z" level=warning msg="container event discarded" container=a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b type=CONTAINER_STARTED_EVENT Sep 9 06:11:17.113884 containerd[1949]: time="2025-09-09T06:11:17.113701959Z" level=warning msg="container event discarded" container=a37e6023a92abe50091d464c72cdecdad0dcead8aafb54b8f9da2048a2c9e72b type=CONTAINER_STOPPED_EVENT Sep 9 06:11:20.773661 containerd[1949]: time="2025-09-09T06:11:20.773407525Z" level=warning msg="container event discarded" container=9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224 type=CONTAINER_CREATED_EVENT Sep 9 06:11:20.812116 containerd[1949]: time="2025-09-09T06:11:20.811929084Z" level=warning msg="container event discarded" container=9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224 type=CONTAINER_STARTED_EVENT Sep 9 06:11:21.731874 containerd[1949]: time="2025-09-09T06:11:21.731699002Z" level=warning msg="container event discarded" container=9e49c92f89d16f83b61e3999a017734d3886a2225ffb6fb8a47be5d4f4d7e224 type=CONTAINER_STOPPED_EVENT Sep 9 06:11:25.888573 containerd[1949]: time="2025-09-09T06:11:25.888544070Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"610494160c55be4af5f24158354f883f2868c9b1fc107c7e39fef632ca2d27a4\" pid:8068 exited_at:{seconds:1757398285 nanos:888229746}" Sep 9 06:11:27.846273 containerd[1949]: time="2025-09-09T06:11:27.846084130Z" level=warning msg="container event discarded" container=c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a type=CONTAINER_CREATED_EVENT Sep 9 06:11:27.885618 containerd[1949]: time="2025-09-09T06:11:27.885485991Z" level=warning msg="container event discarded" 
container=c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a type=CONTAINER_STARTED_EVENT Sep 9 06:11:29.265261 containerd[1949]: time="2025-09-09T06:11:29.265120433Z" level=warning msg="container event discarded" container=e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7 type=CONTAINER_CREATED_EVENT Sep 9 06:11:29.265261 containerd[1949]: time="2025-09-09T06:11:29.265204882Z" level=warning msg="container event discarded" container=e3daafc9ec3bd3ce9e8bc22d0142874865c5ece2d5a7afe4d015a127a86feaa7 type=CONTAINER_STARTED_EVENT Sep 9 06:11:30.763462 containerd[1949]: time="2025-09-09T06:11:30.763430526Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"f54bc4327453f7e5dd58a6de1d40585fb66fefb8739621ab15d69044afdf806e\" pid:8101 exited_at:{seconds:1757398290 nanos:763157300}" Sep 9 06:11:31.357920 containerd[1949]: time="2025-09-09T06:11:31.357758402Z" level=warning msg="container event discarded" container=d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7 type=CONTAINER_CREATED_EVENT Sep 9 06:11:31.408313 containerd[1949]: time="2025-09-09T06:11:31.408174624Z" level=warning msg="container event discarded" container=d0b603328226b89670201c894a2de07aa15769f9b6dcf04d364deb45d37954b7 type=CONTAINER_STARTED_EVENT Sep 9 06:11:32.709722 containerd[1949]: time="2025-09-09T06:11:32.709539036Z" level=warning msg="container event discarded" container=31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a type=CONTAINER_CREATED_EVENT Sep 9 06:11:32.709722 containerd[1949]: time="2025-09-09T06:11:32.709655375Z" level=warning msg="container event discarded" container=31e31a1ad7a89cc1781251ebcc63d7cba5453e977a8d3192b62b4a38f9f80b0a type=CONTAINER_STARTED_EVENT Sep 9 06:11:32.797220 containerd[1949]: time="2025-09-09T06:11:32.797081566Z" level=warning msg="container event discarded" 
container=e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0 type=CONTAINER_CREATED_EVENT Sep 9 06:11:32.797220 containerd[1949]: time="2025-09-09T06:11:32.797185101Z" level=warning msg="container event discarded" container=e6ac0975fb5d48f6e9761eaf064a1942a4a9e87617ce66f51a5d233bf8f94ee0 type=CONTAINER_STARTED_EVENT Sep 9 06:11:32.797220 containerd[1949]: time="2025-09-09T06:11:32.797214437Z" level=warning msg="container event discarded" container=ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1 type=CONTAINER_CREATED_EVENT Sep 9 06:11:32.829778 containerd[1949]: time="2025-09-09T06:11:32.829624633Z" level=warning msg="container event discarded" container=ab3e5a29a811449984bb91dd00e5f88fe8acbd8bb5a1bb52dfb906b1b25e26a1 type=CONTAINER_STARTED_EVENT Sep 9 06:11:32.909182 containerd[1949]: time="2025-09-09T06:11:32.909041561Z" level=warning msg="container event discarded" container=b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed type=CONTAINER_CREATED_EVENT Sep 9 06:11:32.909182 containerd[1949]: time="2025-09-09T06:11:32.909154235Z" level=warning msg="container event discarded" container=b528795d27d7fc25a9f07444bc5d8d9dd4690fc010a5ffc743520f9bebd372ed type=CONTAINER_STARTED_EVENT Sep 9 06:11:33.691557 containerd[1949]: time="2025-09-09T06:11:33.691429056Z" level=warning msg="container event discarded" container=4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db type=CONTAINER_CREATED_EVENT Sep 9 06:11:33.691557 containerd[1949]: time="2025-09-09T06:11:33.691540934Z" level=warning msg="container event discarded" container=4b8ee75b61c0bddf1327e1db1e352d226292fef84ef35d1803a9a01fe3b6b6db type=CONTAINER_STARTED_EVENT Sep 9 06:11:33.812268 containerd[1949]: time="2025-09-09T06:11:33.812110926Z" level=warning msg="container event discarded" container=742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06 type=CONTAINER_CREATED_EVENT Sep 9 06:11:33.862754 containerd[1949]: 
time="2025-09-09T06:11:33.862598854Z" level=warning msg="container event discarded" container=742c32e694c463ea19878e71adf5f530bf081bd2ccb24c278c4c28c25aea7f06 type=CONTAINER_STARTED_EVENT Sep 9 06:11:35.729323 containerd[1949]: time="2025-09-09T06:11:35.729162952Z" level=warning msg="container event discarded" container=a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547 type=CONTAINER_CREATED_EVENT Sep 9 06:11:35.729323 containerd[1949]: time="2025-09-09T06:11:35.729255588Z" level=warning msg="container event discarded" container=a6c5f21266c39a6512e8dda204bc2eeae29e27ffbcd613164938d7df10900547 type=CONTAINER_STARTED_EVENT Sep 9 06:11:35.729323 containerd[1949]: time="2025-09-09T06:11:35.729284278Z" level=warning msg="container event discarded" container=0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c type=CONTAINER_CREATED_EVENT Sep 9 06:11:35.768857 containerd[1949]: time="2025-09-09T06:11:35.768740582Z" level=warning msg="container event discarded" container=0306f171a2e4a42b2ce4f43bbd7cccfd593cd02a0cdce1be826358993ae6183c type=CONTAINER_STARTED_EVENT Sep 9 06:11:36.614848 containerd[1949]: time="2025-09-09T06:11:36.614758553Z" level=warning msg="container event discarded" container=33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f type=CONTAINER_CREATED_EVENT Sep 9 06:11:36.673545 containerd[1949]: time="2025-09-09T06:11:36.673461539Z" level=warning msg="container event discarded" container=33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f type=CONTAINER_STARTED_EVENT Sep 9 06:11:36.699158 containerd[1949]: time="2025-09-09T06:11:36.698988930Z" level=warning msg="container event discarded" container=bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18 type=CONTAINER_CREATED_EVENT Sep 9 06:11:36.699158 containerd[1949]: time="2025-09-09T06:11:36.699096746Z" level=warning msg="container event discarded" container=bca8a831dca20e56079e716335fd148b5db8e5dc78a882b274d708ebddef9e18 
type=CONTAINER_STARTED_EVENT Sep 9 06:11:36.814961 containerd[1949]: time="2025-09-09T06:11:36.814777336Z" level=warning msg="container event discarded" container=1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220 type=CONTAINER_CREATED_EVENT Sep 9 06:11:36.814961 containerd[1949]: time="2025-09-09T06:11:36.814893794Z" level=warning msg="container event discarded" container=1a13531062e24ef2289a4edfaebfa5048ec09f968c0199edd1c59ad0a8298220 type=CONTAINER_STARTED_EVENT Sep 9 06:11:39.339530 containerd[1949]: time="2025-09-09T06:11:39.339315541Z" level=warning msg="container event discarded" container=daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49 type=CONTAINER_CREATED_EVENT Sep 9 06:11:39.396980 containerd[1949]: time="2025-09-09T06:11:39.396880213Z" level=warning msg="container event discarded" container=daafdf58aed9601410cedde9cff1130dce9920b3e279e4a428ccedfa3f0ffc49 type=CONTAINER_STARTED_EVENT Sep 9 06:11:39.815715 containerd[1949]: time="2025-09-09T06:11:39.815523744Z" level=warning msg="container event discarded" container=448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac type=CONTAINER_CREATED_EVENT Sep 9 06:11:39.864077 containerd[1949]: time="2025-09-09T06:11:39.863941387Z" level=warning msg="container event discarded" container=448be205cb574623b819653f706e5640b6c4430c035799f911bcc52d3ff028ac type=CONTAINER_STARTED_EVENT Sep 9 06:11:42.606936 containerd[1949]: time="2025-09-09T06:11:42.606787574Z" level=warning msg="container event discarded" container=a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2 type=CONTAINER_CREATED_EVENT Sep 9 06:11:42.668443 containerd[1949]: time="2025-09-09T06:11:42.668274974Z" level=warning msg="container event discarded" container=a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2 type=CONTAINER_STARTED_EVENT Sep 9 06:11:43.878204 containerd[1949]: time="2025-09-09T06:11:43.878172990Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"8cf55d3edd7362ac141f0ad6f6e41d9e7f905aabe99a01f7f21bbbc9f0bb4f50\" pid:8138 exited_at:{seconds:1757398303 nanos:878005930}" Sep 9 06:11:44.338025 containerd[1949]: time="2025-09-09T06:11:44.337741526Z" level=warning msg="container event discarded" container=3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c type=CONTAINER_CREATED_EVENT Sep 9 06:11:44.386818 containerd[1949]: time="2025-09-09T06:11:44.386653132Z" level=warning msg="container event discarded" container=3d7aa8594563600039429c120eea528631caac7da066a60d694d0904efbeaa0c type=CONTAINER_STARTED_EVENT Sep 9 06:11:44.426081 containerd[1949]: time="2025-09-09T06:11:44.426016731Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"67ff4f72d34d52c1263db0148de5b0402821978526d354b7cd740821dd62b6b2\" pid:8160 exited_at:{seconds:1757398304 nanos:425750964}" Sep 9 06:11:46.367474 containerd[1949]: time="2025-09-09T06:11:46.367328289Z" level=warning msg="container event discarded" container=3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd type=CONTAINER_CREATED_EVENT Sep 9 06:11:46.422775 containerd[1949]: time="2025-09-09T06:11:46.422708287Z" level=warning msg="container event discarded" container=3162059ac59f4c680c5881e3e3afc46d53edd47b7d5fb1dfa32f5808bbc9decd type=CONTAINER_STARTED_EVENT Sep 9 06:11:56.433879 containerd[1949]: time="2025-09-09T06:11:56.433852202Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"ad9a5cfe2201faa07db624451023e9768443564082375a93bd24d014667bd216\" pid:8194 exited_at:{seconds:1757398316 nanos:433657349}" Sep 9 06:12:00.754604 containerd[1949]: time="2025-09-09T06:12:00.754542725Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"0b0c1b61835f5cef66b43373305f1c2db500d75bf1633690937e9b51a520c53c\" pid:8225 exited_at:{seconds:1757398320 nanos:754292093}" Sep 9 06:12:04.932111 systemd[1]: Started sshd@9-139.178.90.255:22-139.178.89.65:39186.service - OpenSSH per-connection server daemon (139.178.89.65:39186). Sep 9 06:12:04.999173 sshd[8255]: Accepted publickey for core from 139.178.89.65 port 39186 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:12:05.000309 sshd-session[8255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:12:05.005286 systemd-logind[1938]: New session 12 of user core. Sep 9 06:12:05.017864 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 9 06:12:05.147676 sshd[8258]: Connection closed by 139.178.89.65 port 39186 Sep 9 06:12:05.147876 sshd-session[8255]: pam_unix(sshd:session): session closed for user core Sep 9 06:12:05.149713 systemd[1]: sshd@9-139.178.90.255:22-139.178.89.65:39186.service: Deactivated successfully. Sep 9 06:12:05.150685 systemd[1]: session-12.scope: Deactivated successfully. Sep 9 06:12:05.151411 systemd-logind[1938]: Session 12 logged out. Waiting for processes to exit. Sep 9 06:12:05.152031 systemd-logind[1938]: Removed session 12. Sep 9 06:12:10.179772 systemd[1]: Started sshd@10-139.178.90.255:22-139.178.89.65:35580.service - OpenSSH per-connection server daemon (139.178.89.65:35580). Sep 9 06:12:10.262685 sshd[8292]: Accepted publickey for core from 139.178.89.65 port 35580 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:12:10.263792 sshd-session[8292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:12:10.268310 systemd-logind[1938]: New session 13 of user core. Sep 9 06:12:10.284862 systemd[1]: Started session-13.scope - Session 13 of User core. 
Sep 9 06:12:10.369198 sshd[8295]: Connection closed by 139.178.89.65 port 35580 Sep 9 06:12:10.369399 sshd-session[8292]: pam_unix(sshd:session): session closed for user core Sep 9 06:12:10.371073 systemd[1]: sshd@10-139.178.90.255:22-139.178.89.65:35580.service: Deactivated successfully. Sep 9 06:12:10.372105 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 06:12:10.372812 systemd-logind[1938]: Session 13 logged out. Waiting for processes to exit. Sep 9 06:12:10.373452 systemd-logind[1938]: Removed session 13. Sep 9 06:12:13.812883 containerd[1949]: time="2025-09-09T06:12:13.812852761Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"652b425cade19c4c6a30b241b5369f45a177573a5278fb42056f25ef8517b3c1\" pid:8331 exited_at:{seconds:1757398333 nanos:812665368}" Sep 9 06:12:14.415541 containerd[1949]: time="2025-09-09T06:12:14.415513979Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"02847297cab9c6345a15706bfcce0d872c612668cc809abbaa50ea4fcda8652a\" pid:8353 exited_at:{seconds:1757398334 nanos:415300669}" Sep 9 06:12:15.379014 systemd[1]: Started sshd@11-139.178.90.255:22-139.178.89.65:35586.service - OpenSSH per-connection server daemon (139.178.89.65:35586). Sep 9 06:12:15.418130 sshd[8376]: Accepted publickey for core from 139.178.89.65 port 35586 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:12:15.418787 sshd-session[8376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:12:15.421516 systemd-logind[1938]: New session 14 of user core. Sep 9 06:12:15.427835 systemd[1]: Started session-14.scope - Session 14 of User core. 
Sep 9 06:12:15.516988 sshd[8379]: Connection closed by 139.178.89.65 port 35586 Sep 9 06:12:15.517178 sshd-session[8376]: pam_unix(sshd:session): session closed for user core Sep 9 06:12:15.530698 systemd[1]: sshd@11-139.178.90.255:22-139.178.89.65:35586.service: Deactivated successfully. Sep 9 06:12:15.531612 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 06:12:15.532172 systemd-logind[1938]: Session 14 logged out. Waiting for processes to exit. Sep 9 06:12:15.533242 systemd[1]: Started sshd@12-139.178.90.255:22-139.178.89.65:35594.service - OpenSSH per-connection server daemon (139.178.89.65:35594). Sep 9 06:12:15.533810 systemd-logind[1938]: Removed session 14. Sep 9 06:12:15.570984 sshd[8405]: Accepted publickey for core from 139.178.89.65 port 35594 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI Sep 9 06:12:15.571625 sshd-session[8405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 06:12:15.574486 systemd-logind[1938]: New session 15 of user core. Sep 9 06:12:15.584844 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 06:12:15.686091 sshd[8408]: Connection closed by 139.178.89.65 port 35594 Sep 9 06:12:15.686242 sshd-session[8405]: pam_unix(sshd:session): session closed for user core Sep 9 06:12:15.698895 systemd[1]: sshd@12-139.178.90.255:22-139.178.89.65:35594.service: Deactivated successfully. Sep 9 06:12:15.700075 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 06:12:15.700552 systemd-logind[1938]: Session 15 logged out. Waiting for processes to exit. Sep 9 06:12:15.701793 systemd[1]: Started sshd@13-139.178.90.255:22-139.178.89.65:35604.service - OpenSSH per-connection server daemon (139.178.89.65:35604). Sep 9 06:12:15.702186 systemd-logind[1938]: Removed session 15. 
Sep 9 06:12:15.739939 sshd[8431]: Accepted publickey for core from 139.178.89.65 port 35604 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:15.740596 sshd-session[8431]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:15.743466 systemd-logind[1938]: New session 16 of user core.
Sep 9 06:12:15.757953 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 06:12:15.899336 sshd[8434]: Connection closed by 139.178.89.65 port 35604
Sep 9 06:12:15.899530 sshd-session[8431]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:15.901517 systemd[1]: sshd@13-139.178.90.255:22-139.178.89.65:35604.service: Deactivated successfully.
Sep 9 06:12:15.902493 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 06:12:15.903228 systemd-logind[1938]: Session 16 logged out. Waiting for processes to exit.
Sep 9 06:12:15.903931 systemd-logind[1938]: Removed session 16.
Sep 9 06:12:20.928193 systemd[1]: Started sshd@14-139.178.90.255:22-139.178.89.65:41236.service - OpenSSH per-connection server daemon (139.178.89.65:41236).
Sep 9 06:12:20.980652 sshd[8463]: Accepted publickey for core from 139.178.89.65 port 41236 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:20.981306 sshd-session[8463]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:20.984252 systemd-logind[1938]: New session 17 of user core.
Sep 9 06:12:20.997122 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 06:12:21.092057 sshd[8466]: Connection closed by 139.178.89.65 port 41236
Sep 9 06:12:21.092242 sshd-session[8463]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:21.094200 systemd[1]: sshd@14-139.178.90.255:22-139.178.89.65:41236.service: Deactivated successfully.
Sep 9 06:12:21.095248 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 06:12:21.096196 systemd-logind[1938]: Session 17 logged out. Waiting for processes to exit.
Sep 9 06:12:21.096783 systemd-logind[1938]: Removed session 17.
Sep 9 06:12:25.878677 containerd[1949]: time="2025-09-09T06:12:25.878646201Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"deeb299b9d6c0d48e37870fe5970e15d441c808e55e09a70bd2e7bb5bcbe6e1d\" pid:8500 exited_at:{seconds:1757398345 nanos:878387520}"
Sep 9 06:12:26.121120 systemd[1]: Started sshd@15-139.178.90.255:22-139.178.89.65:41238.service - OpenSSH per-connection server daemon (139.178.89.65:41238).
Sep 9 06:12:26.172789 sshd[8521]: Accepted publickey for core from 139.178.89.65 port 41238 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:26.173415 sshd-session[8521]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:26.176455 systemd-logind[1938]: New session 18 of user core.
Sep 9 06:12:26.192939 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 06:12:26.279660 sshd[8524]: Connection closed by 139.178.89.65 port 41238
Sep 9 06:12:26.279895 sshd-session[8521]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:26.282223 systemd[1]: sshd@15-139.178.90.255:22-139.178.89.65:41238.service: Deactivated successfully.
Sep 9 06:12:26.283227 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 06:12:26.283652 systemd-logind[1938]: Session 18 logged out. Waiting for processes to exit.
Sep 9 06:12:26.284229 systemd-logind[1938]: Removed session 18.
Sep 9 06:12:30.768008 containerd[1949]: time="2025-09-09T06:12:30.767946086Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3ec16c0843a36789b6095577d058e7aae15a768eb5b75ffa8bda962e2c3da3a\" id:\"45d56d0ea26a72d1eb9e01aebbe2476c1e96823a6e39e4fc7fd120ca7b9bddd3\" pid:8562 exited_at:{seconds:1757398350 nanos:767645920}"
Sep 9 06:12:31.299583 systemd[1]: Started sshd@16-139.178.90.255:22-139.178.89.65:49814.service - OpenSSH per-connection server daemon (139.178.89.65:49814).
Sep 9 06:12:31.338072 sshd[8587]: Accepted publickey for core from 139.178.89.65 port 49814 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:31.338784 sshd-session[8587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:31.341666 systemd-logind[1938]: New session 19 of user core.
Sep 9 06:12:31.362830 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 06:12:31.446055 sshd[8590]: Connection closed by 139.178.89.65 port 49814
Sep 9 06:12:31.446262 sshd-session[8587]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:31.448400 systemd[1]: sshd@16-139.178.90.255:22-139.178.89.65:49814.service: Deactivated successfully.
Sep 9 06:12:31.449533 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 06:12:31.450450 systemd-logind[1938]: Session 19 logged out. Waiting for processes to exit.
Sep 9 06:12:31.451360 systemd-logind[1938]: Removed session 19.
Sep 9 06:12:36.476257 systemd[1]: Started sshd@17-139.178.90.255:22-139.178.89.65:49816.service - OpenSSH per-connection server daemon (139.178.89.65:49816).
Sep 9 06:12:36.524117 sshd[8618]: Accepted publickey for core from 139.178.89.65 port 49816 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:36.524755 sshd-session[8618]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:36.527585 systemd-logind[1938]: New session 20 of user core.
Sep 9 06:12:36.549141 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 06:12:36.645476 sshd[8621]: Connection closed by 139.178.89.65 port 49816
Sep 9 06:12:36.645658 sshd-session[8618]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:36.671849 systemd[1]: sshd@17-139.178.90.255:22-139.178.89.65:49816.service: Deactivated successfully.
Sep 9 06:12:36.676173 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 06:12:36.678523 systemd-logind[1938]: Session 20 logged out. Waiting for processes to exit.
Sep 9 06:12:36.685471 systemd[1]: Started sshd@18-139.178.90.255:22-139.178.89.65:49828.service - OpenSSH per-connection server daemon (139.178.89.65:49828).
Sep 9 06:12:36.687415 systemd-logind[1938]: Removed session 20.
Sep 9 06:12:36.789582 sshd[8645]: Accepted publickey for core from 139.178.89.65 port 49828 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:36.790724 sshd-session[8645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:36.794798 systemd-logind[1938]: New session 21 of user core.
Sep 9 06:12:36.815903 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 06:12:37.054948 sshd[8648]: Connection closed by 139.178.89.65 port 49828
Sep 9 06:12:37.055547 sshd-session[8645]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:37.079267 systemd[1]: sshd@18-139.178.90.255:22-139.178.89.65:49828.service: Deactivated successfully.
Sep 9 06:12:37.083504 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 06:12:37.085976 systemd-logind[1938]: Session 21 logged out. Waiting for processes to exit.
Sep 9 06:12:37.092084 systemd[1]: Started sshd@19-139.178.90.255:22-139.178.89.65:49830.service - OpenSSH per-connection server daemon (139.178.89.65:49830).
Sep 9 06:12:37.094019 systemd-logind[1938]: Removed session 21.
Sep 9 06:12:37.191097 sshd[8671]: Accepted publickey for core from 139.178.89.65 port 49830 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:37.192042 sshd-session[8671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:37.196206 systemd-logind[1938]: New session 22 of user core.
Sep 9 06:12:37.206118 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 06:12:38.001657 sshd[8675]: Connection closed by 139.178.89.65 port 49830
Sep 9 06:12:38.002561 sshd-session[8671]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:38.021966 systemd[1]: sshd@19-139.178.90.255:22-139.178.89.65:49830.service: Deactivated successfully.
Sep 9 06:12:38.024502 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 06:12:38.025599 systemd-logind[1938]: Session 22 logged out. Waiting for processes to exit.
Sep 9 06:12:38.028551 systemd[1]: Started sshd@20-139.178.90.255:22-139.178.89.65:49840.service - OpenSSH per-connection server daemon (139.178.89.65:49840).
Sep 9 06:12:38.029221 systemd-logind[1938]: Removed session 22.
Sep 9 06:12:38.088340 sshd[8708]: Accepted publickey for core from 139.178.89.65 port 49840 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:38.089238 sshd-session[8708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:38.091937 systemd-logind[1938]: New session 23 of user core.
Sep 9 06:12:38.107785 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 06:12:38.244509 sshd[8713]: Connection closed by 139.178.89.65 port 49840
Sep 9 06:12:38.244736 sshd-session[8708]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:38.259764 systemd[1]: sshd@20-139.178.90.255:22-139.178.89.65:49840.service: Deactivated successfully.
Sep 9 06:12:38.260723 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 06:12:38.261172 systemd-logind[1938]: Session 23 logged out. Waiting for processes to exit.
Sep 9 06:12:38.262310 systemd[1]: Started sshd@21-139.178.90.255:22-139.178.89.65:49842.service - OpenSSH per-connection server daemon (139.178.89.65:49842).
Sep 9 06:12:38.262851 systemd-logind[1938]: Removed session 23.
Sep 9 06:12:38.301723 sshd[8736]: Accepted publickey for core from 139.178.89.65 port 49842 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:38.304146 sshd-session[8736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:38.308637 systemd-logind[1938]: New session 24 of user core.
Sep 9 06:12:38.317881 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 06:12:38.398323 sshd[8741]: Connection closed by 139.178.89.65 port 49842
Sep 9 06:12:38.398499 sshd-session[8736]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:38.400572 systemd[1]: sshd@21-139.178.90.255:22-139.178.89.65:49842.service: Deactivated successfully.
Sep 9 06:12:38.401472 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 06:12:38.401908 systemd-logind[1938]: Session 24 logged out. Waiting for processes to exit.
Sep 9 06:12:38.402588 systemd-logind[1938]: Removed session 24.
Sep 9 06:12:43.414551 systemd[1]: Started sshd@22-139.178.90.255:22-139.178.89.65:58506.service - OpenSSH per-connection server daemon (139.178.89.65:58506).
Sep 9 06:12:43.452546 sshd[8769]: Accepted publickey for core from 139.178.89.65 port 58506 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:43.453236 sshd-session[8769]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:43.455912 systemd-logind[1938]: New session 25 of user core.
Sep 9 06:12:43.478836 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 06:12:43.609638 sshd[8772]: Connection closed by 139.178.89.65 port 58506
Sep 9 06:12:43.609848 sshd-session[8769]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:43.611678 systemd[1]: sshd@22-139.178.90.255:22-139.178.89.65:58506.service: Deactivated successfully.
Sep 9 06:12:43.612696 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 06:12:43.613387 systemd-logind[1938]: Session 25 logged out. Waiting for processes to exit.
Sep 9 06:12:43.613968 systemd-logind[1938]: Removed session 25.
Sep 9 06:12:43.879722 containerd[1949]: time="2025-09-09T06:12:43.879695982Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a586d9d5f8953ef5b5e12ec5ffb3c287840435ebc9fda4e6d3416aedfd4524e2\" id:\"93b47e5f884a1de203b23093160d7cc910d85d2590acefd283f1c2d2524472ad\" pid:8809 exited_at:{seconds:1757398363 nanos:879534739}"
Sep 9 06:12:44.435275 containerd[1949]: time="2025-09-09T06:12:44.435245190Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33aa7fc073954371e8dc83c8fe36b52babdcdb1cdedf4ee795579479ba6d258f\" id:\"707235c538452adf970c55bc3094d989fbf5412ee7293b18c5e607b75bb6837c\" pid:8831 exited_at:{seconds:1757398364 nanos:435039986}"
Sep 9 06:12:48.634514 systemd[1]: Started sshd@23-139.178.90.255:22-139.178.89.65:58512.service - OpenSSH per-connection server daemon (139.178.89.65:58512).
Sep 9 06:12:48.695052 sshd[8853]: Accepted publickey for core from 139.178.89.65 port 58512 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:48.695977 sshd-session[8853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:48.699642 systemd-logind[1938]: New session 26 of user core.
Sep 9 06:12:48.716009 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 9 06:12:48.845591 sshd[8856]: Connection closed by 139.178.89.65 port 58512
Sep 9 06:12:48.845820 sshd-session[8853]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:48.847722 systemd[1]: sshd@23-139.178.90.255:22-139.178.89.65:58512.service: Deactivated successfully.
Sep 9 06:12:48.848678 systemd[1]: session-26.scope: Deactivated successfully.
Sep 9 06:12:48.849344 systemd-logind[1938]: Session 26 logged out. Waiting for processes to exit.
Sep 9 06:12:48.849928 systemd-logind[1938]: Removed session 26.
Sep 9 06:12:53.859532 systemd[1]: Started sshd@24-139.178.90.255:22-139.178.89.65:53766.service - OpenSSH per-connection server daemon (139.178.89.65:53766).
Sep 9 06:12:53.892110 sshd[8905]: Accepted publickey for core from 139.178.89.65 port 53766 ssh2: RSA SHA256:qlS8/HvhvxEMEDQpIXrXtfn+mCzRm856SUoa3b2tftI
Sep 9 06:12:53.892869 sshd-session[8905]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 06:12:53.895510 systemd-logind[1938]: New session 27 of user core.
Sep 9 06:12:53.916814 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 9 06:12:54.041454 sshd[8908]: Connection closed by 139.178.89.65 port 53766
Sep 9 06:12:54.041633 sshd-session[8905]: pam_unix(sshd:session): session closed for user core
Sep 9 06:12:54.043740 systemd[1]: sshd@24-139.178.90.255:22-139.178.89.65:53766.service: Deactivated successfully.
Sep 9 06:12:54.044714 systemd[1]: session-27.scope: Deactivated successfully.
Sep 9 06:12:54.045567 systemd-logind[1938]: Session 27 logged out. Waiting for processes to exit.
Sep 9 06:12:54.046159 systemd-logind[1938]: Removed session 27.