Oct 13 06:52:48.906453 kernel: Linux version 6.12.51-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Sun Oct 12 22:37:12 -00 2025 Oct 13 06:52:48.906468 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=a48d469b0deb49c328e6faf6cf366b11952d47f2d24963c866a0ea8221fb0039 Oct 13 06:52:48.906475 kernel: BIOS-provided physical RAM map: Oct 13 06:52:48.906479 kernel: BIOS-e820: [mem 0x0000000000000000-0x00000000000997ff] usable Oct 13 06:52:48.906483 kernel: BIOS-e820: [mem 0x0000000000099800-0x000000000009ffff] reserved Oct 13 06:52:48.906487 kernel: BIOS-e820: [mem 0x00000000000e0000-0x00000000000fffff] reserved Oct 13 06:52:48.906492 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003fffffff] usable Oct 13 06:52:48.906496 kernel: BIOS-e820: [mem 0x0000000040000000-0x00000000403fffff] reserved Oct 13 06:52:48.906500 kernel: BIOS-e820: [mem 0x0000000040400000-0x0000000081b2cfff] usable Oct 13 06:52:48.906505 kernel: BIOS-e820: [mem 0x0000000081b2d000-0x0000000081b2dfff] ACPI NVS Oct 13 06:52:48.906509 kernel: BIOS-e820: [mem 0x0000000081b2e000-0x0000000081b2efff] reserved Oct 13 06:52:48.906513 kernel: BIOS-e820: [mem 0x0000000081b2f000-0x000000008afccfff] usable Oct 13 06:52:48.906517 kernel: BIOS-e820: [mem 0x000000008afcd000-0x000000008c0b1fff] reserved Oct 13 06:52:48.906522 kernel: BIOS-e820: [mem 0x000000008c0b2000-0x000000008c23afff] usable Oct 13 06:52:48.906527 kernel: BIOS-e820: [mem 0x000000008c23b000-0x000000008c66cfff] ACPI NVS Oct 13 06:52:48.906532 kernel: BIOS-e820: [mem 0x000000008c66d000-0x000000008eefefff] reserved Oct 13 06:52:48.906537 kernel: BIOS-e820: [mem 0x000000008eeff000-0x000000008eefffff] usable Oct 13 06:52:48.906542 kernel: BIOS-e820: [mem 0x000000008ef00000-0x000000008fffffff] reserved Oct 13 06:52:48.906546 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved Oct 13 06:52:48.906551 kernel: BIOS-e820: [mem 0x00000000fe000000-0x00000000fe010fff] reserved Oct 13 06:52:48.906555 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec00fff] reserved Oct 13 06:52:48.906560 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Oct 13 06:52:48.906565 kernel: BIOS-e820: [mem 0x00000000ff000000-0x00000000ffffffff] reserved Oct 13 06:52:48.906571 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000086effffff] usable Oct 13 06:52:48.906583 kernel: NX (Execute Disable) protection: active Oct 13 06:52:48.906588 kernel: APIC: Static calls initialized Oct 13 06:52:48.906594 kernel: SMBIOS 3.2.1 present. 
Oct 13 06:52:48.906599 kernel: DMI: Supermicro SYS-5019C-MR-PH004/X11SCM-F, BIOS 1.9 09/16/2022 Oct 13 06:52:48.906604 kernel: DMI: Memory slots populated: 2/4 Oct 13 06:52:48.906608 kernel: tsc: Detected 3400.000 MHz processor Oct 13 06:52:48.906613 kernel: tsc: Detected 3399.906 MHz TSC Oct 13 06:52:48.906617 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Oct 13 06:52:48.906623 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Oct 13 06:52:48.906627 kernel: last_pfn = 0x86f000 max_arch_pfn = 0x400000000 Oct 13 06:52:48.906632 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 23), built from 10 variable MTRRs Oct 13 06:52:48.906637 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Oct 13 06:52:48.906642 kernel: last_pfn = 0x8ef00 max_arch_pfn = 0x400000000 Oct 13 06:52:48.906647 kernel: Using GB pages for direct mapping Oct 13 06:52:48.906652 kernel: ACPI: Early table checksum verification disabled Oct 13 06:52:48.906662 kernel: ACPI: RSDP 0x00000000000F05B0 000024 (v02 SUPERM) Oct 13 06:52:48.906671 kernel: ACPI: XSDT 0x000000008C54E0C8 00010C (v01 SUPERM SUPERM 01072009 AMI 00010013) Oct 13 06:52:48.906676 kernel: ACPI: FACP 0x000000008C58A670 000114 (v06 01072009 AMI 00010013) Oct 13 06:52:48.906681 kernel: ACPI: DSDT 0x000000008C54E268 03C404 (v02 SUPERM SMCI--MB 01072009 INTL 20160527) Oct 13 06:52:48.906687 kernel: ACPI: FACS 0x000000008C66CF80 000040 Oct 13 06:52:48.906692 kernel: ACPI: APIC 0x000000008C58A788 00012C (v04 01072009 AMI 00010013) Oct 13 06:52:48.906697 kernel: ACPI: FPDT 0x000000008C58A8B8 000044 (v01 01072009 AMI 00010013) Oct 13 06:52:48.906702 kernel: ACPI: FIDT 0x000000008C58A900 00009C (v01 SUPERM SMCI--MB 01072009 AMI 00010013) Oct 13 06:52:48.906707 kernel: ACPI: MCFG 0x000000008C58A9A0 00003C (v01 SUPERM SMCI--MB 01072009 MSFT 00000097) Oct 13 06:52:48.906712 kernel: ACPI: SPMI 0x000000008C58A9E0 000041 (v05 SUPERM SMCI--MB 00000000 AMI. 
00000000) Oct 13 06:52:48.906717 kernel: ACPI: SSDT 0x000000008C58AA28 001B1C (v02 CpuRef CpuSsdt 00003000 INTL 20160527) Oct 13 06:52:48.906723 kernel: ACPI: SSDT 0x000000008C58C548 0031C6 (v02 SaSsdt SaSsdt 00003000 INTL 20160527) Oct 13 06:52:48.906728 kernel: ACPI: SSDT 0x000000008C58F710 00232B (v02 PegSsd PegSsdt 00001000 INTL 20160527) Oct 13 06:52:48.906733 kernel: ACPI: HPET 0x000000008C591A40 000038 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:52:48.906738 kernel: ACPI: SSDT 0x000000008C591A78 000FAE (v02 SUPERM Ther_Rvp 00001000 INTL 20160527) Oct 13 06:52:48.906742 kernel: ACPI: SSDT 0x000000008C592A28 0008F4 (v02 INTEL xh_mossb 00000000 INTL 20160527) Oct 13 06:52:48.906747 kernel: ACPI: UEFI 0x000000008C593320 000042 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:52:48.906752 kernel: ACPI: LPIT 0x000000008C593368 000094 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:52:48.906757 kernel: ACPI: SSDT 0x000000008C593400 0027DE (v02 SUPERM PtidDevc 00001000 INTL 20160527) Oct 13 06:52:48.906763 kernel: ACPI: SSDT 0x000000008C595BE0 0014E2 (v02 SUPERM TbtTypeC 00000000 INTL 20160527) Oct 13 06:52:48.906768 kernel: ACPI: DBGP 0x000000008C5970C8 000034 (v01 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:52:48.906773 kernel: ACPI: DBG2 0x000000008C597100 000054 (v00 SUPERM SMCI--MB 00000002 01000013) Oct 13 06:52:48.906778 kernel: ACPI: SSDT 0x000000008C597158 001B67 (v02 SUPERM UsbCTabl 00001000 INTL 20160527) Oct 13 06:52:48.906783 kernel: ACPI: DMAR 0x000000008C598CC0 000070 (v01 INTEL EDK2 00000002 01000013) Oct 13 06:52:48.906788 kernel: ACPI: SSDT 0x000000008C598D30 000144 (v02 Intel ADebTabl 00001000 INTL 20160527) Oct 13 06:52:48.906793 kernel: ACPI: TPM2 0x000000008C598E78 000034 (v04 SUPERM SMCI--MB 00000001 AMI 00000000) Oct 13 06:52:48.906798 kernel: ACPI: SSDT 0x000000008C598EB0 000D8F (v02 INTEL SpsNm 00000002 INTL 20160527) Oct 13 06:52:48.906803 kernel: ACPI: WSMT 0x000000008C599C40 000028 (v01 SUPERM 01072009 AMI 00010013) Oct 13 06:52:48.906809 kernel: ACPI: EINJ 0x000000008C599C68 000130 (v01 AMI AMI.EINJ 00000000 AMI. 00000000) Oct 13 06:52:48.906814 kernel: ACPI: ERST 0x000000008C599D98 000230 (v01 AMIER AMI.ERST 00000000 AMI. 00000000) Oct 13 06:52:48.906819 kernel: ACPI: BERT 0x000000008C599FC8 000030 (v01 AMI AMI.BERT 00000000 AMI. 00000000) Oct 13 06:52:48.906824 kernel: ACPI: HEST 0x000000008C599FF8 00027C (v01 AMI AMI.HEST 00000000 AMI. 
00000000) Oct 13 06:52:48.906829 kernel: ACPI: SSDT 0x000000008C59A278 000162 (v01 SUPERM SMCCDN 00000000 INTL 20181221) Oct 13 06:52:48.906833 kernel: ACPI: Reserving FACP table memory at [mem 0x8c58a670-0x8c58a783] Oct 13 06:52:48.906838 kernel: ACPI: Reserving DSDT table memory at [mem 0x8c54e268-0x8c58a66b] Oct 13 06:52:48.906843 kernel: ACPI: Reserving FACS table memory at [mem 0x8c66cf80-0x8c66cfbf] Oct 13 06:52:48.906848 kernel: ACPI: Reserving APIC table memory at [mem 0x8c58a788-0x8c58a8b3] Oct 13 06:52:48.906854 kernel: ACPI: Reserving FPDT table memory at [mem 0x8c58a8b8-0x8c58a8fb] Oct 13 06:52:48.906859 kernel: ACPI: Reserving FIDT table memory at [mem 0x8c58a900-0x8c58a99b] Oct 13 06:52:48.906864 kernel: ACPI: Reserving MCFG table memory at [mem 0x8c58a9a0-0x8c58a9db] Oct 13 06:52:48.906868 kernel: ACPI: Reserving SPMI table memory at [mem 0x8c58a9e0-0x8c58aa20] Oct 13 06:52:48.906873 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58aa28-0x8c58c543] Oct 13 06:52:48.906878 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58c548-0x8c58f70d] Oct 13 06:52:48.906883 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c58f710-0x8c591a3a] Oct 13 06:52:48.906888 kernel: ACPI: Reserving HPET table memory at [mem 0x8c591a40-0x8c591a77] Oct 13 06:52:48.906893 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c591a78-0x8c592a25] Oct 13 06:52:48.906899 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c592a28-0x8c59331b] Oct 13 06:52:48.906904 kernel: ACPI: Reserving UEFI table memory at [mem 0x8c593320-0x8c593361] Oct 13 06:52:48.906909 kernel: ACPI: Reserving LPIT table memory at [mem 0x8c593368-0x8c5933fb] Oct 13 06:52:48.906913 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c593400-0x8c595bdd] Oct 13 06:52:48.906918 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c595be0-0x8c5970c1] Oct 13 06:52:48.906923 kernel: ACPI: Reserving DBGP table memory at [mem 0x8c5970c8-0x8c5970fb] Oct 13 06:52:48.906928 kernel: ACPI: Reserving DBG2 table memory at [mem 0x8c597100-0x8c597153] Oct 13 06:52:48.906933 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c597158-0x8c598cbe] Oct 13 06:52:48.906938 kernel: ACPI: Reserving DMAR table memory at [mem 0x8c598cc0-0x8c598d2f] Oct 13 06:52:48.906943 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598d30-0x8c598e73] Oct 13 06:52:48.906948 kernel: ACPI: Reserving TPM2 table memory at [mem 0x8c598e78-0x8c598eab] Oct 13 06:52:48.906953 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c598eb0-0x8c599c3e] Oct 13 06:52:48.906958 kernel: ACPI: Reserving WSMT table memory at [mem 0x8c599c40-0x8c599c67] Oct 13 06:52:48.906963 kernel: ACPI: Reserving EINJ table memory at [mem 0x8c599c68-0x8c599d97] Oct 13 06:52:48.906968 kernel: ACPI: Reserving ERST table memory at [mem 0x8c599d98-0x8c599fc7] Oct 13 06:52:48.906973 kernel: ACPI: Reserving BERT table memory at [mem 0x8c599fc8-0x8c599ff7] Oct 13 06:52:48.906978 kernel: ACPI: Reserving HEST table memory at [mem 0x8c599ff8-0x8c59a273] Oct 13 06:52:48.906983 kernel: ACPI: Reserving SSDT table memory at [mem 0x8c59a278-0x8c59a3d9] Oct 13 06:52:48.906987 kernel: No NUMA configuration found Oct 13 06:52:48.906993 kernel: Faking a node at [mem 0x0000000000000000-0x000000086effffff] Oct 13 06:52:48.906998 kernel: NODE_DATA(0) allocated [mem 0x86eff5dc0-0x86effcfff] Oct 13 06:52:48.907003 kernel: Zone ranges: Oct 13 06:52:48.907008 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Oct 13 06:52:48.907013 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Oct 13 
06:52:48.907018 kernel: Normal [mem 0x0000000100000000-0x000000086effffff] Oct 13 06:52:48.907023 kernel: Device empty Oct 13 06:52:48.907028 kernel: Movable zone start for each node Oct 13 06:52:48.907033 kernel: Early memory node ranges Oct 13 06:52:48.907038 kernel: node 0: [mem 0x0000000000001000-0x0000000000098fff] Oct 13 06:52:48.907043 kernel: node 0: [mem 0x0000000000100000-0x000000003fffffff] Oct 13 06:52:48.907048 kernel: node 0: [mem 0x0000000040400000-0x0000000081b2cfff] Oct 13 06:52:48.907053 kernel: node 0: [mem 0x0000000081b2f000-0x000000008afccfff] Oct 13 06:52:48.907058 kernel: node 0: [mem 0x000000008c0b2000-0x000000008c23afff] Oct 13 06:52:48.907067 kernel: node 0: [mem 0x000000008eeff000-0x000000008eefffff] Oct 13 06:52:48.907072 kernel: node 0: [mem 0x0000000100000000-0x000000086effffff] Oct 13 06:52:48.907078 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000086effffff] Oct 13 06:52:48.907083 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Oct 13 06:52:48.907089 kernel: On node 0, zone DMA: 103 pages in unavailable ranges Oct 13 06:52:48.907094 kernel: On node 0, zone DMA32: 1024 pages in unavailable ranges Oct 13 06:52:48.907100 kernel: On node 0, zone DMA32: 2 pages in unavailable ranges Oct 13 06:52:48.907105 kernel: On node 0, zone DMA32: 4325 pages in unavailable ranges Oct 13 06:52:48.907110 kernel: On node 0, zone DMA32: 11460 pages in unavailable ranges Oct 13 06:52:48.907115 kernel: On node 0, zone Normal: 4352 pages in unavailable ranges Oct 13 06:52:48.907121 kernel: On node 0, zone Normal: 4096 pages in unavailable ranges Oct 13 06:52:48.907126 kernel: ACPI: PM-Timer IO Port: 0x1808 Oct 13 06:52:48.907132 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Oct 13 06:52:48.907137 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Oct 13 06:52:48.907142 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Oct 13 06:52:48.907148 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Oct 13 06:52:48.907153 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Oct 13 06:52:48.907158 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Oct 13 06:52:48.907163 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Oct 13 06:52:48.907168 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Oct 13 06:52:48.907174 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Oct 13 06:52:48.907180 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Oct 13 06:52:48.907185 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Oct 13 06:52:48.907190 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Oct 13 06:52:48.907195 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Oct 13 06:52:48.907200 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Oct 13 06:52:48.907206 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Oct 13 06:52:48.907211 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Oct 13 06:52:48.907216 kernel: IOAPIC[0]: apic_id 2, version 32, address 0xfec00000, GSI 0-119 Oct 13 06:52:48.907221 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Oct 13 06:52:48.907227 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Oct 13 06:52:48.907233 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Oct 13 06:52:48.907238 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Oct 13 06:52:48.907243 kernel: TSC deadline timer available Oct 13 06:52:48.907248 kernel: CPU topo: Max. 
logical packages: 1 Oct 13 06:52:48.907254 kernel: CPU topo: Max. logical dies: 1 Oct 13 06:52:48.907259 kernel: CPU topo: Max. dies per package: 1 Oct 13 06:52:48.907264 kernel: CPU topo: Max. threads per core: 2 Oct 13 06:52:48.907269 kernel: CPU topo: Num. cores per package: 8 Oct 13 06:52:48.907274 kernel: CPU topo: Num. threads per package: 16 Oct 13 06:52:48.907280 kernel: CPU topo: Allowing 16 present CPUs plus 0 hotplug CPUs Oct 13 06:52:48.907286 kernel: [mem 0x90000000-0xdfffffff] available for PCI devices Oct 13 06:52:48.907291 kernel: Booting paravirtualized kernel on bare hardware Oct 13 06:52:48.907296 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Oct 13 06:52:48.907302 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1 Oct 13 06:52:48.907307 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144 Oct 13 06:52:48.907312 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152 Oct 13 06:52:48.907317 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15 Oct 13 06:52:48.907323 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=a48d469b0deb49c328e6faf6cf366b11952d47f2d24963c866a0ea8221fb0039 Oct 13 06:52:48.907330 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Oct 13 06:52:48.907335 kernel: random: crng init done Oct 13 06:52:48.907340 kernel: Dentry cache hash table entries: 4194304 (order: 13, 33554432 bytes, linear) Oct 13 06:52:48.907345 kernel: Inode-cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear) Oct 13 06:52:48.907350 kernel: Fallback order for Node 0: 0 Oct 13 06:52:48.907356 kernel: Built 1 zonelists, mobility grouping on. Total pages: 8363245 Oct 13 06:52:48.907361 kernel: Policy zone: Normal Oct 13 06:52:48.907366 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Oct 13 06:52:48.907372 kernel: software IO TLB: area num 16. Oct 13 06:52:48.907378 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1 Oct 13 06:52:48.907383 kernel: ftrace: allocating 40139 entries in 157 pages Oct 13 06:52:48.907388 kernel: ftrace: allocated 157 pages with 5 groups Oct 13 06:52:48.907393 kernel: Dynamic Preempt: voluntary Oct 13 06:52:48.907399 kernel: rcu: Preemptible hierarchical RCU implementation. Oct 13 06:52:48.907404 kernel: rcu: RCU event tracing is enabled. Oct 13 06:52:48.907410 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16. Oct 13 06:52:48.907415 kernel: Trampoline variant of Tasks RCU enabled. Oct 13 06:52:48.907421 kernel: Rude variant of Tasks RCU enabled. Oct 13 06:52:48.907426 kernel: Tracing variant of Tasks RCU enabled. Oct 13 06:52:48.907432 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Oct 13 06:52:48.907437 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16 Oct 13 06:52:48.907442 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Oct 13 06:52:48.907447 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. 
Oct 13 06:52:48.907453 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16. Oct 13 06:52:48.907459 kernel: NR_IRQS: 33024, nr_irqs: 2184, preallocated irqs: 16 Oct 13 06:52:48.907464 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Oct 13 06:52:48.907470 kernel: Console: colour VGA+ 80x25 Oct 13 06:52:48.907475 kernel: printk: legacy console [tty0] enabled Oct 13 06:52:48.907480 kernel: printk: legacy console [ttyS1] enabled Oct 13 06:52:48.907486 kernel: ACPI: Core revision 20240827 Oct 13 06:52:48.907491 kernel: hpet: HPET dysfunctional in PC10. Force disabled. Oct 13 06:52:48.907496 kernel: APIC: Switch to symmetric I/O mode setup Oct 13 06:52:48.907501 kernel: DMAR: Host address width 39 Oct 13 06:52:48.907506 kernel: DMAR: DRHD base: 0x000000fed91000 flags: 0x1 Oct 13 06:52:48.907512 kernel: DMAR: dmar0: reg_base_addr fed91000 ver 1:0 cap d2008c40660462 ecap f050da Oct 13 06:52:48.907518 kernel: DMAR: RMRR base: 0x0000008cf18000 end: 0x0000008d161fff Oct 13 06:52:48.907523 kernel: DMAR-IR: IOAPIC id 2 under DRHD base 0xfed91000 IOMMU 0 Oct 13 06:52:48.907529 kernel: DMAR-IR: HPET id 0 under DRHD base 0xfed91000 Oct 13 06:52:48.907534 kernel: DMAR-IR: Queued invalidation will be enabled to support x2apic and Intr-remapping. Oct 13 06:52:48.907539 kernel: DMAR-IR: Enabled IRQ remapping in x2apic mode Oct 13 06:52:48.907544 kernel: x2apic enabled Oct 13 06:52:48.907550 kernel: APIC: Switched APIC routing to: cluster x2apic Oct 13 06:52:48.907555 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x3101f59f5e6, max_idle_ns: 440795259996 ns Oct 13 06:52:48.907560 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 6799.81 BogoMIPS (lpj=3399906) Oct 13 06:52:48.907566 kernel: CPU0: Thermal monitoring enabled (TM1) Oct 13 06:52:48.907572 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Oct 13 06:52:48.907577 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Oct 13 06:52:48.907587 kernel: process: using mwait in idle threads Oct 13 06:52:48.907593 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Oct 13 06:52:48.907598 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit Oct 13 06:52:48.907603 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Oct 13 06:52:48.907609 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Oct 13 06:52:48.907614 kernel: RETBleed: Mitigation: Enhanced IBRS Oct 13 06:52:48.907619 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Oct 13 06:52:48.907624 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Oct 13 06:52:48.907630 kernel: TAA: Mitigation: TSX disabled Oct 13 06:52:48.907635 kernel: MMIO Stale Data: Mitigation: Clear CPU buffers Oct 13 06:52:48.907640 kernel: SRBDS: Mitigation: Microcode Oct 13 06:52:48.907645 kernel: GDS: Vulnerable: No microcode Oct 13 06:52:48.907651 kernel: active return thunk: its_return_thunk Oct 13 06:52:48.907659 kernel: ITS: Mitigation: Aligned branch/return thunks Oct 13 06:52:48.907665 kernel: VMSCAPE: Mitigation: IBPB before exit to userspace Oct 13 06:52:48.907670 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Oct 13 06:52:48.907675 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Oct 13 06:52:48.907681 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' 
Oct 13 06:52:48.907686 kernel: x86/fpu: Supporting XSAVE feature 0x008: 'MPX bounds registers' Oct 13 06:52:48.907692 kernel: x86/fpu: Supporting XSAVE feature 0x010: 'MPX CSR' Oct 13 06:52:48.907697 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Oct 13 06:52:48.907702 kernel: x86/fpu: xstate_offset[3]: 832, xstate_sizes[3]: 64 Oct 13 06:52:48.907708 kernel: x86/fpu: xstate_offset[4]: 896, xstate_sizes[4]: 64 Oct 13 06:52:48.907713 kernel: x86/fpu: Enabled xstate features 0x1f, context size is 960 bytes, using 'compacted' format. Oct 13 06:52:48.907718 kernel: Freeing SMP alternatives memory: 32K Oct 13 06:52:48.907723 kernel: pid_max: default: 32768 minimum: 301 Oct 13 06:52:48.907728 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Oct 13 06:52:48.907733 kernel: landlock: Up and running. Oct 13 06:52:48.907738 kernel: SELinux: Initializing. Oct 13 06:52:48.907744 kernel: Mount-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Oct 13 06:52:48.907750 kernel: Mountpoint-cache hash table entries: 65536 (order: 7, 524288 bytes, linear) Oct 13 06:52:48.907755 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Oct 13 06:52:48.907760 kernel: Performance Events: PEBS fmt3+, Skylake events, 32-deep LBR, full-width counters, Intel PMU driver. Oct 13 06:52:48.907766 kernel: ... version: 4 Oct 13 06:52:48.907771 kernel: ... bit width: 48 Oct 13 06:52:48.907776 kernel: ... generic registers: 4 Oct 13 06:52:48.907781 kernel: ... value mask: 0000ffffffffffff Oct 13 06:52:48.907787 kernel: ... max period: 00007fffffffffff Oct 13 06:52:48.907792 kernel: ... fixed-purpose events: 3 Oct 13 06:52:48.907797 kernel: ... event mask: 000000070000000f Oct 13 06:52:48.907803 kernel: signal: max sigframe size: 2032 Oct 13 06:52:48.907809 kernel: Estimated ratio of average max frequency by base frequency (times 1024): 1445 Oct 13 06:52:48.907814 kernel: rcu: Hierarchical SRCU implementation. Oct 13 06:52:48.907819 kernel: rcu: Max phase no-delay instances is 400. Oct 13 06:52:48.907825 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level Oct 13 06:52:48.907830 kernel: NMI watchdog: Enabled. Permanently consumes one hw-PMU counter. Oct 13 06:52:48.907835 kernel: smp: Bringing up secondary CPUs ... Oct 13 06:52:48.907840 kernel: smpboot: x86: Booting SMP configuration: Oct 13 06:52:48.907846 kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7 #8 #9 #10 #11 #12 #13 #14 #15 Oct 13 06:52:48.907852 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. 
Oct 13 06:52:48.907857 kernel: smp: Brought up 1 node, 16 CPUs Oct 13 06:52:48.907863 kernel: smpboot: Total of 16 processors activated (108796.99 BogoMIPS) Oct 13 06:52:48.907868 kernel: Memory: 32695424K/33452980K available (14336K kernel code, 2443K rwdata, 10000K rodata, 54096K init, 2852K bss, 732536K reserved, 0K cma-reserved) Oct 13 06:52:48.907873 kernel: devtmpfs: initialized Oct 13 06:52:48.907879 kernel: x86/mm: Memory block size: 128MB Oct 13 06:52:48.907884 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x81b2d000-0x81b2dfff] (4096 bytes) Oct 13 06:52:48.907889 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x8c23b000-0x8c66cfff] (4399104 bytes) Oct 13 06:52:48.907895 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Oct 13 06:52:48.907901 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear) Oct 13 06:52:48.907906 kernel: pinctrl core: initialized pinctrl subsystem Oct 13 06:52:48.907911 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Oct 13 06:52:48.907916 kernel: audit: initializing netlink subsys (disabled) Oct 13 06:52:48.907922 kernel: audit: type=2000 audit(1760338360.041:1): state=initialized audit_enabled=0 res=1 Oct 13 06:52:48.907927 kernel: thermal_sys: Registered thermal governor 'step_wise' Oct 13 06:52:48.907932 kernel: thermal_sys: Registered thermal governor 'user_space' Oct 13 06:52:48.907937 kernel: cpuidle: using governor menu Oct 13 06:52:48.907943 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Oct 13 06:52:48.907949 kernel: dca service started, version 1.12.1 Oct 13 06:52:48.907954 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff] Oct 13 06:52:48.907959 kernel: PCI: Using configuration type 1 for base access Oct 13 06:52:48.907965 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Oct 13 06:52:48.907970 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Oct 13 06:52:48.907975 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Oct 13 06:52:48.907980 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Oct 13 06:52:48.907986 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Oct 13 06:52:48.907992 kernel: ACPI: Added _OSI(Module Device) Oct 13 06:52:48.907997 kernel: ACPI: Added _OSI(Processor Device) Oct 13 06:52:48.908002 kernel: ACPI: Added _OSI(Processor Aggregator Device) Oct 13 06:52:48.908007 kernel: ACPI: 12 ACPI AML tables successfully acquired and loaded Oct 13 06:52:48.908013 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:52:48.908018 kernel: ACPI: SSDT 0xFFFF8C7642097C00 000400 (v02 PmRef Cpu0Cst 00003001 INTL 20160527) Oct 13 06:52:48.908023 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:52:48.908028 kernel: ACPI: SSDT 0xFFFF8C764219E800 000683 (v02 PmRef Cpu0Ist 00003000 INTL 20160527) Oct 13 06:52:48.908034 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:52:48.908040 kernel: ACPI: SSDT 0xFFFF8C7640249300 0000F4 (v02 PmRef Cpu0Psd 00003000 INTL 20160527) Oct 13 06:52:48.908045 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:52:48.908050 kernel: ACPI: SSDT 0xFFFF8C764219D000 0005FC (v02 PmRef ApIst 00003000 INTL 20160527) Oct 13 06:52:48.908055 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:52:48.908060 kernel: ACPI: SSDT 0xFFFF8C76401A7000 000AB0 (v02 PmRef ApPsd 00003000 INTL 20160527) Oct 13 06:52:48.908066 kernel: ACPI: Dynamic OEM Table Load: Oct 13 06:52:48.908071 kernel: ACPI: SSDT 0xFFFF8C7642094800 00030A (v02 PmRef ApCst 00003000 INTL 20160527) Oct 13 06:52:48.908076 kernel: ACPI: Interpreter enabled Oct 13 06:52:48.908081 kernel: ACPI: PM: (supports S0 S5) Oct 13 06:52:48.908086 kernel: ACPI: Using IOAPIC for interrupt routing Oct 13 06:52:48.908093 kernel: HEST: Enabling Firmware First mode for corrected errors. Oct 13 06:52:48.908098 kernel: mce: [Firmware Bug]: Ignoring request to disable invalid MCA bank 14. Oct 13 06:52:48.908103 kernel: HEST: Table parsing has been initialized. Oct 13 06:52:48.908108 kernel: GHES: APEI firmware first mode is enabled by APEI bit and WHEA _OSC. 
Oct 13 06:52:48.908114 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Oct 13 06:52:48.908119 kernel: PCI: Using E820 reservations for host bridge windows Oct 13 06:52:48.908124 kernel: ACPI: Enabled 9 GPEs in block 00 to 7F Oct 13 06:52:48.908130 kernel: ACPI: \_SB_.PCI0.XDCI.USBC: New power resource Oct 13 06:52:48.908135 kernel: ACPI: \_SB_.PCI0.SAT0.VOL0.V0PR: New power resource Oct 13 06:52:48.908141 kernel: ACPI: \_SB_.PCI0.SAT0.VOL1.V1PR: New power resource Oct 13 06:52:48.908146 kernel: ACPI: \_SB_.PCI0.SAT0.VOL2.V2PR: New power resource Oct 13 06:52:48.908152 kernel: ACPI: \_SB_.PCI0.CNVW.WRST: New power resource Oct 13 06:52:48.908157 kernel: ACPI: \_TZ_.FN00: New power resource Oct 13 06:52:48.908162 kernel: ACPI: \_TZ_.FN01: New power resource Oct 13 06:52:48.908167 kernel: ACPI: \_TZ_.FN02: New power resource Oct 13 06:52:48.908173 kernel: ACPI: \_TZ_.FN03: New power resource Oct 13 06:52:48.908178 kernel: ACPI: \_TZ_.FN04: New power resource Oct 13 06:52:48.908183 kernel: ACPI: \PIN_: New power resource Oct 13 06:52:48.908189 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-fe]) Oct 13 06:52:48.908263 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Oct 13 06:52:48.908315 kernel: acpi PNP0A08:00: _OSC: platform does not support [AER] Oct 13 06:52:48.908364 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability LTR] Oct 13 06:52:48.908372 kernel: PCI host bridge to bus 0000:00 Oct 13 06:52:48.908421 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Oct 13 06:52:48.908468 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Oct 13 06:52:48.908511 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Oct 13 06:52:48.908553 kernel: pci_bus 0000:00: root bus resource [mem 0x90000000-0xdfffffff window] Oct 13 06:52:48.908604 kernel: pci_bus 0000:00: root bus resource [mem 0xfc800000-0xfe7fffff window] Oct 13 06:52:48.908648 kernel: pci_bus 0000:00: root bus resource [bus 00-fe] Oct 13 06:52:48.908714 kernel: pci 0000:00:00.0: [8086:3e31] type 00 class 0x060000 conventional PCI endpoint Oct 13 06:52:48.908771 kernel: pci 0000:00:01.0: [8086:1901] type 01 class 0x060400 PCIe Root Port Oct 13 06:52:48.908825 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 13 06:52:48.908874 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Oct 13 06:52:48.908923 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Oct 13 06:52:48.908972 kernel: pci 0000:00:01.0: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.909026 kernel: pci 0000:00:08.0: [8086:1911] type 00 class 0x088000 conventional PCI endpoint Oct 13 06:52:48.909075 kernel: pci 0000:00:08.0: BAR 0 [mem 0x9551f000-0x9551ffff 64bit] Oct 13 06:52:48.909127 kernel: pci 0000:00:12.0: [8086:a379] type 00 class 0x118000 conventional PCI endpoint Oct 13 06:52:48.909178 kernel: pci 0000:00:12.0: BAR 0 [mem 0x9551e000-0x9551efff 64bit] Oct 13 06:52:48.909232 kernel: pci 0000:00:14.0: [8086:a36d] type 00 class 0x0c0330 conventional PCI endpoint Oct 13 06:52:48.909281 kernel: pci 0000:00:14.0: BAR 0 [mem 0x95500000-0x9550ffff 64bit] Oct 13 06:52:48.909329 kernel: pci 0000:00:14.0: PME# supported from D3hot D3cold Oct 13 06:52:48.909382 kernel: pci 0000:00:14.2: [8086:a36f] type 00 class 0x050000 conventional PCI endpoint Oct 13 06:52:48.909434 kernel: pci 0000:00:14.2: BAR 0 [mem 0x95512000-0x95513fff 64bit] Oct 13 
06:52:48.909491 kernel: pci 0000:00:14.2: BAR 2 [mem 0x9551d000-0x9551dfff 64bit] Oct 13 06:52:48.909547 kernel: pci 0000:00:15.0: [8086:a368] type 00 class 0x0c8000 conventional PCI endpoint Oct 13 06:52:48.909596 kernel: pci 0000:00:15.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Oct 13 06:52:48.909659 kernel: pci 0000:00:15.1: [8086:a369] type 00 class 0x0c8000 conventional PCI endpoint Oct 13 06:52:48.909710 kernel: pci 0000:00:15.1: BAR 0 [mem 0x00000000-0x00000fff 64bit] Oct 13 06:52:48.909763 kernel: pci 0000:00:16.0: [8086:a360] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:52:48.909814 kernel: pci 0000:00:16.0: BAR 0 [mem 0x9551a000-0x9551afff 64bit] Oct 13 06:52:48.909863 kernel: pci 0000:00:16.0: PME# supported from D3hot Oct 13 06:52:48.909915 kernel: pci 0000:00:16.1: [8086:a361] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:52:48.909964 kernel: pci 0000:00:16.1: BAR 0 [mem 0x95519000-0x95519fff 64bit] Oct 13 06:52:48.910012 kernel: pci 0000:00:16.1: PME# supported from D3hot Oct 13 06:52:48.910065 kernel: pci 0000:00:16.4: [8086:a364] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:52:48.910116 kernel: pci 0000:00:16.4: BAR 0 [mem 0x95518000-0x95518fff 64bit] Oct 13 06:52:48.910165 kernel: pci 0000:00:16.4: PME# supported from D3hot Oct 13 06:52:48.910217 kernel: pci 0000:00:17.0: [8086:a352] type 00 class 0x010601 conventional PCI endpoint Oct 13 06:52:48.910267 kernel: pci 0000:00:17.0: BAR 0 [mem 0x95510000-0x95511fff] Oct 13 06:52:48.910315 kernel: pci 0000:00:17.0: BAR 1 [mem 0x95517000-0x955170ff] Oct 13 06:52:48.910366 kernel: pci 0000:00:17.0: BAR 2 [io 0x6050-0x6057] Oct 13 06:52:48.910414 kernel: pci 0000:00:17.0: BAR 3 [io 0x6040-0x6043] Oct 13 06:52:48.910463 kernel: pci 0000:00:17.0: BAR 4 [io 0x6020-0x603f] Oct 13 06:52:48.910512 kernel: pci 0000:00:17.0: BAR 5 [mem 0x95516000-0x955167ff] Oct 13 06:52:48.910561 kernel: pci 0000:00:17.0: PME# supported from D3hot Oct 13 06:52:48.910628 kernel: pci 0000:00:1b.0: [8086:a340] type 01 class 0x060400 PCIe Root Port Oct 13 06:52:48.910687 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Oct 13 06:52:48.910739 kernel: pci 0000:00:1b.0: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.910794 kernel: pci 0000:00:1b.4: [8086:a32c] type 01 class 0x060400 PCIe Root Port Oct 13 06:52:48.910844 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Oct 13 06:52:48.910894 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Oct 13 06:52:48.910943 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Oct 13 06:52:48.910992 kernel: pci 0000:00:1b.4: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.911048 kernel: pci 0000:00:1b.5: [8086:a32d] type 01 class 0x060400 PCIe Root Port Oct 13 06:52:48.911098 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Oct 13 06:52:48.911147 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Oct 13 06:52:48.911196 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Oct 13 06:52:48.911245 kernel: pci 0000:00:1b.5: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.911298 kernel: pci 0000:00:1c.0: [8086:a338] type 01 class 0x060400 PCIe Root Port Oct 13 06:52:48.911347 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Oct 13 06:52:48.911400 kernel: pci 0000:00:1c.0: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.911453 kernel: pci 0000:00:1c.3: [8086:a33b] type 01 class 0x060400 PCIe Root Port Oct 13 06:52:48.911503 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Oct 13 06:52:48.911552 kernel: pci 0000:00:1c.3: 
bridge window [io 0x3000-0x3fff] Oct 13 06:52:48.911601 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:52:48.911667 kernel: pci 0000:00:1c.3: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.911723 kernel: pci 0000:00:1e.0: [8086:a328] type 00 class 0x078000 conventional PCI endpoint Oct 13 06:52:48.911775 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x00000000-0x00000fff 64bit] Oct 13 06:52:48.911828 kernel: pci 0000:00:1f.0: [8086:a309] type 00 class 0x060100 conventional PCI endpoint Oct 13 06:52:48.911884 kernel: pci 0000:00:1f.4: [8086:a323] type 00 class 0x0c0500 conventional PCI endpoint Oct 13 06:52:48.911933 kernel: pci 0000:00:1f.4: BAR 0 [mem 0x95514000-0x955140ff 64bit] Oct 13 06:52:48.911981 kernel: pci 0000:00:1f.4: BAR 4 [io 0xefa0-0xefbf] Oct 13 06:52:48.912033 kernel: pci 0000:00:1f.5: [8086:a324] type 00 class 0x0c8000 conventional PCI endpoint Oct 13 06:52:48.912084 kernel: pci 0000:00:1f.5: BAR 0 [mem 0xfe010000-0xfe010fff] Oct 13 06:52:48.912139 kernel: pci 0000:01:00.0: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Oct 13 06:52:48.912190 kernel: pci 0000:01:00.0: BAR 0 [mem 0x92000000-0x93ffffff 64bit pref] Oct 13 06:52:48.912241 kernel: pci 0000:01:00.0: ROM [mem 0x95200000-0x952fffff pref] Oct 13 06:52:48.912291 kernel: pci 0000:01:00.0: PME# supported from D3cold Oct 13 06:52:48.912341 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Oct 13 06:52:48.912393 kernel: pci 0000:01:00.0: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Oct 13 06:52:48.912450 kernel: pci 0000:01:00.1: [15b3:1015] type 00 class 0x020000 PCIe Endpoint Oct 13 06:52:48.912500 kernel: pci 0000:01:00.1: BAR 0 [mem 0x90000000-0x91ffffff 64bit pref] Oct 13 06:52:48.912551 kernel: pci 0000:01:00.1: ROM [mem 0x95100000-0x951fffff pref] Oct 13 06:52:48.912601 kernel: pci 0000:01:00.1: PME# supported from D3cold Oct 13 06:52:48.912666 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x000fffff 64bit pref] Oct 13 06:52:48.912718 kernel: pci 0000:01:00.1: VF BAR 0 [mem 0x00000000-0x007fffff 64bit pref]: contains BAR 0 for 8 VFs Oct 13 06:52:48.912769 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 13 06:52:48.912820 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Oct 13 06:52:48.912875 kernel: pci 0000:03:00.0: working around ROM BAR overlap defect Oct 13 06:52:48.912925 kernel: pci 0000:03:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Oct 13 06:52:48.912976 kernel: pci 0000:03:00.0: BAR 0 [mem 0x95400000-0x9547ffff] Oct 13 06:52:48.913026 kernel: pci 0000:03:00.0: BAR 2 [io 0x5000-0x501f] Oct 13 06:52:48.913076 kernel: pci 0000:03:00.0: BAR 3 [mem 0x95480000-0x95483fff] Oct 13 06:52:48.913125 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.913178 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Oct 13 06:52:48.913232 kernel: pci 0000:04:00.0: working around ROM BAR overlap defect Oct 13 06:52:48.913283 kernel: pci 0000:04:00.0: [8086:1533] type 00 class 0x020000 PCIe Endpoint Oct 13 06:52:48.913333 kernel: pci 0000:04:00.0: BAR 0 [mem 0x95300000-0x9537ffff] Oct 13 06:52:48.913383 kernel: pci 0000:04:00.0: BAR 2 [io 0x4000-0x401f] Oct 13 06:52:48.913433 kernel: pci 0000:04:00.0: BAR 3 [mem 0x95380000-0x95383fff] Oct 13 06:52:48.913483 kernel: pci 0000:04:00.0: PME# supported from D0 D3hot D3cold Oct 13 06:52:48.913535 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Oct 13 06:52:48.913586 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Oct 13 06:52:48.913651 kernel: pci 
0000:06:00.0: [1a03:1150] type 01 class 0x060400 PCIe to PCI/PCI-X bridge Oct 13 06:52:48.913709 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Oct 13 06:52:48.913760 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Oct 13 06:52:48.913813 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:52:48.913863 kernel: pci 0000:06:00.0: enabling Extended Tags Oct 13 06:52:48.913913 kernel: pci 0000:06:00.0: supports D1 D2 Oct 13 06:52:48.913965 kernel: pci 0000:06:00.0: PME# supported from D0 D1 D2 D3hot D3cold Oct 13 06:52:48.914015 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Oct 13 06:52:48.914071 kernel: pci_bus 0000:07: extended config space not accessible Oct 13 06:52:48.914128 kernel: pci 0000:07:00.0: [1a03:2000] type 00 class 0x030000 conventional PCI endpoint Oct 13 06:52:48.914182 kernel: pci 0000:07:00.0: BAR 0 [mem 0x94000000-0x94ffffff] Oct 13 06:52:48.914233 kernel: pci 0000:07:00.0: BAR 1 [mem 0x95000000-0x9501ffff] Oct 13 06:52:48.914288 kernel: pci 0000:07:00.0: BAR 2 [io 0x3000-0x307f] Oct 13 06:52:48.914342 kernel: pci 0000:07:00.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Oct 13 06:52:48.914444 kernel: pci 0000:07:00.0: supports D1 D2 Oct 13 06:52:48.914496 kernel: pci 0000:07:00.0: PME# supported from D0 D1 D2 D3hot D3cold Oct 13 06:52:48.914546 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Oct 13 06:52:48.914555 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 0 Oct 13 06:52:48.914560 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 1 Oct 13 06:52:48.914566 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 0 Oct 13 06:52:48.914574 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 0 Oct 13 06:52:48.914579 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 0 Oct 13 06:52:48.914585 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 0 Oct 13 06:52:48.914590 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 0 Oct 13 06:52:48.914596 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 0 Oct 13 06:52:48.914601 kernel: iommu: Default domain type: Translated Oct 13 06:52:48.914607 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Oct 13 06:52:48.914612 kernel: PCI: Using ACPI for IRQ routing Oct 13 06:52:48.914618 kernel: PCI: pci_cache_line_size set to 64 bytes Oct 13 06:52:48.914623 kernel: e820: reserve RAM buffer [mem 0x00099800-0x0009ffff] Oct 13 06:52:48.914630 kernel: e820: reserve RAM buffer [mem 0x81b2d000-0x83ffffff] Oct 13 06:52:48.914635 kernel: e820: reserve RAM buffer [mem 0x8afcd000-0x8bffffff] Oct 13 06:52:48.914641 kernel: e820: reserve RAM buffer [mem 0x8c23b000-0x8fffffff] Oct 13 06:52:48.914646 kernel: e820: reserve RAM buffer [mem 0x8ef00000-0x8fffffff] Oct 13 06:52:48.914667 kernel: e820: reserve RAM buffer [mem 0x86f000000-0x86fffffff] Oct 13 06:52:48.914725 kernel: pci 0000:07:00.0: vgaarb: setting as boot VGA device Oct 13 06:52:48.914834 kernel: pci 0000:07:00.0: vgaarb: bridge control possible Oct 13 06:52:48.914925 kernel: pci 0000:07:00.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Oct 13 06:52:48.914936 kernel: vgaarb: loaded Oct 13 06:52:48.914941 kernel: clocksource: Switched to clocksource tsc-early Oct 13 06:52:48.914947 kernel: VFS: Disk quotas dquot_6.6.0 Oct 13 06:52:48.914953 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Oct 13 06:52:48.914958 kernel: pnp: PnP ACPI init Oct 13 06:52:48.915021 kernel: system 00:00: [mem 0x40000000-0x403fffff] has been reserved Oct 
13 06:52:48.915116 kernel: pnp 00:02: [dma 0 disabled] Oct 13 06:52:48.915219 kernel: pnp 00:03: [dma 0 disabled] Oct 13 06:52:48.915273 kernel: system 00:04: [io 0x0680-0x069f] has been reserved Oct 13 06:52:48.915318 kernel: system 00:04: [io 0x164e-0x164f] has been reserved Oct 13 06:52:48.915367 kernel: system 00:05: [mem 0xfed10000-0xfed17fff] has been reserved Oct 13 06:52:48.915412 kernel: system 00:05: [mem 0xfed18000-0xfed18fff] has been reserved Oct 13 06:52:48.915456 kernel: system 00:05: [mem 0xfed19000-0xfed19fff] has been reserved Oct 13 06:52:48.915500 kernel: system 00:05: [mem 0xe0000000-0xefffffff] has been reserved Oct 13 06:52:48.915546 kernel: system 00:05: [mem 0xfed20000-0xfed3ffff] has been reserved Oct 13 06:52:48.915590 kernel: system 00:05: [mem 0xfed90000-0xfed93fff] could not be reserved Oct 13 06:52:48.915634 kernel: system 00:05: [mem 0xfed45000-0xfed8ffff] has been reserved Oct 13 06:52:48.915698 kernel: system 00:05: [mem 0xfee00000-0xfeefffff] could not be reserved Oct 13 06:52:48.915746 kernel: system 00:06: [io 0x1800-0x18fe] could not be reserved Oct 13 06:52:48.915791 kernel: system 00:06: [mem 0xfd000000-0xfd69ffff] has been reserved Oct 13 06:52:48.915863 kernel: system 00:06: [mem 0xfd6c0000-0xfd6cffff] has been reserved Oct 13 06:52:48.915940 kernel: system 00:06: [mem 0xfd6f0000-0xfdffffff] has been reserved Oct 13 06:52:48.916037 kernel: system 00:06: [mem 0xfe000000-0xfe01ffff] could not be reserved Oct 13 06:52:48.916124 kernel: system 00:06: [mem 0xfe200000-0xfe7fffff] has been reserved Oct 13 06:52:48.916211 kernel: system 00:06: [mem 0xff000000-0xffffffff] has been reserved Oct 13 06:52:48.916290 kernel: system 00:07: [io 0x2000-0x20fe] has been reserved Oct 13 06:52:48.916300 kernel: pnp: PnP ACPI: found 9 devices Oct 13 06:52:48.916320 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Oct 13 06:52:48.916327 kernel: NET: Registered PF_INET protocol family Oct 13 06:52:48.916335 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Oct 13 06:52:48.916341 kernel: tcp_listen_portaddr_hash hash table entries: 16384 (order: 6, 262144 bytes, linear) Oct 13 06:52:48.916346 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Oct 13 06:52:48.916376 kernel: TCP established hash table entries: 262144 (order: 9, 2097152 bytes, linear) Oct 13 06:52:48.916382 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Oct 13 06:52:48.916404 kernel: TCP: Hash tables configured (established 262144 bind 65536) Oct 13 06:52:48.916428 kernel: UDP hash table entries: 16384 (order: 7, 524288 bytes, linear) Oct 13 06:52:48.916436 kernel: UDP-Lite hash table entries: 16384 (order: 7, 524288 bytes, linear) Oct 13 06:52:48.916443 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Oct 13 06:52:48.916449 kernel: NET: Registered PF_XDP protocol family Oct 13 06:52:48.916502 kernel: pci 0000:00:15.0: BAR 0 [mem 0x95515000-0x95515fff 64bit]: assigned Oct 13 06:52:48.916552 kernel: pci 0000:00:15.1: BAR 0 [mem 0x9551b000-0x9551bfff 64bit]: assigned Oct 13 06:52:48.916602 kernel: pci 0000:00:1e.0: BAR 0 [mem 0x9551c000-0x9551cfff 64bit]: assigned Oct 13 06:52:48.916653 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: can't assign; no space Oct 13 06:52:48.916720 kernel: pci 0000:01:00.0: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Oct 13 06:52:48.916774 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit 
pref]: can't assign; no space Oct 13 06:52:48.916824 kernel: pci 0000:01:00.1: VF BAR 0 [mem size 0x00800000 64bit pref]: failed to assign Oct 13 06:52:48.916874 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Oct 13 06:52:48.916992 kernel: pci 0000:00:01.0: bridge window [mem 0x95100000-0x952fffff] Oct 13 06:52:48.917111 kernel: pci 0000:00:01.0: bridge window [mem 0x90000000-0x93ffffff 64bit pref] Oct 13 06:52:48.917161 kernel: pci 0000:00:1b.0: PCI bridge to [bus 02] Oct 13 06:52:48.917211 kernel: pci 0000:00:1b.4: PCI bridge to [bus 03] Oct 13 06:52:48.917260 kernel: pci 0000:00:1b.4: bridge window [io 0x5000-0x5fff] Oct 13 06:52:48.917310 kernel: pci 0000:00:1b.4: bridge window [mem 0x95400000-0x954fffff] Oct 13 06:52:48.917359 kernel: pci 0000:00:1b.5: PCI bridge to [bus 04] Oct 13 06:52:48.917409 kernel: pci 0000:00:1b.5: bridge window [io 0x4000-0x4fff] Oct 13 06:52:48.917458 kernel: pci 0000:00:1b.5: bridge window [mem 0x95300000-0x953fffff] Oct 13 06:52:48.917506 kernel: pci 0000:00:1c.0: PCI bridge to [bus 05] Oct 13 06:52:48.917556 kernel: pci 0000:06:00.0: PCI bridge to [bus 07] Oct 13 06:52:48.917606 kernel: pci 0000:06:00.0: bridge window [io 0x3000-0x3fff] Oct 13 06:52:48.917660 kernel: pci 0000:06:00.0: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:52:48.917725 kernel: pci 0000:00:1c.3: PCI bridge to [bus 06-07] Oct 13 06:52:48.917774 kernel: pci 0000:00:1c.3: bridge window [io 0x3000-0x3fff] Oct 13 06:52:48.917823 kernel: pci 0000:00:1c.3: bridge window [mem 0x94000000-0x950fffff] Oct 13 06:52:48.917868 kernel: pci_bus 0000:00: Some PCI device resources are unassigned, try booting with pci=realloc Oct 13 06:52:48.917913 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Oct 13 06:52:48.917956 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Oct 13 06:52:48.917998 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Oct 13 06:52:48.918041 kernel: pci_bus 0000:00: resource 7 [mem 0x90000000-0xdfffffff window] Oct 13 06:52:48.918084 kernel: pci_bus 0000:00: resource 8 [mem 0xfc800000-0xfe7fffff window] Oct 13 06:52:48.918136 kernel: pci_bus 0000:01: resource 1 [mem 0x95100000-0x952fffff] Oct 13 06:52:48.918182 kernel: pci_bus 0000:01: resource 2 [mem 0x90000000-0x93ffffff 64bit pref] Oct 13 06:52:48.918231 kernel: pci_bus 0000:03: resource 0 [io 0x5000-0x5fff] Oct 13 06:52:48.918276 kernel: pci_bus 0000:03: resource 1 [mem 0x95400000-0x954fffff] Oct 13 06:52:48.918328 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Oct 13 06:52:48.918373 kernel: pci_bus 0000:04: resource 1 [mem 0x95300000-0x953fffff] Oct 13 06:52:48.918423 kernel: pci_bus 0000:06: resource 0 [io 0x3000-0x3fff] Oct 13 06:52:48.918469 kernel: pci_bus 0000:06: resource 1 [mem 0x94000000-0x950fffff] Oct 13 06:52:48.918517 kernel: pci_bus 0000:07: resource 0 [io 0x3000-0x3fff] Oct 13 06:52:48.918564 kernel: pci_bus 0000:07: resource 1 [mem 0x94000000-0x950fffff] Oct 13 06:52:48.918572 kernel: PCI: CLS 64 bytes, default 64 Oct 13 06:52:48.918578 kernel: DMAR: No ATSR found Oct 13 06:52:48.918584 kernel: DMAR: No SATC found Oct 13 06:52:48.918591 kernel: DMAR: dmar0: Using Queued invalidation Oct 13 06:52:48.918641 kernel: pci 0000:00:00.0: Adding to iommu group 0 Oct 13 06:52:48.918709 kernel: pci 0000:00:01.0: Adding to iommu group 1 Oct 13 06:52:48.918760 kernel: pci 0000:00:08.0: Adding to iommu group 2 Oct 13 06:52:48.918809 kernel: pci 0000:00:12.0: Adding to iommu group 3 Oct 13 06:52:48.918858 kernel: pci 0000:00:14.0: Adding to iommu group 4 Oct 13 
06:52:48.918906 kernel: pci 0000:00:14.2: Adding to iommu group 4 Oct 13 06:52:48.918954 kernel: pci 0000:00:15.0: Adding to iommu group 5 Oct 13 06:52:48.919003 kernel: pci 0000:00:15.1: Adding to iommu group 5 Oct 13 06:52:48.919054 kernel: pci 0000:00:16.0: Adding to iommu group 6 Oct 13 06:52:48.919102 kernel: pci 0000:00:16.1: Adding to iommu group 6 Oct 13 06:52:48.919150 kernel: pci 0000:00:16.4: Adding to iommu group 6 Oct 13 06:52:48.919199 kernel: pci 0000:00:17.0: Adding to iommu group 7 Oct 13 06:52:48.919248 kernel: pci 0000:00:1b.0: Adding to iommu group 8 Oct 13 06:52:48.919297 kernel: pci 0000:00:1b.4: Adding to iommu group 9 Oct 13 06:52:48.919346 kernel: pci 0000:00:1b.5: Adding to iommu group 10 Oct 13 06:52:48.919395 kernel: pci 0000:00:1c.0: Adding to iommu group 11 Oct 13 06:52:48.919446 kernel: pci 0000:00:1c.3: Adding to iommu group 12 Oct 13 06:52:48.919495 kernel: pci 0000:00:1e.0: Adding to iommu group 13 Oct 13 06:52:48.919544 kernel: pci 0000:00:1f.0: Adding to iommu group 14 Oct 13 06:52:48.919592 kernel: pci 0000:00:1f.4: Adding to iommu group 14 Oct 13 06:52:48.919640 kernel: pci 0000:00:1f.5: Adding to iommu group 14 Oct 13 06:52:48.919731 kernel: pci 0000:01:00.0: Adding to iommu group 1 Oct 13 06:52:48.919783 kernel: pci 0000:01:00.1: Adding to iommu group 1 Oct 13 06:52:48.919836 kernel: pci 0000:03:00.0: Adding to iommu group 15 Oct 13 06:52:48.919887 kernel: pci 0000:04:00.0: Adding to iommu group 16 Oct 13 06:52:48.919937 kernel: pci 0000:06:00.0: Adding to iommu group 17 Oct 13 06:52:48.919989 kernel: pci 0000:07:00.0: Adding to iommu group 17 Oct 13 06:52:48.919997 kernel: DMAR: Intel(R) Virtualization Technology for Directed I/O Oct 13 06:52:48.920003 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Oct 13 06:52:48.920009 kernel: software IO TLB: mapped [mem 0x0000000086fcd000-0x000000008afcd000] (64MB) Oct 13 06:52:48.920015 kernel: RAPL PMU: API unit is 2^-32 Joules, 3 fixed counters, 655360 ms ovfl timer Oct 13 06:52:48.920021 kernel: RAPL PMU: hw unit of domain pp0-core 2^-14 Joules Oct 13 06:52:48.920028 kernel: RAPL PMU: hw unit of domain package 2^-14 Joules Oct 13 06:52:48.920034 kernel: RAPL PMU: hw unit of domain dram 2^-14 Joules Oct 13 06:52:48.920085 kernel: platform rtc_cmos: registered platform RTC device (no PNP device found) Oct 13 06:52:48.920094 kernel: Initialise system trusted keyrings Oct 13 06:52:48.920100 kernel: workingset: timestamp_bits=39 max_order=23 bucket_order=0 Oct 13 06:52:48.920106 kernel: Key type asymmetric registered Oct 13 06:52:48.920111 kernel: Asymmetric key parser 'x509' registered Oct 13 06:52:48.920117 kernel: tsc: Refined TSC clocksource calibration: 3408.000 MHz Oct 13 06:52:48.920124 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Oct 13 06:52:48.920130 kernel: clocksource: Switched to clocksource tsc Oct 13 06:52:48.920135 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Oct 13 06:52:48.920141 kernel: io scheduler mq-deadline registered Oct 13 06:52:48.920146 kernel: io scheduler kyber registered Oct 13 06:52:48.920152 kernel: io scheduler bfq registered Oct 13 06:52:48.920200 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 121 Oct 13 06:52:48.920251 kernel: pcieport 0000:00:1b.0: PME: Signaling with IRQ 122 Oct 13 06:52:48.920300 kernel: pcieport 0000:00:1b.4: PME: Signaling with IRQ 123 Oct 13 06:52:48.920352 kernel: pcieport 0000:00:1b.5: PME: Signaling with IRQ 124 Oct 13 06:52:48.920402 
kernel: pcieport 0000:00:1c.0: PME: Signaling with IRQ 125 Oct 13 06:52:48.920451 kernel: pcieport 0000:00:1c.3: PME: Signaling with IRQ 126 Oct 13 06:52:48.920508 kernel: thermal LNXTHERM:00: registered as thermal_zone0 Oct 13 06:52:48.920517 kernel: ACPI: thermal: Thermal Zone [TZ00] (28 C) Oct 13 06:52:48.920523 kernel: ERST: Error Record Serialization Table (ERST) support is initialized. Oct 13 06:52:48.920528 kernel: pstore: Using crash dump compression: deflate Oct 13 06:52:48.920534 kernel: pstore: Registered erst as persistent store backend Oct 13 06:52:48.920541 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Oct 13 06:52:48.920547 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Oct 13 06:52:48.920552 kernel: 00:02: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Oct 13 06:52:48.920558 kernel: 00:03: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Oct 13 06:52:48.920563 kernel: hpet_acpi_add: no address or irqs in _CRS Oct 13 06:52:48.920612 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x1B, rev-id 16) Oct 13 06:52:48.920621 kernel: i8042: PNP: No PS/2 controller found. Oct 13 06:52:48.920682 kernel: rtc_cmos rtc_cmos: RTC can wake from S4 Oct 13 06:52:48.920732 kernel: rtc_cmos rtc_cmos: registered as rtc0 Oct 13 06:52:48.920777 kernel: rtc_cmos rtc_cmos: setting system clock to 2025-10-13T06:52:47 UTC (1760338367) Oct 13 06:52:48.920822 kernel: rtc_cmos rtc_cmos: alarms up to one month, y3k, 114 bytes nvram Oct 13 06:52:48.920831 kernel: intel_pstate: Intel P-state driver initializing Oct 13 06:52:48.920837 kernel: intel_pstate: Disabling energy efficiency optimization Oct 13 06:52:48.920842 kernel: intel_pstate: HWP enabled Oct 13 06:52:48.920848 kernel: NET: Registered PF_INET6 protocol family Oct 13 06:52:48.920853 kernel: Segment Routing with IPv6 Oct 13 06:52:48.920861 kernel: In-situ OAM (IOAM) with IPv6 Oct 13 06:52:48.920866 kernel: NET: Registered PF_PACKET protocol family Oct 13 06:52:48.920872 kernel: Key type dns_resolver registered Oct 13 06:52:48.920877 kernel: ENERGY_PERF_BIAS: Set to 'normal', was 'performance' Oct 13 06:52:48.920883 kernel: microcode: Current revision: 0x000000f4 Oct 13 06:52:48.920888 kernel: IPI shorthand broadcast: enabled Oct 13 06:52:48.920894 kernel: sched_clock: Marking stable (3623278552, 1494280145)->(6725425965, -1607867268) Oct 13 06:52:48.920899 kernel: registered taskstats version 1 Oct 13 06:52:48.920905 kernel: Loading compiled-in X.509 certificates Oct 13 06:52:48.920910 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.51-flatcar: d8dbf4abead15098249886d373d42a3af4f50ccd' Oct 13 06:52:48.920917 kernel: Demotion targets for Node 0: null Oct 13 06:52:48.920923 kernel: Key type .fscrypt registered Oct 13 06:52:48.920928 kernel: Key type fscrypt-provisioning registered Oct 13 06:52:48.920934 kernel: ima: Allocated hash algorithm: sha1 Oct 13 06:52:48.920939 kernel: ima: No architecture policies found Oct 13 06:52:48.920945 kernel: clk: Disabling unused clocks Oct 13 06:52:48.920950 kernel: Warning: unable to open an initial console. 
Oct 13 06:52:48.920956 kernel: Freeing unused kernel image (initmem) memory: 54096K Oct 13 06:52:48.920962 kernel: Write protecting the kernel read-only data: 24576k Oct 13 06:52:48.920968 kernel: Freeing unused kernel image (rodata/data gap) memory: 240K Oct 13 06:52:48.920973 kernel: Run /init as init process Oct 13 06:52:48.920979 kernel: with arguments: Oct 13 06:52:48.920985 kernel: /init Oct 13 06:52:48.920990 kernel: with environment: Oct 13 06:52:48.920995 kernel: HOME=/ Oct 13 06:52:48.921001 kernel: TERM=linux Oct 13 06:52:48.921006 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Oct 13 06:52:48.921013 systemd[1]: Successfully made /usr/ read-only. Oct 13 06:52:48.921021 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 13 06:52:48.921027 systemd[1]: Detected architecture x86-64. Oct 13 06:52:48.921033 systemd[1]: Running in initrd. Oct 13 06:52:48.921038 systemd[1]: No hostname configured, using default hostname. Oct 13 06:52:48.921044 systemd[1]: Hostname set to . Oct 13 06:52:48.921050 systemd[1]: Initializing machine ID from random generator. Oct 13 06:52:48.921057 systemd[1]: Queued start job for default target initrd.target. Oct 13 06:52:48.921063 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 13 06:52:48.921069 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 13 06:52:48.921075 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Oct 13 06:52:48.921081 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 13 06:52:48.921087 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Oct 13 06:52:48.921093 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Oct 13 06:52:48.921101 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Oct 13 06:52:48.921107 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Oct 13 06:52:48.921113 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 13 06:52:48.921119 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 13 06:52:48.921125 systemd[1]: Reached target paths.target - Path Units. Oct 13 06:52:48.921130 systemd[1]: Reached target slices.target - Slice Units. Oct 13 06:52:48.921136 systemd[1]: Reached target swap.target - Swaps. Oct 13 06:52:48.921142 systemd[1]: Reached target timers.target - Timer Units. Oct 13 06:52:48.921149 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Oct 13 06:52:48.921155 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 13 06:52:48.921161 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Oct 13 06:52:48.921166 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Oct 13 06:52:48.921172 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Oct 13 06:52:48.921178 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 13 06:52:48.921184 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 13 06:52:48.921190 systemd[1]: Reached target sockets.target - Socket Units. Oct 13 06:52:48.921196 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Oct 13 06:52:48.921203 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 13 06:52:48.921208 systemd[1]: Finished network-cleanup.service - Network Cleanup. Oct 13 06:52:48.921215 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Oct 13 06:52:48.921221 systemd[1]: Starting systemd-fsck-usr.service... Oct 13 06:52:48.921226 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 13 06:52:48.921232 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 13 06:52:48.921249 systemd-journald[300]: Collecting audit messages is disabled. Oct 13 06:52:48.921265 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 13 06:52:48.921272 systemd-journald[300]: Journal started Oct 13 06:52:48.921285 systemd-journald[300]: Runtime Journal (/run/log/journal/45dc3bfccc7e4f8aab24e5e2bcf2b0c7) is 8M, max 640.1M, 632.1M free. Oct 13 06:52:48.915185 systemd-modules-load[302]: Inserted module 'overlay' Oct 13 06:52:48.952247 systemd[1]: Started systemd-journald.service - Journal Service. Oct 13 06:52:48.952259 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Oct 13 06:52:48.960028 systemd-modules-load[302]: Inserted module 'br_netfilter' Oct 13 06:52:48.966752 kernel: Bridge firewalling registered Oct 13 06:52:48.966569 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Oct 13 06:52:48.966871 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 13 06:52:48.966972 systemd[1]: Finished systemd-fsck-usr.service. Oct 13 06:52:48.967053 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 13 06:52:48.968045 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 13 06:52:48.968469 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 13 06:52:48.968900 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 13 06:52:48.995650 systemd-tmpfiles[315]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Oct 13 06:52:48.996170 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 13 06:52:49.115691 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:52:49.137380 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 13 06:52:49.155287 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 13 06:52:49.178562 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Oct 13 06:52:49.194509 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 13 06:52:49.226261 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Oct 13 06:52:49.231325 systemd-resolved[328]: Positive Trust Anchors: Oct 13 06:52:49.231332 systemd-resolved[328]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 13 06:52:49.231354 systemd-resolved[328]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 13 06:52:49.231893 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 13 06:52:49.232966 systemd-resolved[328]: Defaulting to hostname 'linux'. Oct 13 06:52:49.233487 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 13 06:52:49.261010 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 13 06:52:49.270894 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 13 06:52:49.301021 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Oct 13 06:52:49.365450 dracut-cmdline[343]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty0 console=ttyS1,115200n8 flatcar.first_boot=detected flatcar.oem.id=packet flatcar.autologin verity.usrhash=a48d469b0deb49c328e6faf6cf366b11952d47f2d24963c866a0ea8221fb0039 Oct 13 06:52:49.572678 kernel: SCSI subsystem initialized Oct 13 06:52:49.586694 kernel: Loading iSCSI transport class v2.0-870. Oct 13 06:52:49.598730 kernel: iscsi: registered transport (tcp) Oct 13 06:52:49.621129 kernel: iscsi: registered transport (qla4xxx) Oct 13 06:52:49.621147 kernel: QLogic iSCSI HBA Driver Oct 13 06:52:49.631279 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 13 06:52:49.670526 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 13 06:52:49.682292 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 13 06:52:49.795741 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Oct 13 06:52:49.808455 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Oct 13 06:52:49.926690 kernel: raid6: avx2x4 gen() 16836 MB/s Oct 13 06:52:49.947697 kernel: raid6: avx2x2 gen() 38673 MB/s Oct 13 06:52:49.973733 kernel: raid6: avx2x1 gen() 46173 MB/s Oct 13 06:52:49.973751 kernel: raid6: using algorithm avx2x1 gen() 46173 MB/s Oct 13 06:52:50.000825 kernel: raid6: .... xor() 24965 MB/s, rmw enabled Oct 13 06:52:50.000841 kernel: raid6: using avx2x2 recovery algorithm Oct 13 06:52:50.020703 kernel: xor: automatically using best checksumming function avx Oct 13 06:52:50.122692 kernel: Btrfs loaded, zoned=no, fsverity=no Oct 13 06:52:50.126044 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Oct 13 06:52:50.135755 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Oct 13 06:52:50.182392 systemd-udevd[553]: Using default interface naming scheme 'v255'. Oct 13 06:52:50.185634 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 13 06:52:50.212613 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Oct 13 06:52:50.261492 dracut-pre-trigger[565]: rd.md=0: removing MD RAID activation Oct 13 06:52:50.279147 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Oct 13 06:52:50.289734 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 13 06:52:50.424457 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 13 06:52:50.460164 kernel: cryptd: max_cpu_qlen set to 1000 Oct 13 06:52:50.460206 kernel: pps_core: LinuxPPS API ver. 1 registered Oct 13 06:52:50.460238 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Oct 13 06:52:50.427870 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Oct 13 06:52:50.462471 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 13 06:52:50.513744 kernel: AES CTR mode by8 optimization enabled Oct 13 06:52:50.513762 kernel: libata version 3.00 loaded. Oct 13 06:52:50.513770 kernel: PTP clock support registered Oct 13 06:52:50.513781 kernel: ACPI: bus type USB registered Oct 13 06:52:50.513789 kernel: usbcore: registered new interface driver usbfs Oct 13 06:52:50.513796 kernel: usbcore: registered new interface driver hub Oct 13 06:52:50.513803 kernel: usbcore: registered new device driver usb Oct 13 06:52:50.513809 kernel: ahci 0000:00:17.0: version 3.0 Oct 13 06:52:50.462599 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:52:50.557342 kernel: ahci 0000:00:17.0: AHCI vers 0001.0301, 32 command slots, 6 Gbps, SATA mode Oct 13 06:52:50.557477 kernel: ahci 0000:00:17.0: 7/7 ports implemented (port mask 0x7f) Oct 13 06:52:50.557546 kernel: ahci 0000:00:17.0: flags: 64bit ncq sntf clo only pio slum part ems deso sadm sds apst Oct 13 06:52:50.557610 kernel: igb: Intel(R) Gigabit Ethernet Network Driver Oct 13 06:52:50.557619 kernel: igb: Copyright (c) 2007-2014 Intel Corporation. Oct 13 06:52:50.557626 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Oct 13 06:52:50.516313 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Oct 13 06:52:50.789830 kernel: scsi host0: ahci Oct 13 06:52:50.789929 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 1 Oct 13 06:52:50.790004 kernel: scsi host1: ahci Oct 13 06:52:50.790071 kernel: xhci_hcd 0000:00:14.0: hcc params 0x200077c1 hci version 0x110 quirks 0x0000000000009810 Oct 13 06:52:50.790135 kernel: scsi host2: ahci Oct 13 06:52:50.790197 kernel: xhci_hcd 0000:00:14.0: xHCI Host Controller Oct 13 06:52:50.790262 kernel: igb 0000:03:00.0: added PHC on eth0 Oct 13 06:52:50.790333 kernel: igb 0000:03:00.0: Intel(R) Gigabit Ethernet Network Connection Oct 13 06:52:50.790397 kernel: scsi host3: ahci Oct 13 06:52:50.790458 kernel: scsi host4: ahci Oct 13 06:52:50.790516 kernel: scsi host5: ahci Oct 13 06:52:50.790574 kernel: scsi host6: ahci Oct 13 06:52:50.790634 kernel: ata1: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516100 irq 127 lpm-pol 0 Oct 13 06:52:50.790644 kernel: ata2: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516180 irq 127 lpm-pol 0 Oct 13 06:52:50.790652 kernel: ata3: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516200 irq 127 lpm-pol 0 Oct 13 06:52:50.790666 kernel: ata4: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516280 irq 127 lpm-pol 0 Oct 13 06:52:50.790674 kernel: ata5: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516300 irq 127 lpm-pol 0 Oct 13 06:52:50.790681 kernel: ata6: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516380 irq 127 lpm-pol 0 Oct 13 06:52:50.790688 kernel: ata7: SATA max UDMA/133 abar m2048@0x95516000 port 0x95516400 irq 127 lpm-pol 0 Oct 13 06:52:50.790695 kernel: xhci_hcd 0000:00:14.0: new USB bus registered, assigned bus number 2 Oct 13 06:52:50.790759 kernel: igb 0000:03:00.0: eth0: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6a:32:40 Oct 13 06:52:50.790825 kernel: xhci_hcd 0000:00:14.0: Host supports USB 3.1 Enhanced SuperSpeed Oct 13 06:52:50.790887 kernel: igb 0000:03:00.0: eth0: PBA No: 010000-000 Oct 13 06:52:50.790950 kernel: hub 1-0:1.0: USB hub found Oct 13 06:52:50.791087 kernel: igb 0000:03:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Oct 13 06:52:50.791150 kernel: hub 1-0:1.0: 16 ports detected Oct 13 06:52:50.791210 kernel: igb 0000:04:00.0: added PHC on eth1 Oct 13 06:52:50.791275 kernel: hub 2-0:1.0: USB hub found Oct 13 06:52:50.791342 kernel: igb 0000:04:00.0: Intel(R) Gigabit Ethernet Network Connection Oct 13 06:52:50.791405 kernel: hub 2-0:1.0: 10 ports detected Oct 13 06:52:50.791464 kernel: igb 0000:04:00.0: eth1: (PCIe:2.5Gb/s:Width x1) 3c:ec:ef:6a:32:41 Oct 13 06:52:50.791527 kernel: igb 0000:04:00.0: eth1: PBA No: 010000-000 Oct 13 06:52:50.791589 kernel: igb 0000:04:00.0: Using MSI-X interrupts. 4 rx queue(s), 4 tx queue(s) Oct 13 06:52:50.781648 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 13 06:52:50.789945 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Oct 13 06:52:50.909474 kernel: ata1: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Oct 13 06:52:50.909495 kernel: ata1.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Oct 13 06:52:50.917055 kernel: ata1.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Oct 13 06:52:50.917068 kernel: usb 1-14: new high-speed USB device number 2 using xhci_hcd Oct 13 06:52:50.917689 kernel: ata4: SATA link down (SStatus 0 SControl 300) Oct 13 06:52:50.930659 kernel: ata7: SATA link down (SStatus 0 SControl 300) Oct 13 06:52:50.936683 kernel: ata5: SATA link down (SStatus 0 SControl 300) Oct 13 06:52:50.941659 kernel: ata6: SATA link down (SStatus 0 SControl 300) Oct 13 06:52:50.947665 kernel: ata3: SATA link down (SStatus 0 SControl 300) Oct 13 06:52:50.953660 kernel: ata2: SATA link up 6.0 Gbps (SStatus 133 SControl 300) Oct 13 06:52:50.968973 kernel: ata2.00: Model 'Micron_5300_MTFDDAK480TDT', rev ' D3MU001', applying quirks: zeroaftertrim Oct 13 06:52:50.976502 kernel: ata2.00: ATA-11: Micron_5300_MTFDDAK480TDT, D3MU001, max UDMA/133 Oct 13 06:52:50.986722 kernel: ata1.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Oct 13 06:52:50.986739 kernel: mlx5_core 0000:01:00.0: PTM is not supported by PCIe Oct 13 06:52:50.998811 kernel: ata2.00: 937703088 sectors, multi 16: LBA48 NCQ (depth 32), AA Oct 13 06:52:50.998827 kernel: mlx5_core 0000:01:00.0: firmware version: 14.27.1016 Oct 13 06:52:51.008709 kernel: ata1.00: Features: NCQ-prio Oct 13 06:52:51.008724 kernel: mlx5_core 0000:01:00.0: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Oct 13 06:52:51.017673 kernel: ata2.00: Features: NCQ-prio Oct 13 06:52:51.034663 kernel: ata1.00: configured for UDMA/133 Oct 13 06:52:51.034683 kernel: scsi 0:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Oct 13 06:52:51.036663 kernel: ata2.00: configured for UDMA/133 Oct 13 06:52:51.036680 kernel: scsi 1:0:0:0: Direct-Access ATA Micron_5300_MTFD U001 PQ: 0 ANSI: 5 Oct 13 06:52:51.037662 kernel: igb 0000:04:00.0 eno2: renamed from eth1 Oct 13 06:52:51.037762 kernel: igb 0000:03:00.0 eno1: renamed from eth0 Oct 13 06:52:51.041664 kernel: ata2.00: Enabling discard_zeroes_data Oct 13 06:52:51.041682 kernel: ata1.00: Enabling discard_zeroes_data Oct 13 06:52:51.041692 kernel: sd 1:0:0:0: [sda] 937703088 512-byte logical blocks: (480 GB/447 GiB) Oct 13 06:52:51.041772 kernel: sd 0:0:0:0: [sdb] 937703088 512-byte logical blocks: (480 GB/447 GiB) Oct 13 06:52:51.041849 kernel: sd 1:0:0:0: [sda] 4096-byte physical blocks Oct 13 06:52:51.041922 kernel: sd 0:0:0:0: [sdb] 4096-byte physical blocks Oct 13 06:52:51.041982 kernel: sd 1:0:0:0: [sda] Write Protect is off Oct 13 06:52:51.042046 kernel: sd 0:0:0:0: [sdb] Write Protect is off Oct 13 06:52:51.042106 kernel: sd 1:0:0:0: [sda] Mode Sense: 00 3a 00 00 Oct 13 06:52:51.042164 kernel: sd 0:0:0:0: [sdb] Mode Sense: 00 3a 00 00 Oct 13 06:52:51.042222 kernel: sd 1:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Oct 13 06:52:51.042281 kernel: sd 0:0:0:0: [sdb] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Oct 13 06:52:51.042338 kernel: sd 1:0:0:0: [sda] Preferred minimum I/O size 4096 bytes Oct 13 06:52:51.042396 kernel: sd 0:0:0:0: [sdb] Preferred minimum I/O size 4096 bytes Oct 13 06:52:51.042456 kernel: ata2.00: Enabling discard_zeroes_data Oct 13 06:52:51.042464 kernel: ata1.00: Enabling discard_zeroes_data Oct 13 06:52:51.054712 kernel: sd 1:0:0:0: [sda] Attached SCSI disk Oct 13 
06:52:51.054839 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Oct 13 06:52:51.054854 kernel: GPT:9289727 != 937703087 Oct 13 06:52:51.054866 kernel: GPT:Alternate GPT header not at the end of the disk. Oct 13 06:52:51.054878 kernel: GPT:9289727 != 937703087 Oct 13 06:52:51.054889 kernel: GPT: Use GNU Parted to correct GPT errors. Oct 13 06:52:51.054901 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Oct 13 06:52:51.054916 kernel: sd 0:0:0:0: [sdb] Attached SCSI disk Oct 13 06:52:51.063661 kernel: hub 1-14:1.0: USB hub found Oct 13 06:52:51.113056 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Micron_5300_MTFDDAK480TDT ROOT. Oct 13 06:52:51.223575 kernel: hub 1-14:1.0: 4 ports detected Oct 13 06:52:51.224380 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Micron_5300_MTFDDAK480TDT EFI-SYSTEM. Oct 13 06:52:51.251943 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:52:51.297798 kernel: mlx5_core 0000:01:00.0: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Oct 13 06:52:51.297889 kernel: mlx5_core 0000:01:00.0: Port module event: module 0, Cable plugged Oct 13 06:52:51.287547 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Micron_5300_MTFDDAK480TDT USR-A. Oct 13 06:52:51.308849 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Micron_5300_MTFDDAK480TDT USR-A. Oct 13 06:52:51.325676 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Oct 13 06:52:51.344293 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Oct 13 06:52:51.402852 disk-uuid[760]: Primary Header is updated. Oct 13 06:52:51.402852 disk-uuid[760]: Secondary Entries is updated. Oct 13 06:52:51.402852 disk-uuid[760]: Secondary Header is updated. 
Oct 13 06:52:51.430737 kernel: ata1.00: Enabling discard_zeroes_data Oct 13 06:52:51.430783 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Oct 13 06:52:51.498724 kernel: mlx5_core 0000:01:00.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Oct 13 06:52:51.512778 kernel: mlx5_core 0000:01:00.1: PTM is not supported by PCIe Oct 13 06:52:51.512871 kernel: mlx5_core 0000:01:00.1: firmware version: 14.27.1016 Oct 13 06:52:51.512940 kernel: usb 1-14.1: new low-speed USB device number 3 using xhci_hcd Oct 13 06:52:51.512958 kernel: mlx5_core 0000:01:00.1: 63.008 Gb/s available PCIe bandwidth (8.0 GT/s PCIe x8 link) Oct 13 06:52:51.623697 kernel: hid: raw HID events driver (C) Jiri Kosina Oct 13 06:52:51.636148 kernel: usbcore: registered new interface driver usbhid Oct 13 06:52:51.636166 kernel: usbhid: USB HID core driver Oct 13 06:52:51.650733 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.0/0003:0557:2419.0001/input/input0 Oct 13 06:52:51.724628 kernel: hid-generic 0003:0557:2419.0001: input,hidraw0: USB HID v1.00 Keyboard [HID 0557:2419] on usb-0000:00:14.0-14.1/input0 Oct 13 06:52:51.724728 kernel: input: HID 0557:2419 as /devices/pci0000:00/0000:00:14.0/usb1/1-14/1-14.1/1-14.1:1.1/0003:0557:2419.0002/input/input1 Oct 13 06:52:51.736524 kernel: hid-generic 0003:0557:2419.0002: input,hidraw1: USB HID v1.00 Mouse [HID 0557:2419] on usb-0000:00:14.0-14.1/input1 Oct 13 06:52:51.812661 kernel: mlx5_core 0000:01:00.1: E-Switch: Total vports 10, per vport: max uc(1024) max mc(16384) Oct 13 06:52:51.824612 kernel: mlx5_core 0000:01:00.1: Port module event: module 1, Cable plugged Oct 13 06:52:52.070700 kernel: mlx5_core 0000:01:00.1: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0 basic) Oct 13 06:52:52.079714 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: renamed from eth0 Oct 13 06:52:52.079832 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: renamed from eth1 Oct 13 06:52:52.096715 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Oct 13 06:52:52.106218 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Oct 13 06:52:52.114836 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 13 06:52:52.134918 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 13 06:52:52.165096 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Oct 13 06:52:52.206207 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Oct 13 06:52:52.434632 kernel: ata1.00: Enabling discard_zeroes_data Oct 13 06:52:52.453047 disk-uuid[761]: The operation has completed successfully. Oct 13 06:52:52.460739 kernel: sdb: sdb1 sdb2 sdb3 sdb4 sdb6 sdb7 sdb9 Oct 13 06:52:52.484862 systemd[1]: disk-uuid.service: Deactivated successfully. Oct 13 06:52:52.484925 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Oct 13 06:52:52.524810 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Oct 13 06:52:52.553738 sh[804]: Success Oct 13 06:52:52.581487 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Oct 13 06:52:52.581506 kernel: device-mapper: uevent: version 1.0.3 Oct 13 06:52:52.590727 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Oct 13 06:52:52.602660 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Oct 13 06:52:52.648360 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Oct 13 06:52:52.659033 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Oct 13 06:52:52.688528 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Oct 13 06:52:52.735756 kernel: BTRFS: device fsid c8746500-26f5-4ec1-9da8-aef51ec7db92 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (817) Oct 13 06:52:52.735773 kernel: BTRFS info (device dm-0): first mount of filesystem c8746500-26f5-4ec1-9da8-aef51ec7db92 Oct 13 06:52:52.735780 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:52:52.752685 kernel: BTRFS info (device dm-0): enabling ssd optimizations Oct 13 06:52:52.752705 kernel: BTRFS info (device dm-0): disabling log replay at mount time Oct 13 06:52:52.758814 kernel: BTRFS info (device dm-0): enabling free space tree Oct 13 06:52:52.761030 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Oct 13 06:52:52.769041 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Oct 13 06:52:52.793858 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Oct 13 06:52:52.794344 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Oct 13 06:52:52.810427 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Oct 13 06:52:52.878791 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sdb6 (8:22) scanned by mount (841) Oct 13 06:52:52.878817 kernel: BTRFS info (device sdb6): first mount of filesystem 1cd10441-4b32-40b7-b370-b928e4bc90dd Oct 13 06:52:52.886872 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:52:52.898961 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 13 06:52:52.938889 kernel: BTRFS info (device sdb6): enabling ssd optimizations Oct 13 06:52:52.938904 kernel: BTRFS info (device sdb6): turning on async discard Oct 13 06:52:52.938913 kernel: BTRFS info (device sdb6): enabling free space tree Oct 13 06:52:52.938921 kernel: BTRFS info (device sdb6): last unmount of filesystem 1cd10441-4b32-40b7-b370-b928e4bc90dd Oct 13 06:52:52.928888 systemd[1]: Finished ignition-setup.service - Ignition (setup). Oct 13 06:52:52.949814 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Oct 13 06:52:52.981571 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 13 06:52:53.027488 systemd-networkd[988]: lo: Link UP Oct 13 06:52:53.027492 systemd-networkd[988]: lo: Gained carrier Oct 13 06:52:53.030048 systemd-networkd[988]: Enumeration completed Oct 13 06:52:53.030135 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 13 06:52:53.030585 systemd-networkd[988]: eno1: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 13 06:52:53.033858 systemd[1]: Reached target network.target - Network. Oct 13 06:52:53.072523 ignition[987]: Ignition 2.22.0 Oct 13 06:52:53.057839 systemd-networkd[988]: eno2: Configuring with /usr/lib/systemd/network/zz-default.network. 
Oct 13 06:52:53.072528 ignition[987]: Stage: fetch-offline Oct 13 06:52:53.075123 unknown[987]: fetched base config from "system" Oct 13 06:52:53.072547 ignition[987]: no configs at "/usr/lib/ignition/base.d" Oct 13 06:52:53.075127 unknown[987]: fetched user config from "system" Oct 13 06:52:53.072552 ignition[987]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:52:53.076416 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Oct 13 06:52:53.072607 ignition[987]: parsed url from cmdline: "" Oct 13 06:52:53.085581 systemd-networkd[988]: enp1s0f0np0: Configuring with /usr/lib/systemd/network/zz-default.network. Oct 13 06:52:53.072608 ignition[987]: no config URL provided Oct 13 06:52:53.097004 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Oct 13 06:52:53.072611 ignition[987]: reading system config file "/usr/lib/ignition/user.ign" Oct 13 06:52:53.097539 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Oct 13 06:52:53.072636 ignition[987]: parsing config with SHA512: db18773bc00482fa73f76108f08f782c35afe229bfcf131e9b95b7fd949e7ec42699ecd1b9d05bfd1bfc6e52054d8f75958644356370127e9bfc989c60f7d5ef Oct 13 06:52:53.075319 ignition[987]: fetch-offline: fetch-offline passed Oct 13 06:52:53.075322 ignition[987]: POST message to Packet Timeline Oct 13 06:52:53.075325 ignition[987]: POST Status error: resource requires networking Oct 13 06:52:53.075358 ignition[987]: Ignition finished successfully Oct 13 06:52:53.248877 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Oct 13 06:52:53.159618 ignition[1004]: Ignition 2.22.0 Oct 13 06:52:53.159622 ignition[1004]: Stage: kargs Oct 13 06:52:53.249396 systemd-networkd[988]: enp1s0f1np1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Oct 13 06:52:53.159704 ignition[1004]: no configs at "/usr/lib/ignition/base.d" Oct 13 06:52:53.159709 ignition[1004]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:52:53.160112 ignition[1004]: kargs: kargs passed Oct 13 06:52:53.160114 ignition[1004]: POST message to Packet Timeline Oct 13 06:52:53.160122 ignition[1004]: GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:52:53.160443 ignition[1004]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:45941->[::1]:53: read: connection refused Oct 13 06:52:53.361323 ignition[1004]: GET https://metadata.packet.net/metadata: attempt #2 Oct 13 06:52:53.362360 ignition[1004]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:49120->[::1]:53: read: connection refused Oct 13 06:52:53.429774 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Oct 13 06:52:53.430792 systemd-networkd[988]: eno1: Link UP Oct 13 06:52:53.430949 systemd-networkd[988]: eno2: Link UP Oct 13 06:52:53.431121 systemd-networkd[988]: enp1s0f0np0: Link UP Oct 13 06:52:53.431312 systemd-networkd[988]: enp1s0f0np0: Gained carrier Oct 13 06:52:53.457340 systemd-networkd[988]: enp1s0f1np1: Link UP Oct 13 06:52:53.458612 systemd-networkd[988]: enp1s0f1np1: Gained carrier Oct 13 06:52:53.487023 systemd-networkd[988]: enp1s0f0np0: DHCPv4 address 139.178.94.25/31, gateway 139.178.94.24 acquired from 145.40.83.140 Oct 13 06:52:53.762610 ignition[1004]: GET https://metadata.packet.net/metadata: attempt #3 Oct 13 06:52:53.763877 ignition[1004]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:54607->[::1]:53: read: connection refused Oct 13 06:52:54.464214 systemd-networkd[988]: enp1s0f0np0: Gained IPv6LL Oct 13 06:52:54.563944 ignition[1004]: GET https://metadata.packet.net/metadata: attempt #4 Oct 13 06:52:54.564346 ignition[1004]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:47613->[::1]:53: read: connection refused Oct 13 06:52:55.232155 systemd-networkd[988]: enp1s0f1np1: Gained IPv6LL Oct 13 06:52:56.165815 ignition[1004]: GET https://metadata.packet.net/metadata: attempt #5 Oct 13 06:52:56.166900 ignition[1004]: GET error: Get "https://metadata.packet.net/metadata": dial tcp: lookup metadata.packet.net on [::1]:53: read udp [::1]:34920->[::1]:53: read: connection refused Oct 13 06:52:59.367519 ignition[1004]: GET https://metadata.packet.net/metadata: attempt #6 Oct 13 06:53:00.411018 ignition[1004]: GET result: OK Oct 13 06:53:00.908974 ignition[1004]: Ignition finished successfully Oct 13 06:53:00.914618 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Oct 13 06:53:00.927601 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Oct 13 06:53:00.973073 ignition[1023]: Ignition 2.22.0 Oct 13 06:53:00.973078 ignition[1023]: Stage: disks Oct 13 06:53:00.973164 ignition[1023]: no configs at "/usr/lib/ignition/base.d" Oct 13 06:53:00.973169 ignition[1023]: no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:53:00.973617 ignition[1023]: disks: disks passed Oct 13 06:53:00.973619 ignition[1023]: POST message to Packet Timeline Oct 13 06:53:00.973627 ignition[1023]: GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:53:02.004717 ignition[1023]: GET result: OK Oct 13 06:53:02.440517 ignition[1023]: Ignition finished successfully Oct 13 06:53:02.445649 systemd[1]: Finished ignition-disks.service - Ignition (disks). Oct 13 06:53:02.457877 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Oct 13 06:53:02.475926 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Oct 13 06:53:02.496965 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 13 06:53:02.516964 systemd[1]: Reached target sysinit.target - System Initialization. Oct 13 06:53:02.534958 systemd[1]: Reached target basic.target - Basic System. Oct 13 06:53:02.554467 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Oct 13 06:53:02.605206 systemd-fsck[1044]: ROOT: clean, 15/553520 files, 52789/553472 blocks Oct 13 06:53:02.614031 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Oct 13 06:53:02.629086 systemd[1]: Mounting sysroot.mount - /sysroot... Oct 13 06:53:02.735660 kernel: EXT4-fs (sdb9): mounted filesystem 8b520359-9763-45f3-b7f7-db1e9fbc640d r/w with ordered data mode. Quota mode: none. Oct 13 06:53:02.736056 systemd[1]: Mounted sysroot.mount - /sysroot. Oct 13 06:53:02.744065 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Oct 13 06:53:02.769158 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 13 06:53:02.777541 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Oct 13 06:53:02.793246 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Oct 13 06:53:02.798133 systemd[1]: Starting flatcar-static-network.service - Flatcar Static Network Agent... Oct 13 06:53:02.871863 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1053) Oct 13 06:53:02.871880 kernel: BTRFS info (device sdb6): first mount of filesystem 1cd10441-4b32-40b7-b370-b928e4bc90dd Oct 13 06:53:02.871892 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:53:02.871899 kernel: BTRFS info (device sdb6): enabling ssd optimizations Oct 13 06:53:02.871906 kernel: BTRFS info (device sdb6): turning on async discard Oct 13 06:53:02.871913 kernel: BTRFS info (device sdb6): enabling free space tree Oct 13 06:53:02.861114 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Oct 13 06:53:02.861139 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Oct 13 06:53:02.891817 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Oct 13 06:53:02.939886 coreos-metadata[1055]: Oct 13 06:53:02.924 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:53:02.914873 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Oct 13 06:53:02.967740 coreos-metadata[1056]: Oct 13 06:53:02.923 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:53:02.933807 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Oct 13 06:53:02.997514 initrd-setup-root[1085]: cut: /sysroot/etc/passwd: No such file or directory Oct 13 06:53:03.006809 initrd-setup-root[1092]: cut: /sysroot/etc/group: No such file or directory Oct 13 06:53:03.015900 initrd-setup-root[1099]: cut: /sysroot/etc/shadow: No such file or directory Oct 13 06:53:03.024735 initrd-setup-root[1106]: cut: /sysroot/etc/gshadow: No such file or directory Oct 13 06:53:03.065302 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Oct 13 06:53:03.074793 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Oct 13 06:53:03.083548 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Oct 13 06:53:03.117261 systemd[1]: sysroot-oem.mount: Deactivated successfully. Oct 13 06:53:03.132709 kernel: BTRFS info (device sdb6): last unmount of filesystem 1cd10441-4b32-40b7-b370-b928e4bc90dd Oct 13 06:53:03.134119 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Oct 13 06:53:03.141568 ignition[1174]: INFO : Ignition 2.22.0 Oct 13 06:53:03.141568 ignition[1174]: INFO : Stage: mount Oct 13 06:53:03.162839 ignition[1174]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 13 06:53:03.162839 ignition[1174]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:53:03.162839 ignition[1174]: INFO : mount: mount passed Oct 13 06:53:03.162839 ignition[1174]: INFO : POST message to Packet Timeline Oct 13 06:53:03.162839 ignition[1174]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:53:03.915032 coreos-metadata[1056]: Oct 13 06:53:03.914 INFO Fetch successful Oct 13 06:53:03.998828 systemd[1]: flatcar-static-network.service: Deactivated successfully. Oct 13 06:53:03.998900 systemd[1]: Finished flatcar-static-network.service - Flatcar Static Network Agent. Oct 13 06:53:04.059508 coreos-metadata[1055]: Oct 13 06:53:04.059 INFO Fetch successful Oct 13 06:53:04.094056 coreos-metadata[1055]: Oct 13 06:53:04.094 INFO wrote hostname ci-4459.1.0-a-3e5fd6a38a to /sysroot/etc/hostname Oct 13 06:53:04.095405 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Oct 13 06:53:04.624409 ignition[1174]: INFO : GET result: OK Oct 13 06:53:05.424961 ignition[1174]: INFO : Ignition finished successfully Oct 13 06:53:05.429880 systemd[1]: Finished ignition-mount.service - Ignition (mount). Oct 13 06:53:05.445100 systemd[1]: Starting ignition-files.service - Ignition (files)... Oct 13 06:53:05.479494 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Oct 13 06:53:05.521663 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/sdb6 (8:22) scanned by mount (1199) Oct 13 06:53:05.539056 kernel: BTRFS info (device sdb6): first mount of filesystem 1cd10441-4b32-40b7-b370-b928e4bc90dd Oct 13 06:53:05.539072 kernel: BTRFS info (device sdb6): using crc32c (crc32c-intel) checksum algorithm Oct 13 06:53:05.554698 kernel: BTRFS info (device sdb6): enabling ssd optimizations Oct 13 06:53:05.554714 kernel: BTRFS info (device sdb6): turning on async discard Oct 13 06:53:05.560816 kernel: BTRFS info (device sdb6): enabling free space tree Oct 13 06:53:05.562544 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Oct 13 06:53:05.595003 ignition[1216]: INFO : Ignition 2.22.0 Oct 13 06:53:05.595003 ignition[1216]: INFO : Stage: files Oct 13 06:53:05.606919 ignition[1216]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 13 06:53:05.606919 ignition[1216]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:53:05.606919 ignition[1216]: DEBUG : files: compiled without relabeling support, skipping Oct 13 06:53:05.606919 ignition[1216]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Oct 13 06:53:05.606919 ignition[1216]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Oct 13 06:53:05.606919 ignition[1216]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Oct 13 06:53:05.606919 ignition[1216]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Oct 13 06:53:05.606919 ignition[1216]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Oct 13 06:53:05.606919 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Oct 13 06:53:05.606919 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Oct 13 06:53:05.598201 unknown[1216]: wrote ssh authorized keys file for user: core Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Oct 13 06:53:05.733864 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Oct 13 06:53:05.971966 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Oct 13 06:53:06.196140 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Oct 13 06:53:06.668292 ignition[1216]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Oct 13 06:53:06.668292 ignition[1216]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Oct 13 06:53:06.696882 ignition[1216]: INFO : files: files passed Oct 13 06:53:06.696882 ignition[1216]: INFO : POST message to Packet Timeline Oct 13 06:53:06.696882 ignition[1216]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:53:07.911914 ignition[1216]: INFO : GET result: OK Oct 13 06:53:08.368482 ignition[1216]: INFO : Ignition finished successfully Oct 13 06:53:08.372864 systemd[1]: Finished ignition-files.service - Ignition (files). Oct 13 06:53:08.387966 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Oct 13 06:53:08.402265 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Oct 13 06:53:08.430078 systemd[1]: ignition-quench.service: Deactivated successfully. Oct 13 06:53:08.430149 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Oct 13 06:53:08.450080 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 13 06:53:08.467202 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Oct 13 06:53:08.488951 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... 
Oct 13 06:53:08.518869 initrd-setup-root-after-ignition[1256]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 13 06:53:08.518869 initrd-setup-root-after-ignition[1256]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Oct 13 06:53:08.532956 initrd-setup-root-after-ignition[1261]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Oct 13 06:53:08.577956 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Oct 13 06:53:08.578015 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Oct 13 06:53:08.594963 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Oct 13 06:53:08.604913 systemd[1]: Reached target initrd.target - Initrd Default Target. Oct 13 06:53:08.605019 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Oct 13 06:53:08.605834 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Oct 13 06:53:08.700102 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 13 06:53:08.714702 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Oct 13 06:53:08.783862 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Oct 13 06:53:08.794265 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 13 06:53:08.815353 systemd[1]: Stopped target timers.target - Timer Units. Oct 13 06:53:08.835280 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Oct 13 06:53:08.835710 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Oct 13 06:53:08.863313 systemd[1]: Stopped target initrd.target - Initrd Default Target. Oct 13 06:53:08.882307 systemd[1]: Stopped target basic.target - Basic System. Oct 13 06:53:08.901260 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Oct 13 06:53:08.918183 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Oct 13 06:53:08.938257 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Oct 13 06:53:08.959265 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Oct 13 06:53:08.979247 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Oct 13 06:53:08.997263 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Oct 13 06:53:09.016427 systemd[1]: Stopped target sysinit.target - System Initialization. Oct 13 06:53:09.035285 systemd[1]: Stopped target local-fs.target - Local File Systems. Oct 13 06:53:09.053373 systemd[1]: Stopped target swap.target - Swaps. Oct 13 06:53:09.070273 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Oct 13 06:53:09.070701 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Oct 13 06:53:09.094298 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Oct 13 06:53:09.112289 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 13 06:53:09.131140 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Oct 13 06:53:09.131602 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 13 06:53:09.152141 systemd[1]: dracut-initqueue.service: Deactivated successfully. Oct 13 06:53:09.152540 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. 
Oct 13 06:53:09.182380 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Oct 13 06:53:09.182911 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Oct 13 06:53:09.200457 systemd[1]: Stopped target paths.target - Path Units. Oct 13 06:53:09.216060 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Oct 13 06:53:09.216531 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 13 06:53:09.235263 systemd[1]: Stopped target slices.target - Slice Units. Oct 13 06:53:09.252260 systemd[1]: Stopped target sockets.target - Socket Units. Oct 13 06:53:09.269236 systemd[1]: iscsid.socket: Deactivated successfully. Oct 13 06:53:09.269529 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Oct 13 06:53:09.287276 systemd[1]: iscsiuio.socket: Deactivated successfully. Oct 13 06:53:09.287562 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Oct 13 06:53:09.308493 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Oct 13 06:53:09.308943 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Oct 13 06:53:09.437846 ignition[1282]: INFO : Ignition 2.22.0 Oct 13 06:53:09.437846 ignition[1282]: INFO : Stage: umount Oct 13 06:53:09.437846 ignition[1282]: INFO : no configs at "/usr/lib/ignition/base.d" Oct 13 06:53:09.437846 ignition[1282]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/packet" Oct 13 06:53:09.437846 ignition[1282]: INFO : umount: umount passed Oct 13 06:53:09.437846 ignition[1282]: INFO : POST message to Packet Timeline Oct 13 06:53:09.437846 ignition[1282]: INFO : GET https://metadata.packet.net/metadata: attempt #1 Oct 13 06:53:09.326353 systemd[1]: ignition-files.service: Deactivated successfully. Oct 13 06:53:09.326793 systemd[1]: Stopped ignition-files.service - Ignition (files). Oct 13 06:53:09.342246 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Oct 13 06:53:09.342602 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Oct 13 06:53:09.362768 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Oct 13 06:53:09.375273 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Oct 13 06:53:09.389847 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Oct 13 06:53:09.389938 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Oct 13 06:53:09.398961 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Oct 13 06:53:09.399062 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Oct 13 06:53:09.440485 systemd[1]: sysroot-boot.mount: Deactivated successfully. Oct 13 06:53:09.441361 systemd[1]: sysroot-boot.service: Deactivated successfully. Oct 13 06:53:09.441447 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Oct 13 06:53:09.447037 systemd[1]: initrd-cleanup.service: Deactivated successfully. Oct 13 06:53:09.447129 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Oct 13 06:53:10.455314 ignition[1282]: INFO : GET result: OK Oct 13 06:53:10.892023 ignition[1282]: INFO : Ignition finished successfully Oct 13 06:53:10.896248 systemd[1]: ignition-mount.service: Deactivated successfully. Oct 13 06:53:10.896545 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Oct 13 06:53:10.909852 systemd[1]: Stopped target network.target - Network. 
Oct 13 06:53:10.922935 systemd[1]: ignition-disks.service: Deactivated successfully. Oct 13 06:53:10.923117 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Oct 13 06:53:10.940995 systemd[1]: ignition-kargs.service: Deactivated successfully. Oct 13 06:53:10.941142 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Oct 13 06:53:10.956979 systemd[1]: ignition-setup.service: Deactivated successfully. Oct 13 06:53:10.957139 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Oct 13 06:53:10.973072 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Oct 13 06:53:10.973239 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Oct 13 06:53:10.991031 systemd[1]: initrd-setup-root.service: Deactivated successfully. Oct 13 06:53:10.991220 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Oct 13 06:53:11.007399 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Oct 13 06:53:11.025186 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Oct 13 06:53:11.041809 systemd[1]: systemd-resolved.service: Deactivated successfully. Oct 13 06:53:11.042095 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Oct 13 06:53:11.064239 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Oct 13 06:53:11.064838 systemd[1]: systemd-networkd.service: Deactivated successfully. Oct 13 06:53:11.065112 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Oct 13 06:53:11.080495 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Oct 13 06:53:11.082753 systemd[1]: Stopped target network-pre.target - Preparation for Network. Oct 13 06:53:11.096268 systemd[1]: systemd-networkd.socket: Deactivated successfully. Oct 13 06:53:11.096380 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Oct 13 06:53:11.116852 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Oct 13 06:53:11.139854 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Oct 13 06:53:11.139920 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Oct 13 06:53:11.140029 systemd[1]: systemd-sysctl.service: Deactivated successfully. Oct 13 06:53:11.140055 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Oct 13 06:53:11.192258 systemd[1]: systemd-modules-load.service: Deactivated successfully. Oct 13 06:53:11.192421 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Oct 13 06:53:11.209962 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Oct 13 06:53:11.210104 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 13 06:53:11.230430 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 13 06:53:11.253232 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Oct 13 06:53:11.253431 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Oct 13 06:53:11.254495 systemd[1]: systemd-udevd.service: Deactivated successfully. Oct 13 06:53:11.254877 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 13 06:53:11.272610 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Oct 13 06:53:11.272789 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Oct 13 06:53:11.288001 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Oct 13 06:53:11.288114 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Oct 13 06:53:11.297116 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Oct 13 06:53:11.297278 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Oct 13 06:53:11.347884 systemd[1]: dracut-cmdline.service: Deactivated successfully. Oct 13 06:53:11.348053 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Oct 13 06:53:11.375140 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Oct 13 06:53:11.375306 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Oct 13 06:53:11.415955 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Oct 13 06:53:11.423991 systemd[1]: systemd-network-generator.service: Deactivated successfully. Oct 13 06:53:11.424157 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Oct 13 06:53:11.442318 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Oct 13 06:53:11.442460 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 13 06:53:11.472162 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Oct 13 06:53:11.472315 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 13 06:53:11.725915 systemd-journald[300]: Received SIGTERM from PID 1 (systemd). Oct 13 06:53:11.489342 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Oct 13 06:53:11.489483 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Oct 13 06:53:11.508891 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Oct 13 06:53:11.509035 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:53:11.532341 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Oct 13 06:53:11.532500 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Oct 13 06:53:11.532613 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Oct 13 06:53:11.532747 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Oct 13 06:53:11.533996 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Oct 13 06:53:11.534232 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Oct 13 06:53:11.583680 systemd[1]: network-cleanup.service: Deactivated successfully. Oct 13 06:53:11.583979 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Oct 13 06:53:11.600895 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Oct 13 06:53:11.620904 systemd[1]: Starting initrd-switch-root.service - Switch Root... Oct 13 06:53:11.674643 systemd[1]: Switching root. 
Oct 13 06:53:11.863817 systemd-journald[300]: Journal stopped Oct 13 06:53:13.616233 kernel: SELinux: policy capability network_peer_controls=1 Oct 13 06:53:13.616247 kernel: SELinux: policy capability open_perms=1 Oct 13 06:53:13.616255 kernel: SELinux: policy capability extended_socket_class=1 Oct 13 06:53:13.616261 kernel: SELinux: policy capability always_check_network=0 Oct 13 06:53:13.616266 kernel: SELinux: policy capability cgroup_seclabel=1 Oct 13 06:53:13.616271 kernel: SELinux: policy capability nnp_nosuid_transition=1 Oct 13 06:53:13.616277 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Oct 13 06:53:13.616282 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Oct 13 06:53:13.616287 kernel: SELinux: policy capability userspace_initial_context=0 Oct 13 06:53:13.616293 kernel: audit: type=1403 audit(1760338391.992:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Oct 13 06:53:13.616300 systemd[1]: Successfully loaded SELinux policy in 93.692ms. Oct 13 06:53:13.616307 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 3.727ms. Oct 13 06:53:13.616314 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Oct 13 06:53:13.616320 systemd[1]: Detected architecture x86-64. Oct 13 06:53:13.616327 systemd[1]: Detected first boot. Oct 13 06:53:13.616333 systemd[1]: Hostname set to . Oct 13 06:53:13.616339 systemd[1]: Initializing machine ID from random generator. Oct 13 06:53:13.616345 zram_generator::config[1335]: No configuration found. Oct 13 06:53:13.616352 systemd[1]: Populated /etc with preset unit settings. Oct 13 06:53:13.616359 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Oct 13 06:53:13.616366 systemd[1]: initrd-switch-root.service: Deactivated successfully. Oct 13 06:53:13.616372 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Oct 13 06:53:13.616378 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Oct 13 06:53:13.616384 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Oct 13 06:53:13.616390 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Oct 13 06:53:13.616397 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Oct 13 06:53:13.616403 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Oct 13 06:53:13.616410 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Oct 13 06:53:13.616417 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Oct 13 06:53:13.616423 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Oct 13 06:53:13.616430 systemd[1]: Created slice user.slice - User and Session Slice. Oct 13 06:53:13.616436 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Oct 13 06:53:13.616445 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Oct 13 06:53:13.616451 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Oct 13 06:53:13.616458 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Oct 13 06:53:13.616464 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Oct 13 06:53:13.616472 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Oct 13 06:53:13.616478 systemd[1]: Expecting device dev-ttyS1.device - /dev/ttyS1... Oct 13 06:53:13.616485 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Oct 13 06:53:13.616491 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Oct 13 06:53:13.616499 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Oct 13 06:53:13.616505 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Oct 13 06:53:13.616512 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Oct 13 06:53:13.616519 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Oct 13 06:53:13.616526 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Oct 13 06:53:13.616532 systemd[1]: Reached target remote-fs.target - Remote File Systems. Oct 13 06:53:13.616539 systemd[1]: Reached target slices.target - Slice Units. Oct 13 06:53:13.616545 systemd[1]: Reached target swap.target - Swaps. Oct 13 06:53:13.616551 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Oct 13 06:53:13.616558 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Oct 13 06:53:13.616564 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Oct 13 06:53:13.616572 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Oct 13 06:53:13.616578 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Oct 13 06:53:13.616585 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Oct 13 06:53:13.616591 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Oct 13 06:53:13.616598 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Oct 13 06:53:13.616605 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Oct 13 06:53:13.616612 systemd[1]: Mounting media.mount - External Media Directory... Oct 13 06:53:13.616618 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:53:13.616625 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Oct 13 06:53:13.616631 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Oct 13 06:53:13.616638 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Oct 13 06:53:13.616644 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Oct 13 06:53:13.616651 systemd[1]: Reached target machines.target - Containers. Oct 13 06:53:13.616661 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Oct 13 06:53:13.616668 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 13 06:53:13.616675 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Oct 13 06:53:13.616681 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
Oct 13 06:53:13.616688 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 13 06:53:13.616694 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 13 06:53:13.616701 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 13 06:53:13.616707 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Oct 13 06:53:13.616714 kernel: ACPI: bus type drm_connector registered Oct 13 06:53:13.616721 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 13 06:53:13.616729 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Oct 13 06:53:13.616735 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Oct 13 06:53:13.616742 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Oct 13 06:53:13.616748 kernel: fuse: init (API version 7.41) Oct 13 06:53:13.616754 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Oct 13 06:53:13.616761 systemd[1]: Stopped systemd-fsck-usr.service. Oct 13 06:53:13.616767 kernel: loop: module loaded Oct 13 06:53:13.616775 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 13 06:53:13.616781 systemd[1]: Starting systemd-journald.service - Journal Service... Oct 13 06:53:13.616788 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Oct 13 06:53:13.616794 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Oct 13 06:53:13.616810 systemd-journald[1438]: Collecting audit messages is disabled. Oct 13 06:53:13.616826 systemd-journald[1438]: Journal started Oct 13 06:53:13.616840 systemd-journald[1438]: Runtime Journal (/run/log/journal/d71539d6d8564071afdbd61b58a226e6) is 8M, max 640.1M, 632.1M free. Oct 13 06:53:12.476910 systemd[1]: Queued start job for default target multi-user.target. Oct 13 06:53:12.491577 systemd[1]: Unnecessary job was removed for dev-sdb6.device - /dev/sdb6. Oct 13 06:53:12.491826 systemd[1]: systemd-journald.service: Deactivated successfully. Oct 13 06:53:13.641732 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Oct 13 06:53:13.662715 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Oct 13 06:53:13.673852 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Oct 13 06:53:13.701925 systemd[1]: verity-setup.service: Deactivated successfully. Oct 13 06:53:13.701950 systemd[1]: Stopped verity-setup.service. Oct 13 06:53:13.726721 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:53:13.734698 systemd[1]: Started systemd-journald.service - Journal Service. Oct 13 06:53:13.743150 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Oct 13 06:53:13.751836 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Oct 13 06:53:13.761959 systemd[1]: Mounted media.mount - External Media Directory. Oct 13 06:53:13.771928 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Oct 13 06:53:13.780923 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Oct 13 06:53:13.789935 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Oct 13 06:53:13.799037 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Oct 13 06:53:13.809034 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Oct 13 06:53:13.819083 systemd[1]: modprobe@configfs.service: Deactivated successfully. Oct 13 06:53:13.819223 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Oct 13 06:53:13.830068 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 13 06:53:13.830233 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 13 06:53:13.840155 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 13 06:53:13.840374 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 13 06:53:13.849324 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Oct 13 06:53:13.849685 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 13 06:53:13.860703 systemd[1]: modprobe@fuse.service: Deactivated successfully. Oct 13 06:53:13.861182 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Oct 13 06:53:13.870580 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 13 06:53:13.871103 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 13 06:53:13.880914 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Oct 13 06:53:13.890638 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Oct 13 06:53:13.901886 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Oct 13 06:53:13.913725 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Oct 13 06:53:13.924648 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Oct 13 06:53:13.958201 systemd[1]: Reached target network-pre.target - Preparation for Network. Oct 13 06:53:13.970024 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Oct 13 06:53:13.998528 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Oct 13 06:53:14.007964 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Oct 13 06:53:14.008074 systemd[1]: Reached target local-fs.target - Local File Systems. Oct 13 06:53:14.011279 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Oct 13 06:53:14.032907 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Oct 13 06:53:14.042789 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 13 06:53:14.063116 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Oct 13 06:53:14.089923 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Oct 13 06:53:14.099766 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 13 06:53:14.105954 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Oct 13 06:53:14.111574 systemd-journald[1438]: Time spent on flushing to /var/log/journal/d71539d6d8564071afdbd61b58a226e6 is 12.415ms for 1394 entries. Oct 13 06:53:14.111574 systemd-journald[1438]: System Journal (/var/log/journal/d71539d6d8564071afdbd61b58a226e6) is 8M, max 195.6M, 187.6M free. Oct 13 06:53:14.135241 systemd-journald[1438]: Received client request to flush runtime journal. Oct 13 06:53:14.122784 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 13 06:53:14.129933 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Oct 13 06:53:14.139377 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Oct 13 06:53:14.150586 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Oct 13 06:53:14.162717 kernel: loop0: detected capacity change from 0 to 110984 Oct 13 06:53:14.168092 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Oct 13 06:53:14.178346 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Oct 13 06:53:14.178804 systemd-tmpfiles[1477]: ACLs are not supported, ignoring. Oct 13 06:53:14.178813 systemd-tmpfiles[1477]: ACLs are not supported, ignoring. Oct 13 06:53:14.187665 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Oct 13 06:53:14.193958 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Oct 13 06:53:14.203900 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Oct 13 06:53:14.214958 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Oct 13 06:53:14.225722 kernel: loop1: detected capacity change from 0 to 128016 Oct 13 06:53:14.230909 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Oct 13 06:53:14.242272 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Oct 13 06:53:14.252466 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Oct 13 06:53:14.270886 systemd[1]: Starting systemd-sysusers.service - Create System Users... Oct 13 06:53:14.278663 kernel: loop2: detected capacity change from 0 to 8 Oct 13 06:53:14.292079 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Oct 13 06:53:14.293819 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Oct 13 06:53:14.309666 kernel: loop3: detected capacity change from 0 to 224512 Oct 13 06:53:14.314288 systemd[1]: Finished systemd-sysusers.service - Create System Users. Oct 13 06:53:14.323590 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Oct 13 06:53:14.355538 systemd-tmpfiles[1496]: ACLs are not supported, ignoring. Oct 13 06:53:14.355548 systemd-tmpfiles[1496]: ACLs are not supported, ignoring. Oct 13 06:53:14.357011 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Oct 13 06:53:14.359745 kernel: loop4: detected capacity change from 0 to 110984 Oct 13 06:53:14.383721 kernel: loop5: detected capacity change from 0 to 128016 Oct 13 06:53:14.438342 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
Oct 13 06:53:14.443697 kernel: loop6: detected capacity change from 0 to 8 Oct 13 06:53:14.450662 kernel: loop7: detected capacity change from 0 to 224512 Oct 13 06:53:14.459824 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Oct 13 06:53:14.466499 (sd-merge)[1499]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet'. Oct 13 06:53:14.466743 (sd-merge)[1499]: Merged extensions into '/usr'. Oct 13 06:53:14.472112 systemd[1]: Reload requested from client PID 1474 ('systemd-sysext') (unit systemd-sysext.service)... Oct 13 06:53:14.472122 systemd[1]: Reloading... Oct 13 06:53:14.494721 systemd-udevd[1502]: Using default interface naming scheme 'v255'. Oct 13 06:53:14.495699 zram_generator::config[1527]: No configuration found. Oct 13 06:53:14.503204 ldconfig[1468]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Oct 13 06:53:14.554477 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input2 Oct 13 06:53:14.554526 kernel: ACPI: button: Sleep Button [SLPB] Oct 13 06:53:14.562242 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Oct 13 06:53:14.568666 kernel: IPMI message handler: version 39.2 Oct 13 06:53:14.568717 kernel: ACPI: button: Power Button [PWRF] Oct 13 06:53:14.573666 kernel: mousedev: PS/2 mouse device common for all mice Oct 13 06:53:14.585665 kernel: ipmi device interface Oct 13 06:53:14.585785 kernel: mei_me 0000:00:16.0: Device doesn't have valid ME Interface Oct 13 06:53:14.585949 kernel: mei_me 0000:00:16.4: Device doesn't have valid ME Interface Oct 13 06:53:14.612745 kernel: ipmi_si: IPMI System Interface driver Oct 13 06:53:14.612796 kernel: ipmi_si dmi-ipmi-si.0: ipmi_platform: probing via SMBIOS Oct 13 06:53:14.620315 kernel: ipmi_platform: ipmi_si: SMBIOS: io 0xca2 regsize 1 spacing 1 irq 0 Oct 13 06:53:14.626482 kernel: ipmi_si: Adding SMBIOS-specified kcs state machine Oct 13 06:53:14.634672 kernel: ipmi_si IPI0001:00: ipmi_platform: probing via ACPI Oct 13 06:53:14.643188 kernel: ipmi_si IPI0001:00: ipmi_platform: [io 0x0ca2] regsize 1 spacing 1 irq 0 Oct 13 06:53:14.658698 kernel: ipmi_si dmi-ipmi-si.0: Removing SMBIOS-specified kcs state machine in favor of ACPI Oct 13 06:53:14.658937 kernel: ipmi_si: Adding ACPI-specified kcs state machine Oct 13 06:53:14.669249 kernel: ipmi_si: Trying ACPI-specified kcs state machine at i/o address 0xca2, slave address 0x20, irq 0 Oct 13 06:53:14.675665 kernel: MACsec IEEE 802.1AE Oct 13 06:53:14.675705 kernel: i801_smbus 0000:00:1f.4: SPD Write Disable is set Oct 13 06:53:14.687854 kernel: i801_smbus 0000:00:1f.4: SMBus using PCI interrupt Oct 13 06:53:14.697261 systemd[1]: Condition check resulted in dev-ttyS1.device - /dev/ttyS1 being skipped. Oct 13 06:53:14.697290 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Micron_5300_MTFDDAK480TDT OEM. Oct 13 06:53:14.706705 kernel: ipmi_si IPI0001:00: The BMC does not support clearing the recv irq bit, compensating, but the BMC needs to be fixed. Oct 13 06:53:14.716994 systemd[1]: Reloading finished in 244 ms. Oct 13 06:53:14.732672 kernel: iTCO_vendor_support: vendor-support=0 Oct 13 06:53:14.747668 kernel: ipmi_si IPI0001:00: IPMI message handler: Found new BMC (man_id: 0x002a7c, prod_id: 0x1b0f, dev_id: 0x20) Oct 13 06:53:14.747812 kernel: iTCO_wdt iTCO_wdt: Found a Intel PCH TCO device (Version=6, TCOBASE=0x0400) Oct 13 06:53:14.762641 kernel: iTCO_wdt iTCO_wdt: initialized. 
heartbeat=30 sec (nowayout=0) Oct 13 06:53:14.779534 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Oct 13 06:53:14.792751 kernel: intel_rapl_common: Found RAPL domain package Oct 13 06:53:14.792775 kernel: intel_rapl_common: Found RAPL domain core Oct 13 06:53:14.799273 kernel: intel_rapl_common: Found RAPL domain dram Oct 13 06:53:14.805870 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Oct 13 06:53:14.815847 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Oct 13 06:53:14.838695 kernel: ipmi_si IPI0001:00: IPMI kcs interface initialized Oct 13 06:53:14.845661 kernel: ipmi_ssif: IPMI SSIF Interface driver Oct 13 06:53:14.850049 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Oct 13 06:53:14.872246 systemd[1]: Starting ensure-sysext.service... Oct 13 06:53:14.878179 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Oct 13 06:53:14.888569 systemd[1]: Starting systemd-networkd.service - Network Configuration... Oct 13 06:53:14.897224 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Oct 13 06:53:14.897806 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Oct 13 06:53:14.902638 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Oct 13 06:53:14.913986 systemd[1]: Reload requested from client PID 1698 ('systemctl') (unit ensure-sysext.service)... Oct 13 06:53:14.913992 systemd[1]: Reloading... Oct 13 06:53:14.920628 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Oct 13 06:53:14.920647 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Oct 13 06:53:14.920805 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Oct 13 06:53:14.920967 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Oct 13 06:53:14.921463 systemd-tmpfiles[1702]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Oct 13 06:53:14.921636 systemd-tmpfiles[1702]: ACLs are not supported, ignoring. Oct 13 06:53:14.921678 systemd-tmpfiles[1702]: ACLs are not supported, ignoring. Oct 13 06:53:14.923848 systemd-tmpfiles[1702]: Detected autofs mount point /boot during canonicalization of boot. Oct 13 06:53:14.923852 systemd-tmpfiles[1702]: Skipping /boot Oct 13 06:53:14.927905 systemd-tmpfiles[1702]: Detected autofs mount point /boot during canonicalization of boot. Oct 13 06:53:14.927909 systemd-tmpfiles[1702]: Skipping /boot Oct 13 06:53:14.944666 zram_generator::config[1738]: No configuration found. Oct 13 06:53:14.979371 systemd-networkd[1700]: lo: Link UP Oct 13 06:53:14.979375 systemd-networkd[1700]: lo: Gained carrier Oct 13 06:53:14.981972 systemd-networkd[1700]: bond0: netdev ready Oct 13 06:53:14.982913 systemd-networkd[1700]: Enumeration completed Oct 13 06:53:14.988329 systemd-networkd[1700]: enp1s0f0np0: Configuring with /etc/systemd/network/10-0c:42:a1:15:b6:dc.network. 
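The (sd-merge) lines above show systemd-sysext overlaying the Flatcar extension images ('containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-packet') onto /usr before the workload services start. As a minimal sketch, assuming the stock systemd-sysext tooling is present on the node, the merge can be inspected or redone with:

  systemd-sysext status    # list extension images and the hierarchies they are merged into
  systemd-sysext refresh   # unmerge and re-merge after adding or removing an image (e.g. under /var/lib/extensions)

Both commands act on the same overlay that the journal reports as "Merged extensions into '/usr'".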
Oct 13 06:53:15.412669 kernel: mlx5_core 0000:01:00.0 enp1s0f0np0: Link up Oct 13 06:53:15.425700 kernel: bond0: (slave enp1s0f0np0): Enslaving as a backup interface with an up link Oct 13 06:53:15.426831 systemd-networkd[1700]: enp1s0f1np1: Configuring with /etc/systemd/network/10-0c:42:a1:15:b6:dd.network. Oct 13 06:53:15.586748 kernel: mlx5_core 0000:01:00.1 enp1s0f1np1: Link up Oct 13 06:53:15.598737 kernel: bond0: (slave enp1s0f1np1): Enslaving as a backup interface with an up link Oct 13 06:53:15.599075 systemd-networkd[1700]: bond0: Configuring with /etc/systemd/network/05-bond0.network. Oct 13 06:53:15.601038 systemd-networkd[1700]: enp1s0f0np0: Link UP Oct 13 06:53:15.601695 systemd-networkd[1700]: enp1s0f0np0: Gained carrier Oct 13 06:53:15.609712 kernel: bond0: Warning: No 802.3ad response from the link partner for any adapters in the bond Oct 13 06:53:15.621829 systemd-networkd[1700]: enp1s0f1np1: Reconfiguring with /etc/systemd/network/10-0c:42:a1:15:b6:dc.network. Oct 13 06:53:15.621946 systemd-networkd[1700]: enp1s0f1np1: Link UP Oct 13 06:53:15.622053 systemd-networkd[1700]: enp1s0f1np1: Gained carrier Oct 13 06:53:15.633809 systemd-networkd[1700]: bond0: Link UP Oct 13 06:53:15.633927 systemd-networkd[1700]: bond0: Gained carrier Oct 13 06:53:15.671099 systemd[1]: Reloading finished in 756 ms. Oct 13 06:53:15.684266 systemd[1]: Started systemd-userdbd.service - User Database Manager. Oct 13 06:53:15.692831 systemd[1]: Started systemd-networkd.service - Network Configuration. Oct 13 06:53:15.717128 kernel: bond0: (slave enp1s0f0np0): link status definitely up, 25000 Mbps full duplex Oct 13 06:53:15.717148 kernel: bond0: active interface up! Oct 13 06:53:15.719546 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Oct 13 06:53:15.730872 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Oct 13 06:53:15.741837 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Oct 13 06:53:15.755650 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 13 06:53:15.774205 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Oct 13 06:53:15.790235 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Oct 13 06:53:15.795361 augenrules[1827]: No rules Oct 13 06:53:15.811008 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Oct 13 06:53:15.821397 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Oct 13 06:53:15.834660 kernel: bond0: (slave enp1s0f1np1): link status definitely up, 25000 Mbps full duplex Oct 13 06:53:15.847235 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Oct 13 06:53:15.856303 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Oct 13 06:53:15.866439 systemd[1]: audit-rules.service: Deactivated successfully. Oct 13 06:53:15.871809 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 13 06:53:15.881054 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Oct 13 06:53:15.890964 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Oct 13 06:53:15.911876 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
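systemd-networkd assembles the bond from the units named above: a .netdev that creates bond0, the per-port 10-0c:42:a1:15:b6:dc.network / 10-0c:42:a1:15:b6:dd.network files that enslave the two mlx5 ports, and 05-bond0.network for the bond itself. The journal only shows the file names, not their contents; the fragment below is a hypothetical reconstruction (all option values are assumptions, except the 802.3ad mode implied by the "No 802.3ad response from the link partner" warning):

  # hypothetical bond0.netdev -- creates the bond device
  [NetDev]
  Name=bond0
  Kind=bond

  [Bond]
  Mode=802.3ad
  MIIMonitorSec=0.1

  # hypothetical 10-0c:42:a1:15:b6:dc.network -- enslaves the matching port
  [Match]
  MACAddress=0c:42:a1:15:b6:dc

  [Network]
  Bond=bond0

  # hypothetical 05-bond0.network -- addresses the bond itself
  [Match]
  Name=bond0

  [Network]
  DHCP=yes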
Oct 13 06:53:15.923176 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Oct 13 06:53:15.935263 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:53:15.935997 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 13 06:53:15.942853 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Oct 13 06:53:15.943601 systemd-resolved[1834]: Positive Trust Anchors: Oct 13 06:53:15.943606 systemd-resolved[1834]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Oct 13 06:53:15.943628 systemd-resolved[1834]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Oct 13 06:53:15.946032 systemd-resolved[1834]: Using system hostname 'ci-4459.1.0-a-3e5fd6a38a'. Oct 13 06:53:15.949240 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Oct 13 06:53:15.955396 augenrules[1843]: /sbin/augenrules: No change Oct 13 06:53:15.958331 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Oct 13 06:53:15.958613 augenrules[1861]: No rules Oct 13 06:53:15.967354 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Oct 13 06:53:15.978321 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Oct 13 06:53:15.986853 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Oct 13 06:53:15.986919 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Oct 13 06:53:15.987567 systemd[1]: Starting systemd-update-done.service - Update is Completed... Oct 13 06:53:15.995755 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Oct 13 06:53:15.995817 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Oct 13 06:53:15.996702 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Oct 13 06:53:16.007073 systemd[1]: audit-rules.service: Deactivated successfully. Oct 13 06:53:16.007179 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 13 06:53:16.015942 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Oct 13 06:53:16.016031 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Oct 13 06:53:16.025933 systemd[1]: modprobe@drm.service: Deactivated successfully. Oct 13 06:53:16.026021 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Oct 13 06:53:16.034927 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Oct 13 06:53:16.035015 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Oct 13 06:53:16.044937 systemd[1]: modprobe@loop.service: Deactivated successfully. Oct 13 06:53:16.045024 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Oct 13 06:53:16.053946 systemd[1]: Finished systemd-update-done.service - Update is Completed. Oct 13 06:53:16.063907 systemd[1]: Finished ensure-sysext.service. Oct 13 06:53:16.072616 systemd[1]: Reached target network.target - Network. Oct 13 06:53:16.079744 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Oct 13 06:53:16.090713 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Oct 13 06:53:16.090744 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Oct 13 06:53:16.091679 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Oct 13 06:53:16.138145 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Oct 13 06:53:16.148749 systemd[1]: Reached target sysinit.target - System Initialization. Oct 13 06:53:16.158733 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Oct 13 06:53:16.169702 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Oct 13 06:53:16.180691 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Oct 13 06:53:16.190697 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Oct 13 06:53:16.201690 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Oct 13 06:53:16.201705 systemd[1]: Reached target paths.target - Path Units. Oct 13 06:53:16.209690 systemd[1]: Reached target time-set.target - System Time Set. Oct 13 06:53:16.218748 systemd[1]: Started logrotate.timer - Daily rotation of log files. Oct 13 06:53:16.228740 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Oct 13 06:53:16.238827 systemd[1]: Reached target timers.target - Timer Units. Oct 13 06:53:16.248296 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Oct 13 06:53:16.258337 systemd[1]: Starting docker.socket - Docker Socket for the API... Oct 13 06:53:16.267483 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Oct 13 06:53:16.284975 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Oct 13 06:53:16.294820 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Oct 13 06:53:16.306019 systemd[1]: Listening on docker.socket - Docker Socket for the API. Oct 13 06:53:16.315112 systemd[1]: Reached target sockets.target - Socket Units. Oct 13 06:53:16.323690 systemd[1]: Reached target basic.target - Basic System. Oct 13 06:53:16.330714 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Oct 13 06:53:16.330728 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Oct 13 06:53:16.331186 systemd[1]: Starting containerd.service - containerd container runtime... 
Oct 13 06:53:16.359004 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Oct 13 06:53:16.380855 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Oct 13 06:53:16.388267 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Oct 13 06:53:16.392672 coreos-metadata[1882]: Oct 13 06:53:16.392 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:53:16.404965 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Oct 13 06:53:16.415181 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Oct 13 06:53:16.416864 jq[1888]: false Oct 13 06:53:16.424696 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Oct 13 06:53:16.425210 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Oct 13 06:53:16.429383 extend-filesystems[1889]: Found /dev/sdb6 Oct 13 06:53:16.433799 extend-filesystems[1889]: Found /dev/sdb9 Oct 13 06:53:16.433799 extend-filesystems[1889]: Checking size of /dev/sdb9 Oct 13 06:53:16.465805 kernel: EXT4-fs (sdb9): resizing filesystem from 553472 to 116605649 blocks Oct 13 06:53:16.455209 oslogin_cache_refresh[1890]: Refreshing passwd entry cache Oct 13 06:53:16.434398 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Oct 13 06:53:16.466029 extend-filesystems[1889]: Resized partition /dev/sdb9 Oct 13 06:53:16.452787 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Oct 13 06:53:16.481803 google_oslogin_nss_cache[1890]: oslogin_cache_refresh[1890]: Refreshing passwd entry cache Oct 13 06:53:16.481937 extend-filesystems[1901]: resize2fs 1.47.3 (8-Jul-2025) Oct 13 06:53:16.466629 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Oct 13 06:53:16.492612 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Oct 13 06:53:16.519964 systemd[1]: Starting systemd-logind.service - User Login Management... Oct 13 06:53:16.529801 systemd[1]: Starting tcsd.service - TCG Core Services Daemon... Oct 13 06:53:16.536982 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Oct 13 06:53:16.537300 systemd[1]: Starting update-engine.service - Update Engine... Oct 13 06:53:16.545253 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Oct 13 06:53:16.552469 update_engine[1920]: I20251013 06:53:16.552403 1920 main.cc:92] Flatcar Update Engine starting Oct 13 06:53:16.554769 systemd-logind[1915]: Watching system buttons on /dev/input/event3 (Power Button) Oct 13 06:53:16.554781 systemd-logind[1915]: Watching system buttons on /dev/input/event2 (Sleep Button) Oct 13 06:53:16.554792 systemd-logind[1915]: Watching system buttons on /dev/input/event0 (HID 0557:2419) Oct 13 06:53:16.554947 systemd-logind[1915]: New seat seat0. Oct 13 06:53:16.556311 systemd[1]: Started systemd-logind.service - User Login Management. Oct 13 06:53:16.557538 jq[1921]: true Oct 13 06:53:16.566268 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Oct 13 06:53:16.576826 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Oct 13 06:53:16.576937 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. 
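extend-filesystems grows the root filesystem to fill its partition on first boot; because /dev/sdb9 is already mounted at /, the EXT4-fs resize message above is an online resize. Roughly the equivalent manual steps, as a sketch (the device name comes from this log; the actual Flatcar service does additional bookkeeping):

  resize2fs /dev/sdb9   # grow the mounted ext4 filesystem to the full partition size (online)
  df -h /               # confirm the new capacity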
Oct 13 06:53:16.577096 systemd[1]: motdgen.service: Deactivated successfully. Oct 13 06:53:16.584773 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Oct 13 06:53:16.594177 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Oct 13 06:53:16.594289 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Oct 13 06:53:16.621600 jq[1924]: true Oct 13 06:53:16.622024 (ntainerd)[1925]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Oct 13 06:53:16.631412 tar[1923]: linux-amd64/LICENSE Oct 13 06:53:16.631558 tar[1923]: linux-amd64/helm Oct 13 06:53:16.635520 systemd[1]: tcsd.service: Skipped due to 'exec-condition'. Oct 13 06:53:16.635636 systemd[1]: Condition check resulted in tcsd.service - TCG Core Services Daemon being skipped. Oct 13 06:53:16.637407 sshd_keygen[1918]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Oct 13 06:53:16.649828 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Oct 13 06:53:16.660725 systemd[1]: Starting issuegen.service - Generate /run/issue... Oct 13 06:53:16.663881 dbus-daemon[1883]: [system] SELinux support is enabled Oct 13 06:53:16.670861 dbus-daemon[1883]: [system] Successfully activated service 'org.freedesktop.systemd1' Oct 13 06:53:16.668732 systemd[1]: Started dbus.service - D-Bus System Message Bus. Oct 13 06:53:16.672301 update_engine[1920]: I20251013 06:53:16.665576 1920 update_check_scheduler.cc:74] Next update check in 9m5s Oct 13 06:53:16.670346 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Oct 13 06:53:16.670360 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Oct 13 06:53:16.675106 bash[1954]: Updated "/home/core/.ssh/authorized_keys" Oct 13 06:53:16.687744 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Oct 13 06:53:16.687759 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Oct 13 06:53:16.703766 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Oct 13 06:53:16.713904 systemd[1]: issuegen.service: Deactivated successfully. Oct 13 06:53:16.714011 systemd[1]: Finished issuegen.service - Generate /run/issue. Oct 13 06:53:16.724672 systemd[1]: Started update-engine.service - Update Engine. Oct 13 06:53:16.733874 systemd[1]: Starting sshkeys.service... Oct 13 06:53:16.747007 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Oct 13 06:53:16.766993 systemd[1]: Started locksmithd.service - Cluster reboot manager. Oct 13 06:53:16.776304 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Oct 13 06:53:16.789406 systemd[1]: Started getty@tty1.service - Getty on tty1. Oct 13 06:53:16.797503 systemd[1]: Started serial-getty@ttyS1.service - Serial Getty on ttyS1. Oct 13 06:53:16.801715 locksmithd[1979]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Oct 13 06:53:16.807942 systemd[1]: Reached target getty.target - Login Prompts. Oct 13 06:53:16.818685 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
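update-engine and locksmithd together handle Flatcar's automatic updates and reboot coordination. Assuming the standard Flatcar client tools are installed on the node, their state can be checked with:

  update_engine_client -status   # current update state and version
  locksmithctl status            # any held reboot locks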
Oct 13 06:53:16.822229 containerd[1925]: time="2025-10-13T06:53:16Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Oct 13 06:53:16.822578 containerd[1925]: time="2025-10-13T06:53:16.822565221Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Oct 13 06:53:16.829488 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Oct 13 06:53:16.831940 containerd[1925]: time="2025-10-13T06:53:16.831893060Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.078µs" Oct 13 06:53:16.831940 containerd[1925]: time="2025-10-13T06:53:16.831922554Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Oct 13 06:53:16.831988 containerd[1925]: time="2025-10-13T06:53:16.831940491Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Oct 13 06:53:16.832101 containerd[1925]: time="2025-10-13T06:53:16.832061002Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Oct 13 06:53:16.832101 containerd[1925]: time="2025-10-13T06:53:16.832075118Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Oct 13 06:53:16.832129 containerd[1925]: time="2025-10-13T06:53:16.832094685Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832255 containerd[1925]: time="2025-10-13T06:53:16.832144124Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832255 containerd[1925]: time="2025-10-13T06:53:16.832161754Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832426 containerd[1925]: time="2025-10-13T06:53:16.832384218Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832426 containerd[1925]: time="2025-10-13T06:53:16.832402697Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832426 containerd[1925]: time="2025-10-13T06:53:16.832412964Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832471 containerd[1925]: time="2025-10-13T06:53:16.832425970Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832508 containerd[1925]: time="2025-10-13T06:53:16.832498572Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832718 containerd[1925]: time="2025-10-13T06:53:16.832684760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832737 containerd[1925]: time="2025-10-13T06:53:16.832708704Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Oct 13 06:53:16.832737 containerd[1925]: time="2025-10-13T06:53:16.832724987Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Oct 13 06:53:16.832768 containerd[1925]: time="2025-10-13T06:53:16.832743626Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Oct 13 06:53:16.833106 containerd[1925]: time="2025-10-13T06:53:16.833092044Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Oct 13 06:53:16.833169 containerd[1925]: time="2025-10-13T06:53:16.833154321Z" level=info msg="metadata content store policy set" policy=shared Oct 13 06:53:16.839527 tar[1923]: linux-amd64/README.md Oct 13 06:53:16.850644 containerd[1925]: time="2025-10-13T06:53:16.850622864Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Oct 13 06:53:16.850681 containerd[1925]: time="2025-10-13T06:53:16.850672244Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Oct 13 06:53:16.850697 containerd[1925]: time="2025-10-13T06:53:16.850683765Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Oct 13 06:53:16.850697 containerd[1925]: time="2025-10-13T06:53:16.850692253Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Oct 13 06:53:16.850722 containerd[1925]: time="2025-10-13T06:53:16.850703000Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Oct 13 06:53:16.850722 containerd[1925]: time="2025-10-13T06:53:16.850709359Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Oct 13 06:53:16.850722 containerd[1925]: time="2025-10-13T06:53:16.850715922Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Oct 13 06:53:16.850768 containerd[1925]: time="2025-10-13T06:53:16.850723194Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Oct 13 06:53:16.850768 containerd[1925]: time="2025-10-13T06:53:16.850736263Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Oct 13 06:53:16.850768 containerd[1925]: time="2025-10-13T06:53:16.850742635Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Oct 13 06:53:16.850768 containerd[1925]: time="2025-10-13T06:53:16.850747926Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Oct 13 06:53:16.850768 containerd[1925]: time="2025-10-13T06:53:16.850755473Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Oct 13 06:53:16.850832 containerd[1925]: time="2025-10-13T06:53:16.850824702Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Oct 13 06:53:16.850846 containerd[1925]: time="2025-10-13T06:53:16.850842255Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Oct 13 06:53:16.850859 containerd[1925]: time="2025-10-13T06:53:16.850851697Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Oct 13 06:53:16.850859 containerd[1925]: time="2025-10-13T06:53:16.850857840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Oct 13 06:53:16.850884 containerd[1925]: time="2025-10-13T06:53:16.850863860Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Oct 13 06:53:16.850884 containerd[1925]: time="2025-10-13T06:53:16.850869919Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Oct 13 06:53:16.850884 containerd[1925]: time="2025-10-13T06:53:16.850876176Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Oct 13 06:53:16.850934 containerd[1925]: time="2025-10-13T06:53:16.850883328Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Oct 13 06:53:16.850934 containerd[1925]: time="2025-10-13T06:53:16.850889689Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Oct 13 06:53:16.850934 containerd[1925]: time="2025-10-13T06:53:16.850896946Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Oct 13 06:53:16.850934 containerd[1925]: time="2025-10-13T06:53:16.850902635Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Oct 13 06:53:16.850997 containerd[1925]: time="2025-10-13T06:53:16.850947564Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Oct 13 06:53:16.850997 containerd[1925]: time="2025-10-13T06:53:16.850960263Z" level=info msg="Start snapshots syncer" Oct 13 06:53:16.850997 containerd[1925]: time="2025-10-13T06:53:16.850971721Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Oct 13 06:53:16.851173 containerd[1925]: time="2025-10-13T06:53:16.851125766Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Oct 13 06:53:16.851173 containerd[1925]: time="2025-10-13T06:53:16.851156639Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Oct 13 06:53:16.851256 containerd[1925]: time="2025-10-13T06:53:16.851199595Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Oct 13 06:53:16.851256 containerd[1925]: time="2025-10-13T06:53:16.851251369Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Oct 13 06:53:16.851284 containerd[1925]: time="2025-10-13T06:53:16.851273747Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Oct 13 06:53:16.851284 containerd[1925]: time="2025-10-13T06:53:16.851281172Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Oct 13 06:53:16.851313 containerd[1925]: time="2025-10-13T06:53:16.851289117Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Oct 13 06:53:16.851313 containerd[1925]: time="2025-10-13T06:53:16.851297018Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Oct 13 06:53:16.851313 containerd[1925]: time="2025-10-13T06:53:16.851302978Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Oct 13 06:53:16.851313 containerd[1925]: time="2025-10-13T06:53:16.851308893Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Oct 13 06:53:16.851379 containerd[1925]: time="2025-10-13T06:53:16.851326773Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Oct 13 06:53:16.851379 containerd[1925]: 
time="2025-10-13T06:53:16.851336903Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Oct 13 06:53:16.851379 containerd[1925]: time="2025-10-13T06:53:16.851343572Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Oct 13 06:53:16.851379 containerd[1925]: time="2025-10-13T06:53:16.851365753Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 13 06:53:16.851379 containerd[1925]: time="2025-10-13T06:53:16.851374148Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Oct 13 06:53:16.851379 containerd[1925]: time="2025-10-13T06:53:16.851379035Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851388692Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851393475Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851398582Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851409580Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851419059Z" level=info msg="runtime interface created" Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851421968Z" level=info msg="created NRI interface" Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851434132Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Oct 13 06:53:16.851461 containerd[1925]: time="2025-10-13T06:53:16.851445450Z" level=info msg="Connect containerd service" Oct 13 06:53:16.851559 containerd[1925]: time="2025-10-13T06:53:16.851466228Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Oct 13 06:53:16.851906 containerd[1925]: time="2025-10-13T06:53:16.851895149Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Oct 13 06:53:16.852271 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Oct 13 06:53:16.858493 coreos-metadata[1993]: Oct 13 06:53:16.858 INFO Fetching https://metadata.packet.net/metadata: Attempt #1 Oct 13 06:53:16.921690 containerd[1925]: time="2025-10-13T06:53:16.921630697Z" level=info msg="Start subscribing containerd event" Oct 13 06:53:16.921690 containerd[1925]: time="2025-10-13T06:53:16.921672830Z" level=info msg="Start recovering state" Oct 13 06:53:16.921761 containerd[1925]: time="2025-10-13T06:53:16.921699811Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Oct 13 06:53:16.921761 containerd[1925]: time="2025-10-13T06:53:16.921728554Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Oct 13 06:53:16.921761 containerd[1925]: time="2025-10-13T06:53:16.921748160Z" level=info msg="Start event monitor" Oct 13 06:53:16.921761 containerd[1925]: time="2025-10-13T06:53:16.921757714Z" level=info msg="Start cni network conf syncer for default" Oct 13 06:53:16.921825 containerd[1925]: time="2025-10-13T06:53:16.921762490Z" level=info msg="Start streaming server" Oct 13 06:53:16.921825 containerd[1925]: time="2025-10-13T06:53:16.921784858Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Oct 13 06:53:16.921825 containerd[1925]: time="2025-10-13T06:53:16.921794687Z" level=info msg="runtime interface starting up..." Oct 13 06:53:16.921825 containerd[1925]: time="2025-10-13T06:53:16.921798046Z" level=info msg="starting plugins..." Oct 13 06:53:16.921825 containerd[1925]: time="2025-10-13T06:53:16.921805299Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Oct 13 06:53:16.921898 containerd[1925]: time="2025-10-13T06:53:16.921874736Z" level=info msg="containerd successfully booted in 0.099959s" Oct 13 06:53:16.921932 systemd[1]: Started containerd.service - containerd container runtime. Oct 13 06:53:17.002691 kernel: EXT4-fs (sdb9): resized filesystem to 116605649 Oct 13 06:53:17.028486 extend-filesystems[1901]: Filesystem at /dev/sdb9 is mounted on /; on-line resizing required Oct 13 06:53:17.028486 extend-filesystems[1901]: old_desc_blocks = 1, new_desc_blocks = 56 Oct 13 06:53:17.028486 extend-filesystems[1901]: The filesystem on /dev/sdb9 is now 116605649 (4k) blocks long. Oct 13 06:53:17.066727 extend-filesystems[1889]: Resized filesystem in /dev/sdb9 Oct 13 06:53:17.029278 systemd[1]: extend-filesystems.service: Deactivated successfully. Oct 13 06:53:17.029418 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Oct 13 06:53:17.504700 systemd-networkd[1700]: bond0: Gained IPv6LL Oct 13 06:53:17.506086 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Oct 13 06:53:17.516072 systemd[1]: Reached target network-online.target - Network is Online. Oct 13 06:53:17.525821 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:53:17.545960 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Oct 13 06:53:17.564517 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Oct 13 06:53:18.167901 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Oct 13 06:53:18.168036 kernel: mlx5_core 0000:01:00.0: shared_fdb:0 mode:queue_affinity Oct 13 06:53:18.237665 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:1 Oct 13 06:53:18.326666 kernel: mlx5_core 0000:01:00.0: lag map: port 1:1 port 2:2 Oct 13 06:53:18.361548 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
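At startup containerd warned that /usr/share/containerd/config.toml still uses config version 2 (and carries the no-longer-recognized 'subreaper' key), migrated it in memory, and booted with SystemdCgroup=true for the runc runtime. To persist the migration one could use the subcommand containerd itself suggests; treat the redirection targets below as a sketch:

  containerd config migrate > /tmp/config-v3.toml       # print the current configuration migrated to the latest version
  containerd config default > /tmp/config-default.toml  # or start over from the built-in defaults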
Oct 13 06:53:18.372350 (kubelet)[2040]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 13 06:53:18.866688 kubelet[2040]: E1013 06:53:18.866608 2040 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 13 06:53:18.867806 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 13 06:53:18.867882 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 13 06:53:18.868054 systemd[1]: kubelet.service: Consumed 599ms CPU time, 270.1M memory peak. Oct 13 06:53:19.431616 systemd-timesyncd[1877]: Contacted time server 23.141.40.124:123 (0.flatcar.pool.ntp.org). Oct 13 06:53:19.431811 systemd-timesyncd[1877]: Initial clock synchronization to Mon 2025-10-13 06:53:19.315566 UTC. Oct 13 06:53:20.307036 coreos-metadata[1882]: Oct 13 06:53:20.306 INFO Fetch successful Oct 13 06:53:20.323089 coreos-metadata[1993]: Oct 13 06:53:20.322 INFO Fetch successful Oct 13 06:53:20.360115 unknown[1993]: wrote ssh authorized keys file for user: core Oct 13 06:53:20.375106 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Oct 13 06:53:20.382681 update-ssh-keys[2061]: Updated "/home/core/.ssh/authorized_keys" Oct 13 06:53:20.385264 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Oct 13 06:53:20.397517 systemd[1]: Finished sshkeys.service. Oct 13 06:53:20.406735 systemd[1]: Starting packet-phone-home.service - Report Success to Packet... Oct 13 06:53:20.459776 google_oslogin_nss_cache[1890]: oslogin_cache_refresh[1890]: Failure getting users, quitting Oct 13 06:53:20.459776 google_oslogin_nss_cache[1890]: oslogin_cache_refresh[1890]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 13 06:53:20.459725 oslogin_cache_refresh[1890]: Failure getting users, quitting Oct 13 06:53:20.460905 google_oslogin_nss_cache[1890]: oslogin_cache_refresh[1890]: Refreshing group entry cache Oct 13 06:53:20.459766 oslogin_cache_refresh[1890]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Oct 13 06:53:20.459874 oslogin_cache_refresh[1890]: Refreshing group entry cache Oct 13 06:53:20.461380 google_oslogin_nss_cache[1890]: oslogin_cache_refresh[1890]: Failure getting groups, quitting Oct 13 06:53:20.461380 google_oslogin_nss_cache[1890]: oslogin_cache_refresh[1890]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 13 06:53:20.461324 oslogin_cache_refresh[1890]: Failure getting groups, quitting Oct 13 06:53:20.461352 oslogin_cache_refresh[1890]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Oct 13 06:53:20.465225 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Oct 13 06:53:20.465839 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Oct 13 06:53:20.732110 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Oct 13 06:53:20.742476 systemd[1]: Started sshd@0-139.178.94.25:22-147.75.109.163:59356.service - OpenSSH per-connection server daemon (147.75.109.163:59356). 
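Note: the kubelet exits immediately because /var/lib/kubelet/config.yaml does not exist yet. On a kubeadm-style bootstrap that file is written by kubeadm init or kubeadm join, so this failure and the scheduled restarts that follow are expected until the node is actually joined. For orientation only (a sketch, not the file this node will end up with), a KubeletConfiguration of the kind that lands there starts roughly like:

    # /var/lib/kubelet/config.yaml  (illustrative sketch)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd            # matches the driver the kubelet later receives from containerd in this log
    staticPodPath: /etc/kubernetes/manifests
    clusterDomain: cluster.local
    clusterDNS:
      - 10.96.0.10                   # assumed kubeadm default service IP, not confirmed by this log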
Oct 13 06:53:20.825234 sshd[2071]: Accepted publickey for core from 147.75.109.163 port 59356 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:20.825910 sshd-session[2071]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:20.829565 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Oct 13 06:53:20.839419 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Oct 13 06:53:20.854598 systemd-logind[1915]: New session 1 of user core. Oct 13 06:53:20.878077 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Oct 13 06:53:20.892380 systemd[1]: Starting user@500.service - User Manager for UID 500... Oct 13 06:53:20.928537 (systemd)[2076]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Oct 13 06:53:20.934823 systemd-logind[1915]: New session c1 of user core. Oct 13 06:53:21.061237 systemd[2076]: Queued start job for default target default.target. Oct 13 06:53:21.076306 systemd[2076]: Created slice app.slice - User Application Slice. Oct 13 06:53:21.076339 systemd[2076]: Reached target paths.target - Paths. Oct 13 06:53:21.076359 systemd[2076]: Reached target timers.target - Timers. Oct 13 06:53:21.076966 systemd[2076]: Starting dbus.socket - D-Bus User Message Bus Socket... Oct 13 06:53:21.082279 systemd[2076]: Listening on dbus.socket - D-Bus User Message Bus Socket. Oct 13 06:53:21.082307 systemd[2076]: Reached target sockets.target - Sockets. Oct 13 06:53:21.082330 systemd[2076]: Reached target basic.target - Basic System. Oct 13 06:53:21.082351 systemd[2076]: Reached target default.target - Main User Target. Oct 13 06:53:21.082366 systemd[2076]: Startup finished in 131ms. Oct 13 06:53:21.082418 systemd[1]: Started user@500.service - User Manager for UID 500. Oct 13 06:53:21.091554 systemd[1]: Started session-1.scope - Session 1 of User core. Oct 13 06:53:21.164080 systemd[1]: Started sshd@1-139.178.94.25:22-147.75.109.163:59364.service - OpenSSH per-connection server daemon (147.75.109.163:59364). Oct 13 06:53:21.216940 sshd[2087]: Accepted publickey for core from 147.75.109.163 port 59364 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:21.217520 sshd-session[2087]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:21.220000 systemd-logind[1915]: New session 2 of user core. Oct 13 06:53:21.232905 systemd[1]: Started session-2.scope - Session 2 of User core. Oct 13 06:53:21.296055 sshd[2090]: Connection closed by 147.75.109.163 port 59364 Oct 13 06:53:21.296585 sshd-session[2087]: pam_unix(sshd:session): session closed for user core Oct 13 06:53:21.322022 systemd[1]: sshd@1-139.178.94.25:22-147.75.109.163:59364.service: Deactivated successfully. Oct 13 06:53:21.325766 systemd[1]: session-2.scope: Deactivated successfully. Oct 13 06:53:21.327973 systemd-logind[1915]: Session 2 logged out. Waiting for processes to exit. Oct 13 06:53:21.333275 systemd[1]: Started sshd@2-139.178.94.25:22-147.75.109.163:59378.service - OpenSSH per-connection server daemon (147.75.109.163:59378). Oct 13 06:53:21.346521 systemd-logind[1915]: Removed session 2. 
Oct 13 06:53:21.437570 sshd[2096]: Accepted publickey for core from 147.75.109.163 port 59378 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:21.438490 sshd-session[2096]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:21.442022 systemd-logind[1915]: New session 3 of user core. Oct 13 06:53:21.455874 systemd[1]: Started session-3.scope - Session 3 of User core. Oct 13 06:53:21.518352 sshd[2099]: Connection closed by 147.75.109.163 port 59378 Oct 13 06:53:21.518532 sshd-session[2096]: pam_unix(sshd:session): session closed for user core Oct 13 06:53:21.520082 systemd[1]: sshd@2-139.178.94.25:22-147.75.109.163:59378.service: Deactivated successfully. Oct 13 06:53:21.520967 systemd[1]: session-3.scope: Deactivated successfully. Oct 13 06:53:21.521547 systemd-logind[1915]: Session 3 logged out. Waiting for processes to exit. Oct 13 06:53:21.522176 systemd-logind[1915]: Removed session 3. Oct 13 06:53:21.834917 login[1983]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 13 06:53:21.839201 systemd-logind[1915]: New session 4 of user core. Oct 13 06:53:21.839643 systemd[1]: Started session-4.scope - Session 4 of User core. Oct 13 06:53:21.845416 login[1981]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Oct 13 06:53:21.847934 systemd-logind[1915]: New session 5 of user core. Oct 13 06:53:21.848486 systemd[1]: Started session-5.scope - Session 5 of User core. Oct 13 06:53:22.201212 systemd[1]: Finished packet-phone-home.service - Report Success to Packet. Oct 13 06:53:22.202257 systemd[1]: Reached target multi-user.target - Multi-User System. Oct 13 06:53:22.202503 systemd[1]: Startup finished in 4.256s (kernel) + 23.692s (initrd) + 10.301s (userspace) = 38.249s. Oct 13 06:53:29.067593 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Oct 13 06:53:29.070609 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:53:29.340164 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:53:29.344693 (kubelet)[2139]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 13 06:53:29.414227 kubelet[2139]: E1013 06:53:29.414179 2139 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 13 06:53:29.416901 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 13 06:53:29.416995 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 13 06:53:29.417192 systemd[1]: kubelet.service: Consumed 211ms CPU time, 115.9M memory peak. Oct 13 06:53:31.469034 systemd[1]: Started sshd@3-139.178.94.25:22-147.75.109.163:52664.service - OpenSSH per-connection server daemon (147.75.109.163:52664). Oct 13 06:53:31.508085 sshd[2160]: Accepted publickey for core from 147.75.109.163 port 52664 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:31.508775 sshd-session[2160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:31.511511 systemd-logind[1915]: New session 6 of user core. Oct 13 06:53:31.528902 systemd[1]: Started session-6.scope - Session 6 of User core. 
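Note: the "Startup finished in 4.256s (kernel) + 23.692s (initrd) + 10.301s (userspace) = 38.249s" summary above is the same figure systemd-analyze reports after boot. If the initrd share looks high (metadata fetching and the root filesystem resize happen early on this class of machine), the usual way to break it down is:

    systemd-analyze                 # overall kernel/initrd/userspace split
    systemd-analyze blame           # per-unit startup time, slowest first
    systemd-analyze critical-chain  # the dependency chain that gated boot completion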
Oct 13 06:53:31.581277 sshd[2163]: Connection closed by 147.75.109.163 port 52664 Oct 13 06:53:31.581428 sshd-session[2160]: pam_unix(sshd:session): session closed for user core Oct 13 06:53:31.594753 systemd[1]: sshd@3-139.178.94.25:22-147.75.109.163:52664.service: Deactivated successfully. Oct 13 06:53:31.595611 systemd[1]: session-6.scope: Deactivated successfully. Oct 13 06:53:31.596187 systemd-logind[1915]: Session 6 logged out. Waiting for processes to exit. Oct 13 06:53:31.597256 systemd[1]: Started sshd@4-139.178.94.25:22-147.75.109.163:52666.service - OpenSSH per-connection server daemon (147.75.109.163:52666). Oct 13 06:53:31.597886 systemd-logind[1915]: Removed session 6. Oct 13 06:53:31.640824 sshd[2169]: Accepted publickey for core from 147.75.109.163 port 52666 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:31.641579 sshd-session[2169]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:31.644781 systemd-logind[1915]: New session 7 of user core. Oct 13 06:53:31.656900 systemd[1]: Started session-7.scope - Session 7 of User core. Oct 13 06:53:31.708815 sshd[2173]: Connection closed by 147.75.109.163 port 52666 Oct 13 06:53:31.709554 sshd-session[2169]: pam_unix(sshd:session): session closed for user core Oct 13 06:53:31.728709 systemd[1]: sshd@4-139.178.94.25:22-147.75.109.163:52666.service: Deactivated successfully. Oct 13 06:53:31.729514 systemd[1]: session-7.scope: Deactivated successfully. Oct 13 06:53:31.729998 systemd-logind[1915]: Session 7 logged out. Waiting for processes to exit. Oct 13 06:53:31.730911 systemd[1]: Started sshd@5-139.178.94.25:22-147.75.109.163:46296.service - OpenSSH per-connection server daemon (147.75.109.163:46296). Oct 13 06:53:31.731551 systemd-logind[1915]: Removed session 7. Oct 13 06:53:31.771591 sshd[2179]: Accepted publickey for core from 147.75.109.163 port 46296 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:31.772407 sshd-session[2179]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:31.775733 systemd-logind[1915]: New session 8 of user core. Oct 13 06:53:31.785898 systemd[1]: Started session-8.scope - Session 8 of User core. Oct 13 06:53:31.848160 sshd[2182]: Connection closed by 147.75.109.163 port 46296 Oct 13 06:53:31.848938 sshd-session[2179]: pam_unix(sshd:session): session closed for user core Oct 13 06:53:31.871836 systemd[1]: sshd@5-139.178.94.25:22-147.75.109.163:46296.service: Deactivated successfully. Oct 13 06:53:31.875670 systemd[1]: session-8.scope: Deactivated successfully. Oct 13 06:53:31.877945 systemd-logind[1915]: Session 8 logged out. Waiting for processes to exit. Oct 13 06:53:31.883524 systemd[1]: Started sshd@6-139.178.94.25:22-147.75.109.163:46312.service - OpenSSH per-connection server daemon (147.75.109.163:46312). Oct 13 06:53:31.885368 systemd-logind[1915]: Removed session 8. Oct 13 06:53:31.990915 sshd[2188]: Accepted publickey for core from 147.75.109.163 port 46312 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:31.992109 sshd-session[2188]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:31.996089 systemd-logind[1915]: New session 9 of user core. Oct 13 06:53:32.013989 systemd[1]: Started session-9.scope - Session 9 of User core. 
Oct 13 06:53:32.080896 sudo[2193]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Oct 13 06:53:32.081036 sudo[2193]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:53:32.090063 sudo[2193]: pam_unix(sudo:session): session closed for user root Oct 13 06:53:32.090792 sshd[2192]: Connection closed by 147.75.109.163 port 46312 Oct 13 06:53:32.090964 sshd-session[2188]: pam_unix(sshd:session): session closed for user core Oct 13 06:53:32.103978 systemd[1]: sshd@6-139.178.94.25:22-147.75.109.163:46312.service: Deactivated successfully. Oct 13 06:53:32.104952 systemd[1]: session-9.scope: Deactivated successfully. Oct 13 06:53:32.105582 systemd-logind[1915]: Session 9 logged out. Waiting for processes to exit. Oct 13 06:53:32.106990 systemd[1]: Started sshd@7-139.178.94.25:22-147.75.109.163:46322.service - OpenSSH per-connection server daemon (147.75.109.163:46322). Oct 13 06:53:32.107427 systemd-logind[1915]: Removed session 9. Oct 13 06:53:32.155967 sshd[2199]: Accepted publickey for core from 147.75.109.163 port 46322 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:32.156944 sshd-session[2199]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:32.160704 systemd-logind[1915]: New session 10 of user core. Oct 13 06:53:32.170905 systemd[1]: Started session-10.scope - Session 10 of User core. Oct 13 06:53:32.227445 sudo[2204]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Oct 13 06:53:32.227582 sudo[2204]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:53:32.230223 sudo[2204]: pam_unix(sudo:session): session closed for user root Oct 13 06:53:32.232815 sudo[2203]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Oct 13 06:53:32.232952 sudo[2203]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:53:32.238363 systemd[1]: Starting audit-rules.service - Load Audit Rules... Oct 13 06:53:32.271987 augenrules[2226]: No rules Oct 13 06:53:32.272685 systemd[1]: audit-rules.service: Deactivated successfully. Oct 13 06:53:32.272920 systemd[1]: Finished audit-rules.service - Load Audit Rules. Oct 13 06:53:32.273832 sudo[2203]: pam_unix(sudo:session): session closed for user root Oct 13 06:53:32.275156 sshd[2202]: Connection closed by 147.75.109.163 port 46322 Oct 13 06:53:32.275520 sshd-session[2199]: pam_unix(sshd:session): session closed for user core Oct 13 06:53:32.292992 systemd[1]: sshd@7-139.178.94.25:22-147.75.109.163:46322.service: Deactivated successfully. Oct 13 06:53:32.296396 systemd[1]: session-10.scope: Deactivated successfully. Oct 13 06:53:32.298358 systemd-logind[1915]: Session 10 logged out. Waiting for processes to exit. Oct 13 06:53:32.303481 systemd[1]: Started sshd@8-139.178.94.25:22-147.75.109.163:46326.service - OpenSSH per-connection server daemon (147.75.109.163:46326). Oct 13 06:53:32.305162 systemd-logind[1915]: Removed session 10. Oct 13 06:53:32.378850 sshd[2235]: Accepted publickey for core from 147.75.109.163 port 46326 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 06:53:32.379371 sshd-session[2235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 06:53:32.381761 systemd-logind[1915]: New session 11 of user core. Oct 13 06:53:32.397900 systemd[1]: Started session-11.scope - Session 11 of User core. 
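Note: the audit-rules restart above ends with "No rules" because the two rule files were just removed from /etc/audit/rules.d. augenrules compiles whatever remains in that directory, so the resulting state can be checked with:

    augenrules --load    # regenerate and load rules from /etc/audit/rules.d/*.rules
    auditctl -l          # list the rules currently loaded in the kernel (prints "No rules" here)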
Oct 13 06:53:32.446573 sudo[2239]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Oct 13 06:53:32.446719 sudo[2239]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Oct 13 06:53:32.735636 systemd[1]: Starting docker.service - Docker Application Container Engine... Oct 13 06:53:32.757001 (dockerd)[2265]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Oct 13 06:53:32.962138 dockerd[2265]: time="2025-10-13T06:53:32.962081122Z" level=info msg="Starting up" Oct 13 06:53:32.962578 dockerd[2265]: time="2025-10-13T06:53:32.962544805Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Oct 13 06:53:32.968330 dockerd[2265]: time="2025-10-13T06:53:32.968281647Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Oct 13 06:53:32.991258 dockerd[2265]: time="2025-10-13T06:53:32.991184327Z" level=info msg="Loading containers: start." Oct 13 06:53:33.002661 kernel: Initializing XFRM netlink socket Oct 13 06:53:33.153237 systemd-networkd[1700]: docker0: Link UP Oct 13 06:53:33.170395 dockerd[2265]: time="2025-10-13T06:53:33.170380149Z" level=info msg="Loading containers: done." Oct 13 06:53:33.177014 dockerd[2265]: time="2025-10-13T06:53:33.176996719Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Oct 13 06:53:33.177079 dockerd[2265]: time="2025-10-13T06:53:33.177037075Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Oct 13 06:53:33.177079 dockerd[2265]: time="2025-10-13T06:53:33.177075464Z" level=info msg="Initializing buildkit" Oct 13 06:53:33.188087 dockerd[2265]: time="2025-10-13T06:53:33.188044465Z" level=info msg="Completed buildkit initialization" Oct 13 06:53:33.190430 dockerd[2265]: time="2025-10-13T06:53:33.190417594Z" level=info msg="Daemon has completed initialization" Oct 13 06:53:33.190481 dockerd[2265]: time="2025-10-13T06:53:33.190462381Z" level=info msg="API listen on /run/docker.sock" Oct 13 06:53:33.190518 systemd[1]: Started docker.service - Docker Application Container Engine. Oct 13 06:53:33.999197 containerd[1925]: time="2025-10-13T06:53:33.999171268Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Oct 13 06:53:34.463817 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2664631863.mount: Deactivated successfully. 
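Note: with docker.service up, the daemon details logged above (overlay2 storage driver, version 28.0.4, API on /run/docker.sock) can be confirmed from the CLI; the overlay2 "native diff" warning is informational and only affects image-build performance:

    docker info --format '{{.Driver}} {{.ServerVersion}}'   # expected here: overlay2 28.0.4
    docker version --format '{{.Server.Version}}'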
Oct 13 06:53:35.293169 containerd[1925]: time="2025-10-13T06:53:35.293139273Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:35.293437 containerd[1925]: time="2025-10-13T06:53:35.293318513Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28837916" Oct 13 06:53:35.293751 containerd[1925]: time="2025-10-13T06:53:35.293736400Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:35.294981 containerd[1925]: time="2025-10-13T06:53:35.294968542Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:35.295940 containerd[1925]: time="2025-10-13T06:53:35.295924469Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 1.296727056s" Oct 13 06:53:35.295969 containerd[1925]: time="2025-10-13T06:53:35.295942984Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\"" Oct 13 06:53:35.296240 containerd[1925]: time="2025-10-13T06:53:35.296194728Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Oct 13 06:53:36.379121 containerd[1925]: time="2025-10-13T06:53:36.379063454Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:36.379326 containerd[1925]: time="2025-10-13T06:53:36.379281893Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787027" Oct 13 06:53:36.379598 containerd[1925]: time="2025-10-13T06:53:36.379562801Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:36.380884 containerd[1925]: time="2025-10-13T06:53:36.380845008Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:36.381786 containerd[1925]: time="2025-10-13T06:53:36.381746795Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 1.085537878s" Oct 13 06:53:36.381786 containerd[1925]: time="2025-10-13T06:53:36.381761156Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\"" Oct 13 06:53:36.382013 
containerd[1925]: time="2025-10-13T06:53:36.381972097Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Oct 13 06:53:37.317400 containerd[1925]: time="2025-10-13T06:53:37.317342431Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:37.317596 containerd[1925]: time="2025-10-13T06:53:37.317551883Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176289" Oct 13 06:53:37.317915 containerd[1925]: time="2025-10-13T06:53:37.317875515Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:37.319202 containerd[1925]: time="2025-10-13T06:53:37.319162894Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:37.319719 containerd[1925]: time="2025-10-13T06:53:37.319697988Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 937.706351ms" Oct 13 06:53:37.319719 containerd[1925]: time="2025-10-13T06:53:37.319717877Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\"" Oct 13 06:53:37.320016 containerd[1925]: time="2025-10-13T06:53:37.319969152Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Oct 13 06:53:38.123210 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1253156931.mount: Deactivated successfully. 
Oct 13 06:53:38.317345 containerd[1925]: time="2025-10-13T06:53:38.317317433Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:38.317609 containerd[1925]: time="2025-10-13T06:53:38.317522488Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924206" Oct 13 06:53:38.317825 containerd[1925]: time="2025-10-13T06:53:38.317789624Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:38.318672 containerd[1925]: time="2025-10-13T06:53:38.318655537Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:38.319054 containerd[1925]: time="2025-10-13T06:53:38.319014771Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 999.030393ms" Oct 13 06:53:38.319054 containerd[1925]: time="2025-10-13T06:53:38.319029369Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\"" Oct 13 06:53:38.319263 containerd[1925]: time="2025-10-13T06:53:38.319252385Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Oct 13 06:53:38.827780 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2284329305.mount: Deactivated successfully. 
Oct 13 06:53:39.324444 containerd[1925]: time="2025-10-13T06:53:39.324420375Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:39.324701 containerd[1925]: time="2025-10-13T06:53:39.324592382Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Oct 13 06:53:39.325024 containerd[1925]: time="2025-10-13T06:53:39.325011094Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:39.326586 containerd[1925]: time="2025-10-13T06:53:39.326573775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:39.327044 containerd[1925]: time="2025-10-13T06:53:39.327030054Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.007763462s" Oct 13 06:53:39.327087 containerd[1925]: time="2025-10-13T06:53:39.327045654Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Oct 13 06:53:39.327340 containerd[1925]: time="2025-10-13T06:53:39.327328314Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Oct 13 06:53:39.565935 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Oct 13 06:53:39.566931 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:53:39.823187 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:53:39.825261 (kubelet)[2637]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Oct 13 06:53:39.849535 kubelet[2637]: E1013 06:53:39.849512 2637 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Oct 13 06:53:39.850692 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Oct 13 06:53:39.850781 systemd[1]: kubelet.service: Failed with result 'exit-code'. Oct 13 06:53:39.850959 systemd[1]: kubelet.service: Consumed 105ms CPU time, 112.8M memory peak. Oct 13 06:53:39.959516 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3772736600.mount: Deactivated successfully. 
Oct 13 06:53:39.960812 containerd[1925]: time="2025-10-13T06:53:39.960770960Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:53:39.961003 containerd[1925]: time="2025-10-13T06:53:39.960990433Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Oct 13 06:53:39.961329 containerd[1925]: time="2025-10-13T06:53:39.961315406Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:53:39.962125 containerd[1925]: time="2025-10-13T06:53:39.962112732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Oct 13 06:53:39.962533 containerd[1925]: time="2025-10-13T06:53:39.962520107Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 635.176384ms" Oct 13 06:53:39.962567 containerd[1925]: time="2025-10-13T06:53:39.962535909Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Oct 13 06:53:39.962771 containerd[1925]: time="2025-10-13T06:53:39.962758934Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Oct 13 06:53:40.367436 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3837464243.mount: Deactivated successfully. 
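Note: the pulls above are containerd fetching the kubeadm control-plane images (kube-apiserver, kube-controller-manager, kube-scheduler, kube-proxy, coredns, the pinned pause image, and next etcd) into its k8s.io namespace. The same images can be listed or pre-pulled by hand with ctr, which ships with containerd; crictl works as well if it is installed:

    ctr --namespace k8s.io images ls -q                              # images visible to the CRI plugin
    ctr --namespace k8s.io images pull registry.k8s.io/pause:3.10    # example of a manual pre-pull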
Oct 13 06:53:41.685116 containerd[1925]: time="2025-10-13T06:53:41.685060893Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:41.685322 containerd[1925]: time="2025-10-13T06:53:41.685240985Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682056" Oct 13 06:53:41.685582 containerd[1925]: time="2025-10-13T06:53:41.685570910Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:41.687315 containerd[1925]: time="2025-10-13T06:53:41.687273358Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:41.687815 containerd[1925]: time="2025-10-13T06:53:41.687775731Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.725002379s" Oct 13 06:53:41.687815 containerd[1925]: time="2025-10-13T06:53:41.687790253Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Oct 13 06:53:43.359422 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:53:43.359528 systemd[1]: kubelet.service: Consumed 105ms CPU time, 112.8M memory peak. Oct 13 06:53:43.360644 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:53:43.377180 systemd[1]: Reload requested from client PID 2761 ('systemctl') (unit session-11.scope)... Oct 13 06:53:43.377187 systemd[1]: Reloading... Oct 13 06:53:43.411767 zram_generator::config[2806]: No configuration found. Oct 13 06:53:43.557863 systemd[1]: Reloading finished in 180 ms. Oct 13 06:53:43.574098 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Oct 13 06:53:43.574141 systemd[1]: kubelet.service: Failed with result 'signal'. Oct 13 06:53:43.574264 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:53:43.574286 systemd[1]: kubelet.service: Consumed 44ms CPU time, 79.9M memory peak. Oct 13 06:53:43.575447 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:53:43.851583 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:53:43.854434 (kubelet)[2870]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 13 06:53:43.877709 kubelet[2870]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 06:53:43.877709 kubelet[2870]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Oct 13 06:53:43.877709 kubelet[2870]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 06:53:43.877926 kubelet[2870]: I1013 06:53:43.877775 2870 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 13 06:53:44.129261 kubelet[2870]: I1013 06:53:44.129172 2870 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Oct 13 06:53:44.129261 kubelet[2870]: I1013 06:53:44.129187 2870 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 13 06:53:44.129399 kubelet[2870]: I1013 06:53:44.129349 2870 server.go:954] "Client rotation is on, will bootstrap in background" Oct 13 06:53:44.151072 kubelet[2870]: E1013 06:53:44.151031 2870 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.94.25:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.94.25:6443: connect: connection refused" logger="UnhandledError" Oct 13 06:53:44.153249 kubelet[2870]: I1013 06:53:44.153217 2870 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 13 06:53:44.158937 kubelet[2870]: I1013 06:53:44.158930 2870 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 13 06:53:44.167032 kubelet[2870]: I1013 06:53:44.167000 2870 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Oct 13 06:53:44.167177 kubelet[2870]: I1013 06:53:44.167133 2870 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 13 06:53:44.167265 kubelet[2870]: I1013 06:53:44.167146 2870 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459.1.0-a-3e5fd6a38a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 13 06:53:44.167801 kubelet[2870]: 
I1013 06:53:44.167764 2870 topology_manager.go:138] "Creating topology manager with none policy" Oct 13 06:53:44.167801 kubelet[2870]: I1013 06:53:44.167773 2870 container_manager_linux.go:304] "Creating device plugin manager" Oct 13 06:53:44.167851 kubelet[2870]: I1013 06:53:44.167833 2870 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:53:44.170699 kubelet[2870]: I1013 06:53:44.170637 2870 kubelet.go:446] "Attempting to sync node with API server" Oct 13 06:53:44.170699 kubelet[2870]: I1013 06:53:44.170651 2870 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 13 06:53:44.170699 kubelet[2870]: I1013 06:53:44.170699 2870 kubelet.go:352] "Adding apiserver pod source" Oct 13 06:53:44.170803 kubelet[2870]: I1013 06:53:44.170706 2870 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 13 06:53:44.173434 kubelet[2870]: W1013 06:53:44.173410 2870 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.94.25:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.94.25:6443: connect: connection refused Oct 13 06:53:44.173467 kubelet[2870]: E1013 06:53:44.173454 2870 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.94.25:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.94.25:6443: connect: connection refused" logger="UnhandledError" Oct 13 06:53:44.174151 kubelet[2870]: I1013 06:53:44.174138 2870 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 13 06:53:44.174593 kubelet[2870]: W1013 06:53:44.174572 2870 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.94.25:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.1.0-a-3e5fd6a38a&limit=500&resourceVersion=0": dial tcp 139.178.94.25:6443: connect: connection refused Oct 13 06:53:44.174630 kubelet[2870]: E1013 06:53:44.174598 2870 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.94.25:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459.1.0-a-3e5fd6a38a&limit=500&resourceVersion=0\": dial tcp 139.178.94.25:6443: connect: connection refused" logger="UnhandledError" Oct 13 06:53:44.174746 kubelet[2870]: I1013 06:53:44.174724 2870 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 13 06:53:44.174783 kubelet[2870]: W1013 06:53:44.174777 2870 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
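Note: the container_manager_linux NodeConfig dump above shows the kubelet's stock hard-eviction thresholds (memory.available<100Mi, nodefs.available<10%, nodefs.inodesFree<5%, imagefs.available<15%, imagefs.inodesFree<5%). Expressed in KubeletConfiguration terms, those defaults correspond to an evictionHard block along these lines (shown only to decode the dump, not copied from this node's file):

    evictionHard:
      memory.available: "100Mi"
      nodefs.available: "10%"
      nodefs.inodesFree: "5%"
      imagefs.available: "15%"
      imagefs.inodesFree: "5%"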
Oct 13 06:53:44.176208 kubelet[2870]: I1013 06:53:44.176201 2870 watchdog_linux.go:99] "Systemd watchdog is not enabled" Oct 13 06:53:44.176239 kubelet[2870]: I1013 06:53:44.176222 2870 server.go:1287] "Started kubelet" Oct 13 06:53:44.176279 kubelet[2870]: I1013 06:53:44.176261 2870 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Oct 13 06:53:44.176359 kubelet[2870]: I1013 06:53:44.176311 2870 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 13 06:53:44.176514 kubelet[2870]: I1013 06:53:44.176504 2870 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 13 06:53:44.177005 kubelet[2870]: I1013 06:53:44.176995 2870 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 13 06:53:44.177005 kubelet[2870]: I1013 06:53:44.176998 2870 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 13 06:53:44.177073 kubelet[2870]: I1013 06:53:44.177039 2870 volume_manager.go:297] "Starting Kubelet Volume Manager" Oct 13 06:53:44.177073 kubelet[2870]: I1013 06:53:44.177067 2870 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Oct 13 06:53:44.177130 kubelet[2870]: I1013 06:53:44.177079 2870 server.go:479] "Adding debug handlers to kubelet server" Oct 13 06:53:44.177130 kubelet[2870]: I1013 06:53:44.177120 2870 reconciler.go:26] "Reconciler: start to sync state" Oct 13 06:53:44.177183 kubelet[2870]: E1013 06:53:44.177057 2870 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" Oct 13 06:53:44.177215 kubelet[2870]: E1013 06:53:44.177197 2870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.25:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.1.0-a-3e5fd6a38a?timeout=10s\": dial tcp 139.178.94.25:6443: connect: connection refused" interval="200ms" Oct 13 06:53:44.177273 kubelet[2870]: W1013 06:53:44.177244 2870 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.94.25:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.94.25:6443: connect: connection refused Oct 13 06:53:44.177314 kubelet[2870]: E1013 06:53:44.177281 2870 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.94.25:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.94.25:6443: connect: connection refused" logger="UnhandledError" Oct 13 06:53:44.177416 kubelet[2870]: I1013 06:53:44.177402 2870 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 13 06:53:44.178692 kubelet[2870]: I1013 06:53:44.178683 2870 factory.go:221] Registration of the containerd container factory successfully Oct 13 06:53:44.178692 kubelet[2870]: I1013 06:53:44.178690 2870 factory.go:221] Registration of the systemd container factory successfully Oct 13 06:53:44.178762 kubelet[2870]: E1013 06:53:44.178740 2870 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 13 06:53:44.181440 kubelet[2870]: E1013 06:53:44.179613 2870 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.94.25:6443/api/v1/namespaces/default/events\": dial tcp 139.178.94.25:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459.1.0-a-3e5fd6a38a.186dfa78597e6971 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459.1.0-a-3e5fd6a38a,UID:ci-4459.1.0-a-3e5fd6a38a,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459.1.0-a-3e5fd6a38a,},FirstTimestamp:2025-10-13 06:53:44.176208241 +0000 UTC m=+0.319812316,LastTimestamp:2025-10-13 06:53:44.176208241 +0000 UTC m=+0.319812316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459.1.0-a-3e5fd6a38a,}" Oct 13 06:53:44.186369 kubelet[2870]: I1013 06:53:44.186347 2870 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 13 06:53:44.186424 kubelet[2870]: I1013 06:53:44.186398 2870 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 13 06:53:44.186424 kubelet[2870]: I1013 06:53:44.186405 2870 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 13 06:53:44.186424 kubelet[2870]: I1013 06:53:44.186414 2870 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:53:44.186885 kubelet[2870]: I1013 06:53:44.186874 2870 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 13 06:53:44.186885 kubelet[2870]: I1013 06:53:44.186886 2870 status_manager.go:227] "Starting to sync pod status with apiserver" Oct 13 06:53:44.187137 kubelet[2870]: I1013 06:53:44.186897 2870 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Oct 13 06:53:44.187137 kubelet[2870]: I1013 06:53:44.186902 2870 kubelet.go:2382] "Starting kubelet main sync loop" Oct 13 06:53:44.187137 kubelet[2870]: E1013 06:53:44.186928 2870 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 13 06:53:44.187211 kubelet[2870]: W1013 06:53:44.187184 2870 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.94.25:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.94.25:6443: connect: connection refused Oct 13 06:53:44.187239 kubelet[2870]: E1013 06:53:44.187228 2870 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.94.25:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.94.25:6443: connect: connection refused" logger="UnhandledError" Oct 13 06:53:44.187624 kubelet[2870]: I1013 06:53:44.187618 2870 policy_none.go:49] "None policy: Start" Oct 13 06:53:44.187647 kubelet[2870]: I1013 06:53:44.187626 2870 memory_manager.go:186] "Starting memorymanager" policy="None" Oct 13 06:53:44.187647 kubelet[2870]: I1013 06:53:44.187633 2870 state_mem.go:35] "Initializing new in-memory state store" Oct 13 06:53:44.190422 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
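Note: the repeated "dial tcp 139.178.94.25:6443: connect: connection refused" errors from the reflectors, the lease controller, and the certificate manager are the normal chicken-and-egg phase of a control-plane node bootstrapping itself: the kubelet is up, but the kube-apiserver it is trying to reach exists only as the static pod it is about to start. Once the apiserver sandbox and container created below are running, checks of this sort start succeeding:

    ss -tlnp | grep 6443                          # apiserver listening locally
    curl -k https://139.178.94.25:6443/healthz    # responds once the apiserver is up and healthy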
Oct 13 06:53:44.212532 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Oct 13 06:53:44.214653 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Oct 13 06:53:44.230350 kubelet[2870]: I1013 06:53:44.230330 2870 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 13 06:53:44.230496 kubelet[2870]: I1013 06:53:44.230480 2870 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 13 06:53:44.230557 kubelet[2870]: I1013 06:53:44.230491 2870 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 13 06:53:44.230678 kubelet[2870]: I1013 06:53:44.230650 2870 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 13 06:53:44.231158 kubelet[2870]: E1013 06:53:44.231138 2870 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Oct 13 06:53:44.231232 kubelet[2870]: E1013 06:53:44.231182 2870 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459.1.0-a-3e5fd6a38a\" not found" Oct 13 06:53:44.311858 systemd[1]: Created slice kubepods-burstable-pode69f1455e87035beef1d37b889b9cc0a.slice - libcontainer container kubepods-burstable-pode69f1455e87035beef1d37b889b9cc0a.slice. Oct 13 06:53:44.334405 kubelet[2870]: I1013 06:53:44.334287 2870 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.335188 kubelet[2870]: E1013 06:53:44.335083 2870 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.25:6443/api/v1/nodes\": dial tcp 139.178.94.25:6443: connect: connection refused" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.339177 kubelet[2870]: E1013 06:53:44.339093 2870 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.347646 systemd[1]: Created slice kubepods-burstable-pod5886bdb40e1bde6dc58c15487a28bc6d.slice - libcontainer container kubepods-burstable-pod5886bdb40e1bde6dc58c15487a28bc6d.slice. Oct 13 06:53:44.365750 kubelet[2870]: E1013 06:53:44.365651 2870 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.373159 systemd[1]: Created slice kubepods-burstable-pod61af6362359a4aeb4497d3c9433deb60.slice - libcontainer container kubepods-burstable-pod61af6362359a4aeb4497d3c9433deb60.slice. 
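Note: the kubepods-burstable-pod<UID>.slice units created above correspond to the three control-plane static pods the kubelet found under its static pod path, /etc/kubernetes/manifests (logged earlier as "Adding static pod path"). A heavily trimmed sketch of the kind of manifest that sits there, using the apiserver image version pulled earlier in this log (command-line flags and most fields omitted, so this is not the node's actual file):

    # /etc/kubernetes/manifests/kube-apiserver.yaml  (trimmed illustration)
    apiVersion: v1
    kind: Pod
    metadata:
      name: kube-apiserver
      namespace: kube-system
    spec:
      hostNetwork: true
      containers:
      - name: kube-apiserver
        image: registry.k8s.io/kube-apiserver:v1.32.9
        volumeMounts:
        - name: k8s-certs
          mountPath: /etc/kubernetes/pki
          readOnly: true
      volumes:
      - name: k8s-certs
        hostPath:
          path: /etc/kubernetes/pki
          type: DirectoryOrCreate

The hostPath volume names (ca-certs, k8s-certs, kubeconfig, flexvolume-dir, usr-share-ca-certificates) match the VerifyControllerAttachedVolume entries logged below for the apiserver and controller-manager pods.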
Oct 13 06:53:44.377750 kubelet[2870]: E1013 06:53:44.377694 2870 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.378297 kubelet[2870]: I1013 06:53:44.378237 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/61af6362359a4aeb4497d3c9433deb60-kubeconfig\") pod \"kube-scheduler-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"61af6362359a4aeb4497d3c9433deb60\") " pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.378504 kubelet[2870]: E1013 06:53:44.378237 2870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.25:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.1.0-a-3e5fd6a38a?timeout=10s\": dial tcp 139.178.94.25:6443: connect: connection refused" interval="400ms" Oct 13 06:53:44.378504 kubelet[2870]: I1013 06:53:44.378320 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e69f1455e87035beef1d37b889b9cc0a-ca-certs\") pod \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"e69f1455e87035beef1d37b889b9cc0a\") " pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.378504 kubelet[2870]: I1013 06:53:44.378391 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e69f1455e87035beef1d37b889b9cc0a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"e69f1455e87035beef1d37b889b9cc0a\") " pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.378504 kubelet[2870]: I1013 06:53:44.378441 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-ca-certs\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.378504 kubelet[2870]: I1013 06:53:44.378492 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.379043 kubelet[2870]: I1013 06:53:44.378536 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e69f1455e87035beef1d37b889b9cc0a-k8s-certs\") pod \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"e69f1455e87035beef1d37b889b9cc0a\") " pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.379043 kubelet[2870]: I1013 06:53:44.378624 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " 
pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.379043 kubelet[2870]: I1013 06:53:44.378699 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-k8s-certs\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.379043 kubelet[2870]: I1013 06:53:44.378753 2870 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-kubeconfig\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.539468 kubelet[2870]: I1013 06:53:44.539292 2870 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.540190 kubelet[2870]: E1013 06:53:44.540111 2870 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://139.178.94.25:6443/api/v1/nodes\": dial tcp 139.178.94.25:6443: connect: connection refused" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:44.641617 containerd[1925]: time="2025-10-13T06:53:44.641483133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.1.0-a-3e5fd6a38a,Uid:e69f1455e87035beef1d37b889b9cc0a,Namespace:kube-system,Attempt:0,}" Oct 13 06:53:44.662038 containerd[1925]: time="2025-10-13T06:53:44.662018393Z" level=info msg="connecting to shim f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1" address="unix:///run/containerd/s/5a277295096889e05ebde538e97032097dd39b93a136e4e9a647190c75eb3205" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:53:44.667293 containerd[1925]: time="2025-10-13T06:53:44.667245619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a,Uid:5886bdb40e1bde6dc58c15487a28bc6d,Namespace:kube-system,Attempt:0,}" Oct 13 06:53:44.675325 containerd[1925]: time="2025-10-13T06:53:44.675276487Z" level=info msg="connecting to shim 1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf" address="unix:///run/containerd/s/11ddc25e4c29c4d32e8912940baccb18298076a600f2c2d96f0da005c465a768" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:53:44.679505 containerd[1925]: time="2025-10-13T06:53:44.679460651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.1.0-a-3e5fd6a38a,Uid:61af6362359a4aeb4497d3c9433deb60,Namespace:kube-system,Attempt:0,}" Oct 13 06:53:44.682786 systemd[1]: Started cri-containerd-f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1.scope - libcontainer container f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1. Oct 13 06:53:44.686503 containerd[1925]: time="2025-10-13T06:53:44.686479394Z" level=info msg="connecting to shim 6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689" address="unix:///run/containerd/s/188cac417fe0057330cb7f2e8380d8fa362ba2295df7a5932414d773ca2d3568" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:53:44.689322 systemd[1]: Started cri-containerd-1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf.scope - libcontainer container 1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf. 
Oct 13 06:53:44.694476 systemd[1]: Started cri-containerd-6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689.scope - libcontainer container 6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689. Oct 13 06:53:44.712286 containerd[1925]: time="2025-10-13T06:53:44.712264779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459.1.0-a-3e5fd6a38a,Uid:e69f1455e87035beef1d37b889b9cc0a,Namespace:kube-system,Attempt:0,} returns sandbox id \"f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1\"" Oct 13 06:53:44.713511 containerd[1925]: time="2025-10-13T06:53:44.713499599Z" level=info msg="CreateContainer within sandbox \"f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Oct 13 06:53:44.716731 containerd[1925]: time="2025-10-13T06:53:44.716716957Z" level=info msg="Container 56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:53:44.719093 containerd[1925]: time="2025-10-13T06:53:44.719078909Z" level=info msg="CreateContainer within sandbox \"f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17\"" Oct 13 06:53:44.719314 containerd[1925]: time="2025-10-13T06:53:44.719304358Z" level=info msg="StartContainer for \"56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17\"" Oct 13 06:53:44.719859 containerd[1925]: time="2025-10-13T06:53:44.719847181Z" level=info msg="connecting to shim 56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17" address="unix:///run/containerd/s/5a277295096889e05ebde538e97032097dd39b93a136e4e9a647190c75eb3205" protocol=ttrpc version=3 Oct 13 06:53:44.723557 containerd[1925]: time="2025-10-13T06:53:44.723530452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a,Uid:5886bdb40e1bde6dc58c15487a28bc6d,Namespace:kube-system,Attempt:0,} returns sandbox id \"1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf\"" Oct 13 06:53:44.724644 containerd[1925]: time="2025-10-13T06:53:44.724630944Z" level=info msg="CreateContainer within sandbox \"1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Oct 13 06:53:44.726516 containerd[1925]: time="2025-10-13T06:53:44.726498667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459.1.0-a-3e5fd6a38a,Uid:61af6362359a4aeb4497d3c9433deb60,Namespace:kube-system,Attempt:0,} returns sandbox id \"6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689\"" Oct 13 06:53:44.727409 containerd[1925]: time="2025-10-13T06:53:44.727395411Z" level=info msg="CreateContainer within sandbox \"6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Oct 13 06:53:44.727910 containerd[1925]: time="2025-10-13T06:53:44.727899860Z" level=info msg="Container 833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:53:44.730204 containerd[1925]: time="2025-10-13T06:53:44.730192149Z" level=info msg="Container 0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:53:44.731106 containerd[1925]: 
time="2025-10-13T06:53:44.731096110Z" level=info msg="CreateContainer within sandbox \"1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058\"" Oct 13 06:53:44.731245 containerd[1925]: time="2025-10-13T06:53:44.731232886Z" level=info msg="StartContainer for \"833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058\"" Oct 13 06:53:44.731783 containerd[1925]: time="2025-10-13T06:53:44.731769752Z" level=info msg="connecting to shim 833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058" address="unix:///run/containerd/s/11ddc25e4c29c4d32e8912940baccb18298076a600f2c2d96f0da005c465a768" protocol=ttrpc version=3 Oct 13 06:53:44.732363 containerd[1925]: time="2025-10-13T06:53:44.732350593Z" level=info msg="CreateContainer within sandbox \"6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01\"" Oct 13 06:53:44.732493 containerd[1925]: time="2025-10-13T06:53:44.732483199Z" level=info msg="StartContainer for \"0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01\"" Oct 13 06:53:44.732991 containerd[1925]: time="2025-10-13T06:53:44.732977380Z" level=info msg="connecting to shim 0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01" address="unix:///run/containerd/s/188cac417fe0057330cb7f2e8380d8fa362ba2295df7a5932414d773ca2d3568" protocol=ttrpc version=3 Oct 13 06:53:44.736781 systemd[1]: Started cri-containerd-56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17.scope - libcontainer container 56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17. Oct 13 06:53:44.738630 systemd[1]: Started cri-containerd-833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058.scope - libcontainer container 833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058. Oct 13 06:53:44.740098 systemd[1]: Started cri-containerd-0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01.scope - libcontainer container 0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01. 
Oct 13 06:53:44.764321 containerd[1925]: time="2025-10-13T06:53:44.764299073Z" level=info msg="StartContainer for \"56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17\" returns successfully" Oct 13 06:53:44.765343 containerd[1925]: time="2025-10-13T06:53:44.765326737Z" level=info msg="StartContainer for \"833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058\" returns successfully" Oct 13 06:53:44.768349 containerd[1925]: time="2025-10-13T06:53:44.768328104Z" level=info msg="StartContainer for \"0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01\" returns successfully" Oct 13 06:53:44.778993 kubelet[2870]: E1013 06:53:44.778937 2870 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.94.25:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459.1.0-a-3e5fd6a38a?timeout=10s\": dial tcp 139.178.94.25:6443: connect: connection refused" interval="800ms" Oct 13 06:53:44.941564 kubelet[2870]: I1013 06:53:44.941546 2870 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.191379 kubelet[2870]: E1013 06:53:45.191319 2870 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.192064 kubelet[2870]: E1013 06:53:45.191996 2870 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.192730 kubelet[2870]: E1013 06:53:45.192666 2870 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.585965 kubelet[2870]: I1013 06:53:45.585893 2870 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.585965 kubelet[2870]: E1013 06:53:45.585983 2870 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4459.1.0-a-3e5fd6a38a\": node \"ci-4459.1.0-a-3e5fd6a38a\" not found" Oct 13 06:53:45.678344 kubelet[2870]: I1013 06:53:45.678256 2870 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.686083 kubelet[2870]: E1013 06:53:45.686006 2870 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.686083 kubelet[2870]: I1013 06:53:45.686047 2870 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.688915 kubelet[2870]: E1013 06:53:45.688841 2870 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.688915 kubelet[2870]: I1013 06:53:45.688880 2870 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:45.691396 kubelet[2870]: E1013 06:53:45.691320 2870 kubelet.go:3196] "Failed creating a mirror pod" err="pods 
\"kube-scheduler-ci-4459.1.0-a-3e5fd6a38a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:46.173031 kubelet[2870]: I1013 06:53:46.172948 2870 apiserver.go:52] "Watching apiserver" Oct 13 06:53:46.177933 kubelet[2870]: I1013 06:53:46.177868 2870 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Oct 13 06:53:46.193984 kubelet[2870]: I1013 06:53:46.193921 2870 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:46.194169 kubelet[2870]: I1013 06:53:46.194104 2870 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:46.197770 kubelet[2870]: E1013 06:53:46.197714 2870 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.1.0-a-3e5fd6a38a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:46.197770 kubelet[2870]: E1013 06:53:46.197717 2870 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:46.756418 kubelet[2870]: I1013 06:53:46.756350 2870 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:46.762935 kubelet[2870]: W1013 06:53:46.762839 2870 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Oct 13 06:53:47.609331 kubelet[2870]: I1013 06:53:47.609265 2870 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:47.616192 kubelet[2870]: W1013 06:53:47.616133 2870 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Oct 13 06:53:47.810532 systemd[1]: Reload requested from client PID 3182 ('systemctl') (unit session-11.scope)... Oct 13 06:53:47.810540 systemd[1]: Reloading... Oct 13 06:53:47.850721 zram_generator::config[3227]: No configuration found. Oct 13 06:53:48.005792 systemd[1]: Reloading finished in 195 ms. Oct 13 06:53:48.037672 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:53:48.049142 systemd[1]: kubelet.service: Deactivated successfully. Oct 13 06:53:48.049281 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:53:48.049306 systemd[1]: kubelet.service: Consumed 827ms CPU time, 140M memory peak. Oct 13 06:53:48.050696 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Oct 13 06:53:48.341118 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Oct 13 06:53:48.343368 (kubelet)[3292]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Oct 13 06:53:48.364737 kubelet[3292]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 13 06:53:48.364737 kubelet[3292]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Oct 13 06:53:48.364737 kubelet[3292]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 06:53:48.364984 kubelet[3292]: I1013 06:53:48.364779 3292 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 13 06:53:48.369448 kubelet[3292]: I1013 06:53:48.369432 3292 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Oct 13 06:53:48.369448 kubelet[3292]: I1013 06:53:48.369446 3292 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 13 06:53:48.369614 kubelet[3292]: I1013 06:53:48.369607 3292 server.go:954] "Client rotation is on, will bootstrap in background" Oct 13 06:53:48.370392 kubelet[3292]: I1013 06:53:48.370352 3292 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 13 06:53:48.374083 kubelet[3292]: I1013 06:53:48.374043 3292 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Oct 13 06:53:48.375991 kubelet[3292]: I1013 06:53:48.375982 3292 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 13 06:53:48.383614 kubelet[3292]: I1013 06:53:48.383567 3292 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Oct 13 06:53:48.383718 kubelet[3292]: I1013 06:53:48.383692 3292 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Oct 13 06:53:48.383860 kubelet[3292]: I1013 06:53:48.383713 3292 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ci-4459.1.0-a-3e5fd6a38a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Oct 13 06:53:48.383860 kubelet[3292]: I1013 06:53:48.383837 3292 topology_manager.go:138] "Creating topology manager with none policy" Oct 13 06:53:48.383860 kubelet[3292]: I1013 06:53:48.383844 3292 container_manager_linux.go:304] "Creating device plugin manager" Oct 13 06:53:48.383967 kubelet[3292]: I1013 06:53:48.383877 3292 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:53:48.384030 kubelet[3292]: I1013 06:53:48.383997 3292 kubelet.go:446] "Attempting to sync node with API server" Oct 13 06:53:48.384030 kubelet[3292]: I1013 06:53:48.384010 3292 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Oct 13 06:53:48.384030 kubelet[3292]: I1013 06:53:48.384023 3292 kubelet.go:352] "Adding apiserver pod source" Oct 13 06:53:48.384085 kubelet[3292]: I1013 06:53:48.384033 3292 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Oct 13 06:53:48.384503 kubelet[3292]: I1013 06:53:48.384492 3292 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Oct 13 06:53:48.384765 kubelet[3292]: I1013 06:53:48.384757 3292 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Oct 13 06:53:48.385010 kubelet[3292]: I1013 06:53:48.385004 3292 watchdog_linux.go:99] "Systemd watchdog is not enabled" Oct 13 06:53:48.385035 kubelet[3292]: I1013 06:53:48.385021 3292 server.go:1287] "Started kubelet" Oct 13 06:53:48.385127 kubelet[3292]: I1013 06:53:48.385106 3292 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Oct 13 06:53:48.385127 kubelet[3292]: I1013 06:53:48.385089 3292 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Oct 13 06:53:48.385257 kubelet[3292]: I1013 06:53:48.385248 3292 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Oct 13 06:53:48.385925 kubelet[3292]: I1013 06:53:48.385914 3292 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Oct 13 06:53:48.385971 kubelet[3292]: I1013 06:53:48.385921 3292 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Oct 13 06:53:48.385971 kubelet[3292]: I1013 06:53:48.385943 3292 server.go:479] "Adding debug handlers to kubelet server" Oct 13 06:53:48.386037 kubelet[3292]: E1013 06:53:48.385976 3292 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459.1.0-a-3e5fd6a38a\" not found" Oct 13 06:53:48.386037 kubelet[3292]: I1013 06:53:48.385997 3292 volume_manager.go:297] "Starting Kubelet Volume Manager" Oct 13 06:53:48.386106 kubelet[3292]: I1013 06:53:48.386053 3292 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Oct 13 06:53:48.386153 kubelet[3292]: I1013 06:53:48.386142 3292 reconciler.go:26] "Reconciler: start to sync state" Oct 13 06:53:48.386522 kubelet[3292]: E1013 06:53:48.386468 3292 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Oct 13 06:53:48.387627 kubelet[3292]: I1013 06:53:48.386933 3292 factory.go:221] Registration of the systemd container factory successfully Oct 13 06:53:48.387627 kubelet[3292]: I1013 06:53:48.387109 3292 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Oct 13 06:53:48.390035 kubelet[3292]: I1013 06:53:48.389393 3292 factory.go:221] Registration of the containerd container factory successfully Oct 13 06:53:48.392570 kubelet[3292]: I1013 06:53:48.392545 3292 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 13 06:53:48.393156 kubelet[3292]: I1013 06:53:48.393143 3292 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 13 06:53:48.393224 kubelet[3292]: I1013 06:53:48.393169 3292 status_manager.go:227] "Starting to sync pod status with apiserver" Oct 13 06:53:48.393224 kubelet[3292]: I1013 06:53:48.393184 3292 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
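The restarted kubelet (PID 3292) again picks up its static pods from the manifest directory logged above ("Adding static pod path" path="/etc/kubernetes/manifests"). A trivial sketch that lists what it would find there on this node:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/manifests" // static pod path from the kubelet log above
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read manifest dir:", err)
		return
	}
	for _, e := range entries {
		info, err := e.Info()
		if err != nil {
			continue
		}
		fmt.Printf("%s\t%d bytes\n", filepath.Join(dir, e.Name()), info.Size())
	}
}
```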
Oct 13 06:53:48.393224 kubelet[3292]: I1013 06:53:48.393189 3292 kubelet.go:2382] "Starting kubelet main sync loop" Oct 13 06:53:48.393224 kubelet[3292]: E1013 06:53:48.393213 3292 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 13 06:53:48.403592 kubelet[3292]: I1013 06:53:48.403551 3292 cpu_manager.go:221] "Starting CPU manager" policy="none" Oct 13 06:53:48.403592 kubelet[3292]: I1013 06:53:48.403559 3292 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Oct 13 06:53:48.403592 kubelet[3292]: I1013 06:53:48.403569 3292 state_mem.go:36] "Initialized new in-memory state store" Oct 13 06:53:48.403697 kubelet[3292]: I1013 06:53:48.403651 3292 state_mem.go:88] "Updated default CPUSet" cpuSet="" Oct 13 06:53:48.403697 kubelet[3292]: I1013 06:53:48.403661 3292 state_mem.go:96] "Updated CPUSet assignments" assignments={} Oct 13 06:53:48.403697 kubelet[3292]: I1013 06:53:48.403674 3292 policy_none.go:49] "None policy: Start" Oct 13 06:53:48.403697 kubelet[3292]: I1013 06:53:48.403679 3292 memory_manager.go:186] "Starting memorymanager" policy="None" Oct 13 06:53:48.403697 kubelet[3292]: I1013 06:53:48.403685 3292 state_mem.go:35] "Initializing new in-memory state store" Oct 13 06:53:48.403769 kubelet[3292]: I1013 06:53:48.403741 3292 state_mem.go:75] "Updated machine memory state" Oct 13 06:53:48.405523 kubelet[3292]: I1013 06:53:48.405486 3292 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 13 06:53:48.405571 kubelet[3292]: I1013 06:53:48.405566 3292 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 13 06:53:48.405596 kubelet[3292]: I1013 06:53:48.405572 3292 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 13 06:53:48.405650 kubelet[3292]: I1013 06:53:48.405644 3292 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 13 06:53:48.405988 kubelet[3292]: E1013 06:53:48.405947 3292 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Oct 13 06:53:48.495438 kubelet[3292]: I1013 06:53:48.495282 3292 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.495438 kubelet[3292]: I1013 06:53:48.495393 3292 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.495786 kubelet[3292]: I1013 06:53:48.495542 3292 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.503629 kubelet[3292]: W1013 06:53:48.503523 3292 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Oct 13 06:53:48.503629 kubelet[3292]: W1013 06:53:48.503585 3292 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Oct 13 06:53:48.503936 kubelet[3292]: E1013 06:53:48.503751 3292 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" already exists" pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.503936 kubelet[3292]: W1013 06:53:48.503870 3292 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Oct 13 06:53:48.504139 kubelet[3292]: E1013 06:53:48.503991 3292 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.1.0-a-3e5fd6a38a\" already exists" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.512517 kubelet[3292]: I1013 06:53:48.512426 3292 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.522113 kubelet[3292]: I1013 06:53:48.522043 3292 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.522320 kubelet[3292]: I1013 06:53:48.522191 3292 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.587324 kubelet[3292]: I1013 06:53:48.587242 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-ca-certs\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.587734 kubelet[3292]: I1013 06:53:48.587372 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-flexvolume-dir\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.587734 kubelet[3292]: I1013 06:53:48.587463 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-k8s-certs\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.587734 
kubelet[3292]: I1013 06:53:48.587537 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.587734 kubelet[3292]: I1013 06:53:48.587613 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/61af6362359a4aeb4497d3c9433deb60-kubeconfig\") pod \"kube-scheduler-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"61af6362359a4aeb4497d3c9433deb60\") " pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.588082 kubelet[3292]: I1013 06:53:48.587733 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e69f1455e87035beef1d37b889b9cc0a-k8s-certs\") pod \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"e69f1455e87035beef1d37b889b9cc0a\") " pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.588082 kubelet[3292]: I1013 06:53:48.587788 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/5886bdb40e1bde6dc58c15487a28bc6d-kubeconfig\") pod \"kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"5886bdb40e1bde6dc58c15487a28bc6d\") " pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.588082 kubelet[3292]: I1013 06:53:48.587856 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e69f1455e87035beef1d37b889b9cc0a-ca-certs\") pod \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"e69f1455e87035beef1d37b889b9cc0a\") " pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:48.588082 kubelet[3292]: I1013 06:53:48.587923 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e69f1455e87035beef1d37b889b9cc0a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" (UID: \"e69f1455e87035beef1d37b889b9cc0a\") " pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:49.384376 kubelet[3292]: I1013 06:53:49.384330 3292 apiserver.go:52] "Watching apiserver" Oct 13 06:53:49.386985 kubelet[3292]: I1013 06:53:49.386938 3292 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Oct 13 06:53:49.397011 kubelet[3292]: I1013 06:53:49.396987 3292 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:49.397127 kubelet[3292]: I1013 06:53:49.397119 3292 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:49.400171 kubelet[3292]: W1013 06:53:49.400153 3292 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Oct 13 06:53:49.400171 kubelet[3292]: W1013 06:53:49.400168 3292 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising 
behavior; a DNS label is recommended: [must not contain dots] Oct 13 06:53:49.400298 kubelet[3292]: E1013 06:53:49.400197 3292 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459.1.0-a-3e5fd6a38a\" already exists" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:49.400298 kubelet[3292]: E1013 06:53:49.400206 3292 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459.1.0-a-3e5fd6a38a\" already exists" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:53:49.412728 kubelet[3292]: I1013 06:53:49.412696 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459.1.0-a-3e5fd6a38a" podStartSLOduration=2.412685855 podStartE2EDuration="2.412685855s" podCreationTimestamp="2025-10-13 06:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:53:49.409057418 +0000 UTC m=+1.063751466" watchObservedRunningTime="2025-10-13 06:53:49.412685855 +0000 UTC m=+1.067379899" Oct 13 06:53:49.416365 kubelet[3292]: I1013 06:53:49.416309 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459.1.0-a-3e5fd6a38a" podStartSLOduration=1.416302287 podStartE2EDuration="1.416302287s" podCreationTimestamp="2025-10-13 06:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:53:49.412686084 +0000 UTC m=+1.067380131" watchObservedRunningTime="2025-10-13 06:53:49.416302287 +0000 UTC m=+1.070996334" Oct 13 06:53:49.420890 kubelet[3292]: I1013 06:53:49.420821 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459.1.0-a-3e5fd6a38a" podStartSLOduration=3.420810045 podStartE2EDuration="3.420810045s" podCreationTimestamp="2025-10-13 06:53:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:53:49.416426697 +0000 UTC m=+1.071120746" watchObservedRunningTime="2025-10-13 06:53:49.420810045 +0000 UTC m=+1.075504090" Oct 13 06:53:53.363995 kubelet[3292]: I1013 06:53:53.363935 3292 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Oct 13 06:53:53.364836 containerd[1925]: time="2025-10-13T06:53:53.364596813Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
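The runtime-config update below assigns this node PodCIDR 192.168.0.0/24. Whether the CNI plugin actually allocates from the node CIDR depends on its IPAM (Calico, installed next, ships its own), but the arithmetic behind a /24 is worth spelling out:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// PodCIDR value from the kubelet_network.go entry above.
	prefix := netip.MustParsePrefix("192.168.0.0/24")
	hostBits := 32 - prefix.Bits()
	total := 1 << hostBits
	fmt.Printf("node pod CIDR %s: %d addresses, %d usable hosts\n", prefix, total, total-2)
}
```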
Oct 13 06:53:53.365424 kubelet[3292]: I1013 06:53:53.365035 3292 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Oct 13 06:53:54.326235 kubelet[3292]: I1013 06:53:54.326168 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a72484f0-05bc-45e8-9b22-4706bb17ae53-xtables-lock\") pod \"kube-proxy-48w9h\" (UID: \"a72484f0-05bc-45e8-9b22-4706bb17ae53\") " pod="kube-system/kube-proxy-48w9h" Oct 13 06:53:54.326496 kubelet[3292]: I1013 06:53:54.326249 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a72484f0-05bc-45e8-9b22-4706bb17ae53-lib-modules\") pod \"kube-proxy-48w9h\" (UID: \"a72484f0-05bc-45e8-9b22-4706bb17ae53\") " pod="kube-system/kube-proxy-48w9h" Oct 13 06:53:54.326496 kubelet[3292]: I1013 06:53:54.326308 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/a72484f0-05bc-45e8-9b22-4706bb17ae53-kube-proxy\") pod \"kube-proxy-48w9h\" (UID: \"a72484f0-05bc-45e8-9b22-4706bb17ae53\") " pod="kube-system/kube-proxy-48w9h" Oct 13 06:53:54.326496 kubelet[3292]: I1013 06:53:54.326364 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzbvl\" (UniqueName: \"kubernetes.io/projected/a72484f0-05bc-45e8-9b22-4706bb17ae53-kube-api-access-dzbvl\") pod \"kube-proxy-48w9h\" (UID: \"a72484f0-05bc-45e8-9b22-4706bb17ae53\") " pod="kube-system/kube-proxy-48w9h" Oct 13 06:53:54.331408 systemd[1]: Created slice kubepods-besteffort-poda72484f0_05bc_45e8_9b22_4706bb17ae53.slice - libcontainer container kubepods-besteffort-poda72484f0_05bc_45e8_9b22_4706bb17ae53.slice. Oct 13 06:53:54.500699 systemd[1]: Created slice kubepods-besteffort-pod84e655dd_a9b6_4720_ac5e_e44e411ab6d7.slice - libcontainer container kubepods-besteffort-pod84e655dd_a9b6_4720_ac5e_e44e411ab6d7.slice. 
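With the systemd cgroup driver reported earlier, each pod gets a transient slice whose name embeds its QoS class and UID; the kube-proxy entries above show the resulting kubepods-besteffort-pod….slice unit. A one-line sketch reproducing that name from the UID in the log (dashes in the UID become underscores in the unit name):

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// Pod UID from the kube-proxy-48w9h volume entries above.
	uid := "a72484f0-05bc-45e8-9b22-4706bb17ae53"
	slice := "kubepods-besteffort-pod" + strings.ReplaceAll(uid, "-", "_") + ".slice"
	fmt.Println(slice)
	// Prints: kubepods-besteffort-poda72484f0_05bc_45e8_9b22_4706bb17ae53.slice
	// which matches the "Created slice" entry logged by systemd above.
}
```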
Oct 13 06:53:54.528235 kubelet[3292]: I1013 06:53:54.528207 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltfmd\" (UniqueName: \"kubernetes.io/projected/84e655dd-a9b6-4720-ac5e-e44e411ab6d7-kube-api-access-ltfmd\") pod \"tigera-operator-755d956888-xd2gn\" (UID: \"84e655dd-a9b6-4720-ac5e-e44e411ab6d7\") " pod="tigera-operator/tigera-operator-755d956888-xd2gn" Oct 13 06:53:54.528460 kubelet[3292]: I1013 06:53:54.528239 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/84e655dd-a9b6-4720-ac5e-e44e411ab6d7-var-lib-calico\") pod \"tigera-operator-755d956888-xd2gn\" (UID: \"84e655dd-a9b6-4720-ac5e-e44e411ab6d7\") " pod="tigera-operator/tigera-operator-755d956888-xd2gn" Oct 13 06:53:54.654987 containerd[1925]: time="2025-10-13T06:53:54.654800893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-48w9h,Uid:a72484f0-05bc-45e8-9b22-4706bb17ae53,Namespace:kube-system,Attempt:0,}" Oct 13 06:53:54.662321 containerd[1925]: time="2025-10-13T06:53:54.662281427Z" level=info msg="connecting to shim 2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6" address="unix:///run/containerd/s/c7be822da5e40ce19ea07d88b5c4c5ba78401315550e43a92d7c753d0b9a8807" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:53:54.677958 systemd[1]: Started cri-containerd-2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6.scope - libcontainer container 2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6. Oct 13 06:53:54.690127 containerd[1925]: time="2025-10-13T06:53:54.690109152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-48w9h,Uid:a72484f0-05bc-45e8-9b22-4706bb17ae53,Namespace:kube-system,Attempt:0,} returns sandbox id \"2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6\"" Oct 13 06:53:54.691242 containerd[1925]: time="2025-10-13T06:53:54.691227976Z" level=info msg="CreateContainer within sandbox \"2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Oct 13 06:53:54.695382 containerd[1925]: time="2025-10-13T06:53:54.695364112Z" level=info msg="Container f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:53:54.698294 containerd[1925]: time="2025-10-13T06:53:54.698281252Z" level=info msg="CreateContainer within sandbox \"2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7\"" Oct 13 06:53:54.698513 containerd[1925]: time="2025-10-13T06:53:54.698503661Z" level=info msg="StartContainer for \"f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7\"" Oct 13 06:53:54.699212 containerd[1925]: time="2025-10-13T06:53:54.699201230Z" level=info msg="connecting to shim f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7" address="unix:///run/containerd/s/c7be822da5e40ce19ea07d88b5c4c5ba78401315550e43a92d7c753d0b9a8807" protocol=ttrpc version=3 Oct 13 06:53:54.711940 systemd[1]: Started cri-containerd-f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7.scope - libcontainer container f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7. 
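The kube-api-access-* volumes above are projected service-account tokens. Inside the running containers they normally appear at /var/run/secrets/kubernetes.io/serviceaccount, an assumption about the default mount point rather than something these log lines state. A sketch meant to run inside such a pod:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Default projected-token mount point (assumed; not stated in these log lines).
	base := "/var/run/secrets/kubernetes.io/serviceaccount"
	for _, name := range []string{"token", "ca.crt", "namespace"} {
		data, err := os.ReadFile(filepath.Join(base, name))
		if err != nil {
			fmt.Printf("%s: not present (%v)\n", name, err)
			continue
		}
		fmt.Printf("%s: %d bytes\n", name, len(data))
	}
}
```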
Oct 13 06:53:54.732105 containerd[1925]: time="2025-10-13T06:53:54.732077139Z" level=info msg="StartContainer for \"f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7\" returns successfully" Oct 13 06:53:54.803157 containerd[1925]: time="2025-10-13T06:53:54.803095174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-xd2gn,Uid:84e655dd-a9b6-4720-ac5e-e44e411ab6d7,Namespace:tigera-operator,Attempt:0,}" Oct 13 06:53:54.809620 containerd[1925]: time="2025-10-13T06:53:54.809596027Z" level=info msg="connecting to shim d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f" address="unix:///run/containerd/s/8cfceb32ea71525d4840c66762eea496ed2513dc0712f00b4fb3c9c0e0710a4e" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:53:54.826170 systemd[1]: Started cri-containerd-d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f.scope - libcontainer container d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f. Oct 13 06:53:54.908892 containerd[1925]: time="2025-10-13T06:53:54.908801864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-xd2gn,Uid:84e655dd-a9b6-4720-ac5e-e44e411ab6d7,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f\"" Oct 13 06:53:54.909561 containerd[1925]: time="2025-10-13T06:53:54.909548461Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Oct 13 06:53:56.538083 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount589275623.mount: Deactivated successfully. Oct 13 06:53:56.773060 containerd[1925]: time="2025-10-13T06:53:56.773007313Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:56.773256 containerd[1925]: time="2025-10-13T06:53:56.773167391Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Oct 13 06:53:56.773637 containerd[1925]: time="2025-10-13T06:53:56.773593458Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:56.774446 containerd[1925]: time="2025-10-13T06:53:56.774405219Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:53:56.775130 containerd[1925]: time="2025-10-13T06:53:56.775088962Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.865523519s" Oct 13 06:53:56.775130 containerd[1925]: time="2025-10-13T06:53:56.775103183Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Oct 13 06:53:56.775997 containerd[1925]: time="2025-10-13T06:53:56.775986011Z" level=info msg="CreateContainer within sandbox \"d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Oct 13 06:53:56.778641 containerd[1925]: time="2025-10-13T06:53:56.778629742Z" 
level=info msg="Container 45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:53:56.780733 containerd[1925]: time="2025-10-13T06:53:56.780718108Z" level=info msg="CreateContainer within sandbox \"d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d\"" Oct 13 06:53:56.780918 containerd[1925]: time="2025-10-13T06:53:56.780905781Z" level=info msg="StartContainer for \"45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d\"" Oct 13 06:53:56.781311 containerd[1925]: time="2025-10-13T06:53:56.781300289Z" level=info msg="connecting to shim 45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d" address="unix:///run/containerd/s/8cfceb32ea71525d4840c66762eea496ed2513dc0712f00b4fb3c9c0e0710a4e" protocol=ttrpc version=3 Oct 13 06:53:56.802968 systemd[1]: Started cri-containerd-45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d.scope - libcontainer container 45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d. Oct 13 06:53:56.815376 containerd[1925]: time="2025-10-13T06:53:56.815351941Z" level=info msg="StartContainer for \"45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d\" returns successfully" Oct 13 06:53:57.440634 kubelet[3292]: I1013 06:53:57.440595 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-48w9h" podStartSLOduration=3.44058271 podStartE2EDuration="3.44058271s" podCreationTimestamp="2025-10-13 06:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:53:55.456784077 +0000 UTC m=+7.111478205" watchObservedRunningTime="2025-10-13 06:53:57.44058271 +0000 UTC m=+9.095276754" Oct 13 06:53:57.440871 kubelet[3292]: I1013 06:53:57.440681 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-xd2gn" podStartSLOduration=1.5745751559999999 podStartE2EDuration="3.440676s" podCreationTimestamp="2025-10-13 06:53:54 +0000 UTC" firstStartedPulling="2025-10-13 06:53:54.909375684 +0000 UTC m=+6.564069730" lastFinishedPulling="2025-10-13 06:53:56.775476527 +0000 UTC m=+8.430170574" observedRunningTime="2025-10-13 06:53:57.440509883 +0000 UTC m=+9.095203931" watchObservedRunningTime="2025-10-13 06:53:57.440676 +0000 UTC m=+9.095370045" Oct 13 06:54:01.275388 sudo[2239]: pam_unix(sudo:session): session closed for user root Oct 13 06:54:01.276391 sshd[2238]: Connection closed by 147.75.109.163 port 46326 Oct 13 06:54:01.276749 sshd-session[2235]: pam_unix(sshd:session): session closed for user core Oct 13 06:54:01.279270 systemd[1]: sshd@8-139.178.94.25:22-147.75.109.163:46326.service: Deactivated successfully. Oct 13 06:54:01.280344 systemd[1]: session-11.scope: Deactivated successfully. Oct 13 06:54:01.280449 systemd[1]: session-11.scope: Consumed 3.290s CPU time, 234.9M memory peak. Oct 13 06:54:01.281267 systemd-logind[1915]: Session 11 logged out. Waiting for processes to exit. Oct 13 06:54:01.282084 systemd-logind[1915]: Removed session 11. Oct 13 06:54:01.952750 update_engine[1920]: I20251013 06:54:01.952715 1920 update_attempter.cc:509] Updating boot flags... 
Oct 13 06:54:03.495882 systemd[1]: Created slice kubepods-besteffort-podc016f682_f828_4640_8a49_9f5e1f3bf314.slice - libcontainer container kubepods-besteffort-podc016f682_f828_4640_8a49_9f5e1f3bf314.slice. Oct 13 06:54:03.592421 kubelet[3292]: I1013 06:54:03.592344 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/c016f682-f828-4640-8a49-9f5e1f3bf314-typha-certs\") pod \"calico-typha-65ff5bb69f-j9s2w\" (UID: \"c016f682-f828-4640-8a49-9f5e1f3bf314\") " pod="calico-system/calico-typha-65ff5bb69f-j9s2w" Oct 13 06:54:03.593391 kubelet[3292]: I1013 06:54:03.592450 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696lk\" (UniqueName: \"kubernetes.io/projected/c016f682-f828-4640-8a49-9f5e1f3bf314-kube-api-access-696lk\") pod \"calico-typha-65ff5bb69f-j9s2w\" (UID: \"c016f682-f828-4640-8a49-9f5e1f3bf314\") " pod="calico-system/calico-typha-65ff5bb69f-j9s2w" Oct 13 06:54:03.593391 kubelet[3292]: I1013 06:54:03.592528 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c016f682-f828-4640-8a49-9f5e1f3bf314-tigera-ca-bundle\") pod \"calico-typha-65ff5bb69f-j9s2w\" (UID: \"c016f682-f828-4640-8a49-9f5e1f3bf314\") " pod="calico-system/calico-typha-65ff5bb69f-j9s2w" Oct 13 06:54:03.801125 containerd[1925]: time="2025-10-13T06:54:03.800920372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65ff5bb69f-j9s2w,Uid:c016f682-f828-4640-8a49-9f5e1f3bf314,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:03.811201 containerd[1925]: time="2025-10-13T06:54:03.811157361Z" level=info msg="connecting to shim 7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c" address="unix:///run/containerd/s/defadedaf28b28446da51affb65621decab8f786ae8288cd822e8420653051a8" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:03.817071 systemd[1]: Created slice kubepods-besteffort-podcdaaf6ee_db3d_4038_84b6_a61b109e5017.slice - libcontainer container kubepods-besteffort-podcdaaf6ee_db3d_4038_84b6_a61b109e5017.slice. Oct 13 06:54:03.832868 systemd[1]: Started cri-containerd-7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c.scope - libcontainer container 7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c. 
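Unlike the hostPath volumes used by the static control-plane pods, calico-typha's typha-certs below is a Secret volume, which the kubelet materializes under its root directory (/var/lib/kubelet per the NodeConfig above) using the kubernetes.io~secret layout. The exact path in this sketch is an assumption based on that convention plus the pod UID from the log:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Assumed layout: <kubelet root>/pods/<pod UID>/volumes/kubernetes.io~secret/<volume>.
	// Kubelet root (/var/lib/kubelet) and the pod UID come from the log above.
	dir := filepath.Join("/var/lib/kubelet/pods",
		"c016f682-f828-4640-8a49-9f5e1f3bf314",
		"volumes", "kubernetes.io~secret", "typha-certs")
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("volume not mounted here (or run on a different node):", err)
		return
	}
	for _, e := range entries {
		fmt.Println(filepath.Join(dir, e.Name()))
	}
}
```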
Oct 13 06:54:03.857717 containerd[1925]: time="2025-10-13T06:54:03.857695895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-65ff5bb69f-j9s2w,Uid:c016f682-f828-4640-8a49-9f5e1f3bf314,Namespace:calico-system,Attempt:0,} returns sandbox id \"7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c\"" Oct 13 06:54:03.858261 containerd[1925]: time="2025-10-13T06:54:03.858252683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Oct 13 06:54:03.894340 kubelet[3292]: I1013 06:54:03.894295 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdaaf6ee-db3d-4038-84b6-a61b109e5017-tigera-ca-bundle\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894340 kubelet[3292]: I1013 06:54:03.894318 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-cni-log-dir\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894340 kubelet[3292]: I1013 06:54:03.894330 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-cni-net-dir\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894340 kubelet[3292]: I1013 06:54:03.894339 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-var-run-calico\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894472 kubelet[3292]: I1013 06:54:03.894350 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-lib-modules\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894472 kubelet[3292]: I1013 06:54:03.894382 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-var-lib-calico\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894472 kubelet[3292]: I1013 06:54:03.894416 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-policysync\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894472 kubelet[3292]: I1013 06:54:03.894448 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-xtables-lock\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 
13 06:54:03.894472 kubelet[3292]: I1013 06:54:03.894471 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/cdaaf6ee-db3d-4038-84b6-a61b109e5017-node-certs\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894562 kubelet[3292]: I1013 06:54:03.894482 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-cni-bin-dir\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894562 kubelet[3292]: I1013 06:54:03.894493 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/cdaaf6ee-db3d-4038-84b6-a61b109e5017-flexvol-driver-host\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.894562 kubelet[3292]: I1013 06:54:03.894503 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc8dh\" (UniqueName: \"kubernetes.io/projected/cdaaf6ee-db3d-4038-84b6-a61b109e5017-kube-api-access-sc8dh\") pod \"calico-node-jw4g9\" (UID: \"cdaaf6ee-db3d-4038-84b6-a61b109e5017\") " pod="calico-system/calico-node-jw4g9" Oct 13 06:54:03.997769 kubelet[3292]: E1013 06:54:03.997702 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:03.997769 kubelet[3292]: W1013 06:54:03.997748 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:03.998123 kubelet[3292]: E1013 06:54:03.997833 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:04.003049 kubelet[3292]: E1013 06:54:04.002992 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:04.003049 kubelet[3292]: W1013 06:54:04.003035 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:04.003479 kubelet[3292]: E1013 06:54:04.003073 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:04.015427 kubelet[3292]: E1013 06:54:04.015355 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:04.015427 kubelet[3292]: W1013 06:54:04.015401 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:04.015717 kubelet[3292]: E1013 06:54:04.015442 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Oct 13 06:54:04.110244 kubelet[3292]: E1013 06:54:04.110141 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gbdt9" podUID="168acdb3-8fc1-4f19-b797-a3b8ada8a829"
Oct 13 06:54:04.121463 containerd[1925]: time="2025-10-13T06:54:04.121405791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jw4g9,Uid:cdaaf6ee-db3d-4038-84b6-a61b109e5017,Namespace:calico-system,Attempt:0,}"
Oct 13 06:54:04.129553 containerd[1925]: time="2025-10-13T06:54:04.129527486Z" level=info msg="connecting to shim 127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74" address="unix:///run/containerd/s/23298c5bff17e46006b89fa64715a827f9477d580fa30fd696cefb0aa6f3d155" namespace=k8s.io protocol=ttrpc version=3
Oct 13 06:54:04.155923 systemd[1]: Started cri-containerd-127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74.scope - libcontainer container 127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74.
Oct 13 06:54:04.166902 containerd[1925]: time="2025-10-13T06:54:04.166880858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-jw4g9,Uid:cdaaf6ee-db3d-4038-84b6-a61b109e5017,Namespace:calico-system,Attempt:0,} returns sandbox id \"127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74\""
Oct 13 06:54:04.181891 kubelet[3292]: E1013 06:54:04.181796 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 13 06:54:04.181891 kubelet[3292]: W1013 06:54:04.181851 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 13 06:54:04.181891 kubelet[3292]: E1013 06:54:04.181901 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
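The probe errors stop only once something answers the init call with a JSON status. In a Calico install that something is the flexvol driver installed onto the host by the pod2daemon-flexvol image, whose pull is logged at 06:54:06.145 further below. Purely as an illustration of the call convention, a minimal driver that would satisfy the probe looks roughly like the following; the real Calico driver is a different, fuller program, so treat this as a hypothetical stand-in.

// Hypothetical stand-in for the binary the kubelet expects at
// .../volume/exec/nodeagent~uds/uds: print one JSON document on stdout.
package main

import (
	"encoding/json"
	"os"
)

type capabilities struct {
	Attach bool `json:"attach"`
}

type driverStatus struct {
	Status       string        `json:"status"`
	Message      string        `json:"message,omitempty"`
	Capabilities *capabilities `json:"capabilities,omitempty"`
}

func main() {
	cmd := ""
	if len(os.Args) > 1 {
		cmd = os.Args[1]
	}
	switch cmd {
	case "init":
		// Advertise no attach/detach support; the uds driver only serves a
		// mount-style socket volume.
		json.NewEncoder(os.Stdout).Encode(driverStatus{
			Status:       "Success",
			Capabilities: &capabilities{Attach: false},
		})
	default:
		// Anything not implemented reports "Not supported" per the
		// FlexVolume convention.
		json.NewEncoder(os.Stdout).Encode(driverStatus{
			Status:  "Not supported",
			Message: "call not implemented: " + cmd,
		})
		os.Exit(1)
	}
}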
Oct 13 06:54:04.196763 kubelet[3292]: I1013 06:54:04.196390 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/168acdb3-8fc1-4f19-b797-a3b8ada8a829-socket-dir\") pod \"csi-node-driver-gbdt9\" (UID: \"168acdb3-8fc1-4f19-b797-a3b8ada8a829\") " pod="calico-system/csi-node-driver-gbdt9"
Oct 13 06:54:04.197126 kubelet[3292]: I1013 06:54:04.196959 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/168acdb3-8fc1-4f19-b797-a3b8ada8a829-kubelet-dir\") pod \"csi-node-driver-gbdt9\" (UID: \"168acdb3-8fc1-4f19-b797-a3b8ada8a829\") " pod="calico-system/csi-node-driver-gbdt9"
Oct 13 06:54:04.199183 kubelet[3292]: I1013 06:54:04.198876 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/168acdb3-8fc1-4f19-b797-a3b8ada8a829-registration-dir\") pod \"csi-node-driver-gbdt9\" (UID: \"168acdb3-8fc1-4f19-b797-a3b8ada8a829\") " pod="calico-system/csi-node-driver-gbdt9"
Oct 13 06:54:04.200809 kubelet[3292]: I1013 06:54:04.200700 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8xhr\" (UniqueName: \"kubernetes.io/projected/168acdb3-8fc1-4f19-b797-a3b8ada8a829-kube-api-access-f8xhr\") pod \"csi-node-driver-gbdt9\" (UID: \"168acdb3-8fc1-4f19-b797-a3b8ada8a829\") " pod="calico-system/csi-node-driver-gbdt9"
Oct 13 06:54:04.202517 kubelet[3292]: I1013 06:54:04.202340 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/168acdb3-8fc1-4f19-b797-a3b8ada8a829-varrun\") pod \"csi-node-driver-gbdt9\" (UID: \"168acdb3-8fc1-4f19-b797-a3b8ada8a829\") " pod="calico-system/csi-node-driver-gbdt9"
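Interleaved with the volume-attach bookkeeping above, the same three probe messages repeat dozens of times in this window. When triaging a dump like this one, it can help to collapse kubelet lines by message and count them; a small sketch follows. The line layout it parses (timestamp, unit[pid]:, klog header, one journal entry per line) is assumed from the excerpts here.

// Triage helper: group kubelet journal lines by the message that follows
// the "file.go:NNN]" marker, so repeated errors collapse to one counted row.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

var kubeletLine = regexp.MustCompile(`kubelet\[\d+\]: [IWE]\d{4} \S+\s+\d+ \S+\.go:\d+\] (.*)`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		if m := kubeletLine.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	type kv struct {
		msg string
		n   int
	}
	var rows []kv
	for msg, n := range counts {
		rows = append(rows, kv{msg, n})
	}
	sort.Slice(rows, func(i, j int) bool { return rows[i].n > rows[j].n })
	for _, r := range rows {
		fmt.Printf("%6d  %s\n", r.n, r.msg)
	}
}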
Oct 13 06:54:04.324872 kubelet[3292]: E1013 06:54:04.324858 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 13 06:54:04.324872 kubelet[3292]: W1013 06:54:04.324868 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 13 06:54:04.324945 kubelet[3292]: E1013 06:54:04.324879 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 13 06:54:05.490080 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1917592183.mount: Deactivated successfully.
Oct 13 06:54:06.142931 containerd[1925]: time="2025-10-13T06:54:06.142883045Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 13 06:54:06.143146 containerd[1925]: time="2025-10-13T06:54:06.143073849Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Oct 13 06:54:06.143485 containerd[1925]: time="2025-10-13T06:54:06.143446629Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 13 06:54:06.144220 containerd[1925]: time="2025-10-13T06:54:06.144180363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Oct 13 06:54:06.144801 containerd[1925]: time="2025-10-13T06:54:06.144761570Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.286495899s"
Oct 13 06:54:06.144801 containerd[1925]: time="2025-10-13T06:54:06.144775767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Oct 13 06:54:06.145261 containerd[1925]: time="2025-10-13T06:54:06.145218570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Oct 13 06:54:06.148119 containerd[1925]: time="2025-10-13T06:54:06.148100456Z" level=info msg="CreateContainer within sandbox \"7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Oct 13 06:54:06.151124 containerd[1925]: time="2025-10-13T06:54:06.151079636Z" level=info msg="Container 72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6: CDI devices from CRI Config.CDIDevices: []"
Oct 13 06:54:06.153909 containerd[1925]: time="2025-10-13T06:54:06.153896751Z" level=info msg="CreateContainer within sandbox \"7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6\""
Oct 13 06:54:06.154127 containerd[1925]: time="2025-10-13T06:54:06.154087238Z" level=info msg="StartContainer for \"72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6\""
Oct 13 06:54:06.154607 containerd[1925]: time="2025-10-13T06:54:06.154595817Z" level=info msg="connecting to shim 72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6" address="unix:///run/containerd/s/defadedaf28b28446da51affb65621decab8f786ae8288cd822e8420653051a8" protocol=ttrpc version=3
Oct 13 06:54:06.171824 systemd[1]: Started cri-containerd-72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6.scope - libcontainer container 72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6.
Oct 13 06:54:06.198648 containerd[1925]: time="2025-10-13T06:54:06.198626715Z" level=info msg="StartContainer for \"72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6\" returns successfully"
Oct 13 06:54:06.393760 kubelet[3292]: E1013 06:54:06.393642 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gbdt9" podUID="168acdb3-8fc1-4f19-b797-a3b8ada8a829"
Oct 13 06:54:06.456273 kubelet[3292]: I1013 06:54:06.456224 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-65ff5bb69f-j9s2w" podStartSLOduration=1.169202131 podStartE2EDuration="3.456207653s" podCreationTimestamp="2025-10-13 06:54:03 +0000 UTC" firstStartedPulling="2025-10-13 06:54:03.858152676 +0000 UTC m=+15.512846723" lastFinishedPulling="2025-10-13 06:54:06.145158196 +0000 UTC m=+17.799852245" observedRunningTime="2025-10-13 06:54:06.455886758 +0000 UTC m=+18.110580816" watchObservedRunningTime="2025-10-13 06:54:06.456207653 +0000 UTC m=+18.110901705"
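The startup-latency line above is internally consistent with the containerd entries: podStartE2EDuration is observedRunningTime minus podCreationTimestamp (about 3.456 s), the pull window from firstStartedPulling to lastFinishedPulling is about 2.287 s (the typha pull itself was logged as 2.286495899s), and podStartSLOduration is roughly the difference, about 1.169 s. A short sketch re-deriving those numbers from the logged timestamps; the kubelet's tracker may round a few nanoseconds differently.

// Re-derive the pod startup numbers from the tracker line above.
package main

import (
	"fmt"
	"time"
)

func mustParse(v string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", v)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-10-13 06:54:03 +0000 UTC")
	firstPull := mustParse("2025-10-13 06:54:03.858152676 +0000 UTC")
	lastPull := mustParse("2025-10-13 06:54:06.145158196 +0000 UTC")
	running := mustParse("2025-10-13 06:54:06.456207653 +0000 UTC")

	e2e := running.Sub(created)     // logged as podStartE2EDuration="3.456207653s"
	pull := lastPull.Sub(firstPull) // pull window; typha pull alone logged as 2.286495899s
	slo := e2e - pull               // approximately the logged podStartSLOduration=1.169202131

	fmt.Println("e2e:", e2e, "pull window:", pull, "slo:", slo)
}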
Oct 13 06:54:06.507721 kubelet[3292]: E1013 06:54:06.507601 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 13 06:54:06.507721 kubelet[3292]: W1013 06:54:06.507630 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 13 06:54:06.507721 kubelet[3292]: E1013 06:54:06.507699 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Oct 13 06:54:06.535852 kubelet[3292]: E1013 06:54:06.535771 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Oct 13 06:54:06.535852 kubelet[3292]: W1013 06:54:06.535801 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Oct 13 06:54:06.535852 kubelet[3292]: E1013 06:54:06.535838 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:06.536452 kubelet[3292]: E1013 06:54:06.536367 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:06.536452 kubelet[3292]: W1013 06:54:06.536407 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:06.536452 kubelet[3292]: E1013 06:54:06.536444 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:06.537069 kubelet[3292]: E1013 06:54:06.537018 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:06.537069 kubelet[3292]: W1013 06:54:06.537070 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:06.537374 kubelet[3292]: E1013 06:54:06.537119 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.450309 kubelet[3292]: I1013 06:54:07.450230 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:54:07.523257 kubelet[3292]: E1013 06:54:07.523153 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.523257 kubelet[3292]: W1013 06:54:07.523199 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.523257 kubelet[3292]: E1013 06:54:07.523242 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.523867 kubelet[3292]: E1013 06:54:07.523752 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.523867 kubelet[3292]: W1013 06:54:07.523791 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.523867 kubelet[3292]: E1013 06:54:07.523826 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.524337 kubelet[3292]: E1013 06:54:07.524259 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.524337 kubelet[3292]: W1013 06:54:07.524291 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.524337 kubelet[3292]: E1013 06:54:07.524329 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:07.524861 kubelet[3292]: E1013 06:54:07.524805 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.524861 kubelet[3292]: W1013 06:54:07.524835 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.524861 kubelet[3292]: E1013 06:54:07.524860 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.525390 kubelet[3292]: E1013 06:54:07.525332 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.525390 kubelet[3292]: W1013 06:54:07.525366 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.525618 kubelet[3292]: E1013 06:54:07.525394 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.525828 kubelet[3292]: E1013 06:54:07.525800 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.525828 kubelet[3292]: W1013 06:54:07.525826 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.526010 kubelet[3292]: E1013 06:54:07.525857 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.526303 kubelet[3292]: E1013 06:54:07.526274 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.526303 kubelet[3292]: W1013 06:54:07.526300 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.526498 kubelet[3292]: E1013 06:54:07.526324 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.526784 kubelet[3292]: E1013 06:54:07.526728 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.526784 kubelet[3292]: W1013 06:54:07.526752 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.527018 kubelet[3292]: E1013 06:54:07.526787 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:07.527247 kubelet[3292]: E1013 06:54:07.527191 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.527247 kubelet[3292]: W1013 06:54:07.527223 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.527467 kubelet[3292]: E1013 06:54:07.527257 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.527666 kubelet[3292]: E1013 06:54:07.527627 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.527814 kubelet[3292]: W1013 06:54:07.527654 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.527814 kubelet[3292]: E1013 06:54:07.527709 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.528124 kubelet[3292]: E1013 06:54:07.528092 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.528124 kubelet[3292]: W1013 06:54:07.528118 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.528392 kubelet[3292]: E1013 06:54:07.528153 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.528550 kubelet[3292]: E1013 06:54:07.528522 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.528550 kubelet[3292]: W1013 06:54:07.528547 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.528786 kubelet[3292]: E1013 06:54:07.528571 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.529043 kubelet[3292]: E1013 06:54:07.529014 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.529043 kubelet[3292]: W1013 06:54:07.529040 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.529295 kubelet[3292]: E1013 06:54:07.529074 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:07.529551 kubelet[3292]: E1013 06:54:07.529495 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.529551 kubelet[3292]: W1013 06:54:07.529524 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.529801 kubelet[3292]: E1013 06:54:07.529559 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.530008 kubelet[3292]: E1013 06:54:07.529973 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.530008 kubelet[3292]: W1013 06:54:07.530000 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.530216 kubelet[3292]: E1013 06:54:07.530026 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.538992 kubelet[3292]: E1013 06:54:07.538908 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.538992 kubelet[3292]: W1013 06:54:07.538951 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.539305 kubelet[3292]: E1013 06:54:07.538998 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.539569 kubelet[3292]: E1013 06:54:07.539528 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.539569 kubelet[3292]: W1013 06:54:07.539560 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.539887 kubelet[3292]: E1013 06:54:07.539610 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.540288 kubelet[3292]: E1013 06:54:07.540195 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.540288 kubelet[3292]: W1013 06:54:07.540235 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.540288 kubelet[3292]: E1013 06:54:07.540281 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:07.540758 kubelet[3292]: E1013 06:54:07.540721 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.540758 kubelet[3292]: W1013 06:54:07.540748 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.540989 kubelet[3292]: E1013 06:54:07.540789 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.541277 kubelet[3292]: E1013 06:54:07.541235 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.541277 kubelet[3292]: W1013 06:54:07.541273 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.541557 kubelet[3292]: E1013 06:54:07.541348 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.541760 kubelet[3292]: E1013 06:54:07.541724 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.541760 kubelet[3292]: W1013 06:54:07.541752 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.542001 kubelet[3292]: E1013 06:54:07.541845 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.542280 kubelet[3292]: E1013 06:54:07.542239 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.542280 kubelet[3292]: W1013 06:54:07.542276 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.542556 kubelet[3292]: E1013 06:54:07.542348 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.542820 kubelet[3292]: E1013 06:54:07.542790 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.542820 kubelet[3292]: W1013 06:54:07.542818 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.543014 kubelet[3292]: E1013 06:54:07.542855 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:07.543489 kubelet[3292]: E1013 06:54:07.543451 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.543489 kubelet[3292]: W1013 06:54:07.543488 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.543737 kubelet[3292]: E1013 06:54:07.543532 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.544056 kubelet[3292]: E1013 06:54:07.544026 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.544169 kubelet[3292]: W1013 06:54:07.544057 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.544169 kubelet[3292]: E1013 06:54:07.544119 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.544494 kubelet[3292]: E1013 06:54:07.544466 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.544600 kubelet[3292]: W1013 06:54:07.544494 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.544600 kubelet[3292]: E1013 06:54:07.544544 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.544938 kubelet[3292]: E1013 06:54:07.544909 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.544938 kubelet[3292]: W1013 06:54:07.544936 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.545133 kubelet[3292]: E1013 06:54:07.545051 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.545336 kubelet[3292]: E1013 06:54:07.545309 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.545439 kubelet[3292]: W1013 06:54:07.545334 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.545439 kubelet[3292]: E1013 06:54:07.545367 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:07.545859 kubelet[3292]: E1013 06:54:07.545822 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.545859 kubelet[3292]: W1013 06:54:07.545848 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.546085 kubelet[3292]: E1013 06:54:07.545879 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.546486 kubelet[3292]: E1013 06:54:07.546450 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.546595 kubelet[3292]: W1013 06:54:07.546486 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.546595 kubelet[3292]: E1013 06:54:07.546521 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.546962 kubelet[3292]: E1013 06:54:07.546877 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.546962 kubelet[3292]: W1013 06:54:07.546913 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.546962 kubelet[3292]: E1013 06:54:07.546951 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.547439 kubelet[3292]: E1013 06:54:07.547392 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.547439 kubelet[3292]: W1013 06:54:07.547415 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.547439 kubelet[3292]: E1013 06:54:07.547441 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Oct 13 06:54:07.548358 kubelet[3292]: E1013 06:54:07.548323 3292 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Oct 13 06:54:07.548358 kubelet[3292]: W1013 06:54:07.548352 3292 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Oct 13 06:54:07.548548 kubelet[3292]: E1013 06:54:07.548378 3292 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Oct 13 06:54:07.965221 containerd[1925]: time="2025-10-13T06:54:07.965171833Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:07.965434 containerd[1925]: time="2025-10-13T06:54:07.965423698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Oct 13 06:54:07.965757 containerd[1925]: time="2025-10-13T06:54:07.965743416Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:07.966532 containerd[1925]: time="2025-10-13T06:54:07.966520033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:07.966918 containerd[1925]: time="2025-10-13T06:54:07.966906088Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.821670071s" Oct 13 06:54:07.966943 containerd[1925]: time="2025-10-13T06:54:07.966921638Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Oct 13 06:54:07.967872 containerd[1925]: time="2025-10-13T06:54:07.967860610Z" level=info msg="CreateContainer within sandbox \"127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Oct 13 06:54:07.971145 containerd[1925]: time="2025-10-13T06:54:07.971127378Z" level=info msg="Container 3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:07.974115 containerd[1925]: time="2025-10-13T06:54:07.974102439Z" level=info msg="CreateContainer within sandbox \"127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788\"" Oct 13 06:54:07.974365 containerd[1925]: time="2025-10-13T06:54:07.974314980Z" level=info msg="StartContainer for \"3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788\"" Oct 13 06:54:07.975110 containerd[1925]: time="2025-10-13T06:54:07.975099833Z" level=info msg="connecting to shim 3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788" address="unix:///run/containerd/s/23298c5bff17e46006b89fa64715a827f9477d580fa30fd696cefb0aa6f3d155" protocol=ttrpc version=3 Oct 13 06:54:07.991950 systemd[1]: Started cri-containerd-3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788.scope - libcontainer container 3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788. 
Oct 13 06:54:08.011778 containerd[1925]: time="2025-10-13T06:54:08.011750778Z" level=info msg="StartContainer for \"3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788\" returns successfully" Oct 13 06:54:08.016202 systemd[1]: cri-containerd-3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788.scope: Deactivated successfully. Oct 13 06:54:08.017567 containerd[1925]: time="2025-10-13T06:54:08.017545084Z" level=info msg="received exit event container_id:\"3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788\" id:\"3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788\" pid:4146 exited_at:{seconds:1760338448 nanos:17331247}" Oct 13 06:54:08.017627 containerd[1925]: time="2025-10-13T06:54:08.017572708Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788\" id:\"3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788\" pid:4146 exited_at:{seconds:1760338448 nanos:17331247}" Oct 13 06:54:08.030640 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788-rootfs.mount: Deactivated successfully. Oct 13 06:54:08.395132 kubelet[3292]: E1013 06:54:08.395008 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gbdt9" podUID="168acdb3-8fc1-4f19-b797-a3b8ada8a829" Oct 13 06:54:09.465274 containerd[1925]: time="2025-10-13T06:54:09.465190796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Oct 13 06:54:10.394601 kubelet[3292]: E1013 06:54:10.394466 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gbdt9" podUID="168acdb3-8fc1-4f19-b797-a3b8ada8a829" Oct 13 06:54:12.393556 kubelet[3292]: E1013 06:54:12.393476 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gbdt9" podUID="168acdb3-8fc1-4f19-b797-a3b8ada8a829" Oct 13 06:54:13.110651 containerd[1925]: time="2025-10-13T06:54:13.110626482Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:13.110859 containerd[1925]: time="2025-10-13T06:54:13.110795930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Oct 13 06:54:13.111182 containerd[1925]: time="2025-10-13T06:54:13.111169933Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:13.111962 containerd[1925]: time="2025-10-13T06:54:13.111948542Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:13.112345 containerd[1925]: time="2025-10-13T06:54:13.112331888Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.647061309s" Oct 13 06:54:13.112368 containerd[1925]: time="2025-10-13T06:54:13.112348110Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Oct 13 06:54:13.113219 containerd[1925]: time="2025-10-13T06:54:13.113206879Z" level=info msg="CreateContainer within sandbox \"127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Oct 13 06:54:13.116409 containerd[1925]: time="2025-10-13T06:54:13.116396244Z" level=info msg="Container 9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:13.120138 containerd[1925]: time="2025-10-13T06:54:13.120124166Z" level=info msg="CreateContainer within sandbox \"127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4\"" Oct 13 06:54:13.120426 containerd[1925]: time="2025-10-13T06:54:13.120414827Z" level=info msg="StartContainer for \"9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4\"" Oct 13 06:54:13.121154 containerd[1925]: time="2025-10-13T06:54:13.121142791Z" level=info msg="connecting to shim 9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4" address="unix:///run/containerd/s/23298c5bff17e46006b89fa64715a827f9477d580fa30fd696cefb0aa6f3d155" protocol=ttrpc version=3 Oct 13 06:54:13.137785 systemd[1]: Started cri-containerd-9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4.scope - libcontainer container 9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4. Oct 13 06:54:13.158008 containerd[1925]: time="2025-10-13T06:54:13.157986829Z" level=info msg="StartContainer for \"9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4\" returns successfully" Oct 13 06:54:13.701258 systemd[1]: cri-containerd-9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4.scope: Deactivated successfully. Oct 13 06:54:13.701409 systemd[1]: cri-containerd-9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4.scope: Consumed 336ms CPU time, 191.7M memory peak, 171.3M written to disk. Oct 13 06:54:13.701695 containerd[1925]: time="2025-10-13T06:54:13.701678807Z" level=info msg="received exit event container_id:\"9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4\" id:\"9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4\" pid:4208 exited_at:{seconds:1760338453 nanos:701575538}" Oct 13 06:54:13.701743 containerd[1925]: time="2025-10-13T06:54:13.701732029Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4\" id:\"9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4\" pid:4208 exited_at:{seconds:1760338453 nanos:701575538}" Oct 13 06:54:13.711164 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4-rootfs.mount: Deactivated successfully. 
Oct 13 06:54:13.752178 kubelet[3292]: I1013 06:54:13.752156 3292 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Oct 13 06:54:13.815260 systemd[1]: Created slice kubepods-burstable-pod12f6bdc5_c572_4ad3_9971_792e82e94f2d.slice - libcontainer container kubepods-burstable-pod12f6bdc5_c572_4ad3_9971_792e82e94f2d.slice. Oct 13 06:54:13.823494 systemd[1]: Created slice kubepods-burstable-pod389641a2_f104_4e70_937e_4d07a8816bed.slice - libcontainer container kubepods-burstable-pod389641a2_f104_4e70_937e_4d07a8816bed.slice. Oct 13 06:54:13.829320 systemd[1]: Created slice kubepods-besteffort-pod55c626e4_fa08_4709_8ae8_a6e16ad60876.slice - libcontainer container kubepods-besteffort-pod55c626e4_fa08_4709_8ae8_a6e16ad60876.slice. Oct 13 06:54:13.834239 systemd[1]: Created slice kubepods-besteffort-pod270bd17b_94f3_4836_a4f6_8e717be90a35.slice - libcontainer container kubepods-besteffort-pod270bd17b_94f3_4836_a4f6_8e717be90a35.slice. Oct 13 06:54:13.838379 systemd[1]: Created slice kubepods-besteffort-pod4bb65662_ba1f_4201_a746_530b80972c43.slice - libcontainer container kubepods-besteffort-pod4bb65662_ba1f_4201_a746_530b80972c43.slice. Oct 13 06:54:13.841860 systemd[1]: Created slice kubepods-besteffort-pod966b92e5_6493_4cbc_a333_cedc1d052bfa.slice - libcontainer container kubepods-besteffort-pod966b92e5_6493_4cbc_a333_cedc1d052bfa.slice. Oct 13 06:54:13.844914 systemd[1]: Created slice kubepods-besteffort-pod095e7c99_377a_4f2d_96a4_21b09a1d4a94.slice - libcontainer container kubepods-besteffort-pod095e7c99_377a_4f2d_96a4_21b09a1d4a94.slice. Oct 13 06:54:13.884922 kubelet[3292]: I1013 06:54:13.884816 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-ca-bundle\") pod \"whisker-69694b5d67-pq9tr\" (UID: \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\") " pod="calico-system/whisker-69694b5d67-pq9tr" Oct 13 06:54:13.884922 kubelet[3292]: I1013 06:54:13.884904 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-backend-key-pair\") pod \"whisker-69694b5d67-pq9tr\" (UID: \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\") " pod="calico-system/whisker-69694b5d67-pq9tr" Oct 13 06:54:13.885316 kubelet[3292]: I1013 06:54:13.884977 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdtdq\" (UniqueName: \"kubernetes.io/projected/55c626e4-fa08-4709-8ae8-a6e16ad60876-kube-api-access-gdtdq\") pod \"calico-kube-controllers-866f8d6d77-5wphw\" (UID: \"55c626e4-fa08-4709-8ae8-a6e16ad60876\") " pod="calico-system/calico-kube-controllers-866f8d6d77-5wphw" Oct 13 06:54:13.885316 kubelet[3292]: I1013 06:54:13.885036 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/966b92e5-6493-4cbc-a333-cedc1d052bfa-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-bv4dr\" (UID: \"966b92e5-6493-4cbc-a333-cedc1d052bfa\") " pod="calico-system/goldmane-54d579b49d-bv4dr" Oct 13 06:54:13.885316 kubelet[3292]: I1013 06:54:13.885084 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8txh\" (UniqueName: 
\"kubernetes.io/projected/389641a2-f104-4e70-937e-4d07a8816bed-kube-api-access-j8txh\") pod \"coredns-668d6bf9bc-gpd7c\" (UID: \"389641a2-f104-4e70-937e-4d07a8816bed\") " pod="kube-system/coredns-668d6bf9bc-gpd7c" Oct 13 06:54:13.885316 kubelet[3292]: I1013 06:54:13.885135 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/966b92e5-6493-4cbc-a333-cedc1d052bfa-config\") pod \"goldmane-54d579b49d-bv4dr\" (UID: \"966b92e5-6493-4cbc-a333-cedc1d052bfa\") " pod="calico-system/goldmane-54d579b49d-bv4dr" Oct 13 06:54:13.885316 kubelet[3292]: I1013 06:54:13.885182 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/966b92e5-6493-4cbc-a333-cedc1d052bfa-goldmane-key-pair\") pod \"goldmane-54d579b49d-bv4dr\" (UID: \"966b92e5-6493-4cbc-a333-cedc1d052bfa\") " pod="calico-system/goldmane-54d579b49d-bv4dr" Oct 13 06:54:13.885892 kubelet[3292]: I1013 06:54:13.885251 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r9wf\" (UniqueName: \"kubernetes.io/projected/4bb65662-ba1f-4201-a746-530b80972c43-kube-api-access-5r9wf\") pod \"calico-apiserver-75cd76566-6brsl\" (UID: \"4bb65662-ba1f-4201-a746-530b80972c43\") " pod="calico-apiserver/calico-apiserver-75cd76566-6brsl" Oct 13 06:54:13.885892 kubelet[3292]: I1013 06:54:13.885343 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12f6bdc5-c572-4ad3-9971-792e82e94f2d-config-volume\") pod \"coredns-668d6bf9bc-bg2bt\" (UID: \"12f6bdc5-c572-4ad3-9971-792e82e94f2d\") " pod="kube-system/coredns-668d6bf9bc-bg2bt" Oct 13 06:54:13.885892 kubelet[3292]: I1013 06:54:13.885398 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5jd2\" (UniqueName: \"kubernetes.io/projected/270bd17b-94f3-4836-a4f6-8e717be90a35-kube-api-access-w5jd2\") pod \"calico-apiserver-75cd76566-rw7m9\" (UID: \"270bd17b-94f3-4836-a4f6-8e717be90a35\") " pod="calico-apiserver/calico-apiserver-75cd76566-rw7m9" Oct 13 06:54:13.885892 kubelet[3292]: I1013 06:54:13.885453 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvg87\" (UniqueName: \"kubernetes.io/projected/966b92e5-6493-4cbc-a333-cedc1d052bfa-kube-api-access-lvg87\") pod \"goldmane-54d579b49d-bv4dr\" (UID: \"966b92e5-6493-4cbc-a333-cedc1d052bfa\") " pod="calico-system/goldmane-54d579b49d-bv4dr" Oct 13 06:54:13.885892 kubelet[3292]: I1013 06:54:13.885511 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzmtm\" (UniqueName: \"kubernetes.io/projected/095e7c99-377a-4f2d-96a4-21b09a1d4a94-kube-api-access-gzmtm\") pod \"whisker-69694b5d67-pq9tr\" (UID: \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\") " pod="calico-system/whisker-69694b5d67-pq9tr" Oct 13 06:54:13.886452 kubelet[3292]: I1013 06:54:13.885558 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7mpt\" (UniqueName: \"kubernetes.io/projected/12f6bdc5-c572-4ad3-9971-792e82e94f2d-kube-api-access-n7mpt\") pod \"coredns-668d6bf9bc-bg2bt\" (UID: \"12f6bdc5-c572-4ad3-9971-792e82e94f2d\") " pod="kube-system/coredns-668d6bf9bc-bg2bt" Oct 13 06:54:13.886452 kubelet[3292]: 
I1013 06:54:13.885742 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55c626e4-fa08-4709-8ae8-a6e16ad60876-tigera-ca-bundle\") pod \"calico-kube-controllers-866f8d6d77-5wphw\" (UID: \"55c626e4-fa08-4709-8ae8-a6e16ad60876\") " pod="calico-system/calico-kube-controllers-866f8d6d77-5wphw" Oct 13 06:54:13.886452 kubelet[3292]: I1013 06:54:13.885841 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/389641a2-f104-4e70-937e-4d07a8816bed-config-volume\") pod \"coredns-668d6bf9bc-gpd7c\" (UID: \"389641a2-f104-4e70-937e-4d07a8816bed\") " pod="kube-system/coredns-668d6bf9bc-gpd7c" Oct 13 06:54:13.886452 kubelet[3292]: I1013 06:54:13.885904 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/270bd17b-94f3-4836-a4f6-8e717be90a35-calico-apiserver-certs\") pod \"calico-apiserver-75cd76566-rw7m9\" (UID: \"270bd17b-94f3-4836-a4f6-8e717be90a35\") " pod="calico-apiserver/calico-apiserver-75cd76566-rw7m9" Oct 13 06:54:13.886452 kubelet[3292]: I1013 06:54:13.885958 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4bb65662-ba1f-4201-a746-530b80972c43-calico-apiserver-certs\") pod \"calico-apiserver-75cd76566-6brsl\" (UID: \"4bb65662-ba1f-4201-a746-530b80972c43\") " pod="calico-apiserver/calico-apiserver-75cd76566-6brsl" Oct 13 06:54:14.122392 containerd[1925]: time="2025-10-13T06:54:14.122307437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bg2bt,Uid:12f6bdc5-c572-4ad3-9971-792e82e94f2d,Namespace:kube-system,Attempt:0,}" Oct 13 06:54:14.126654 containerd[1925]: time="2025-10-13T06:54:14.126636065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gpd7c,Uid:389641a2-f104-4e70-937e-4d07a8816bed,Namespace:kube-system,Attempt:0,}" Oct 13 06:54:14.133174 containerd[1925]: time="2025-10-13T06:54:14.133145410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-866f8d6d77-5wphw,Uid:55c626e4-fa08-4709-8ae8-a6e16ad60876,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:14.136535 containerd[1925]: time="2025-10-13T06:54:14.136514952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75cd76566-rw7m9,Uid:270bd17b-94f3-4836-a4f6-8e717be90a35,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:54:14.141071 containerd[1925]: time="2025-10-13T06:54:14.141044469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75cd76566-6brsl,Uid:4bb65662-ba1f-4201-a746-530b80972c43,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:54:14.144535 containerd[1925]: time="2025-10-13T06:54:14.144515465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bv4dr,Uid:966b92e5-6493-4cbc-a333-cedc1d052bfa,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:14.146944 containerd[1925]: time="2025-10-13T06:54:14.146922983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69694b5d67-pq9tr,Uid:095e7c99-377a-4f2d-96a4-21b09a1d4a94,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:14.149541 containerd[1925]: time="2025-10-13T06:54:14.149506308Z" level=error msg="Failed to destroy network for sandbox 
\"ad15d6a229d23c048702c7aee06385e90820bfd87ffbecca04ceaf58714757bf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.152836 containerd[1925]: time="2025-10-13T06:54:14.150954676Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bg2bt,Uid:12f6bdc5-c572-4ad3-9971-792e82e94f2d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad15d6a229d23c048702c7aee06385e90820bfd87ffbecca04ceaf58714757bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.153210 kubelet[3292]: E1013 06:54:14.153174 3292 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad15d6a229d23c048702c7aee06385e90820bfd87ffbecca04ceaf58714757bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.153294 kubelet[3292]: E1013 06:54:14.153239 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad15d6a229d23c048702c7aee06385e90820bfd87ffbecca04ceaf58714757bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bg2bt" Oct 13 06:54:14.153294 kubelet[3292]: E1013 06:54:14.153260 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad15d6a229d23c048702c7aee06385e90820bfd87ffbecca04ceaf58714757bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bg2bt" Oct 13 06:54:14.153362 kubelet[3292]: E1013 06:54:14.153307 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-bg2bt_kube-system(12f6bdc5-c572-4ad3-9971-792e82e94f2d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-bg2bt_kube-system(12f6bdc5-c572-4ad3-9971-792e82e94f2d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad15d6a229d23c048702c7aee06385e90820bfd87ffbecca04ceaf58714757bf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-bg2bt" podUID="12f6bdc5-c572-4ad3-9971-792e82e94f2d" Oct 13 06:54:14.154435 systemd[1]: run-netns-cni\x2de692f661\x2d0de8\x2d86e3\x2d0496\x2d81d02f901062.mount: Deactivated successfully. 
Oct 13 06:54:14.155814 containerd[1925]: time="2025-10-13T06:54:14.155770590Z" level=error msg="Failed to destroy network for sandbox \"fca1fd9a0be933668ff819603f513e3ac19cb757f3e8f1826418cab66fa9d132\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.156335 containerd[1925]: time="2025-10-13T06:54:14.156314676Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gpd7c,Uid:389641a2-f104-4e70-937e-4d07a8816bed,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fca1fd9a0be933668ff819603f513e3ac19cb757f3e8f1826418cab66fa9d132\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.156514 kubelet[3292]: E1013 06:54:14.156494 3292 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fca1fd9a0be933668ff819603f513e3ac19cb757f3e8f1826418cab66fa9d132\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.156553 kubelet[3292]: E1013 06:54:14.156533 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fca1fd9a0be933668ff819603f513e3ac19cb757f3e8f1826418cab66fa9d132\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gpd7c" Oct 13 06:54:14.156553 kubelet[3292]: E1013 06:54:14.156549 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fca1fd9a0be933668ff819603f513e3ac19cb757f3e8f1826418cab66fa9d132\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gpd7c" Oct 13 06:54:14.156628 kubelet[3292]: E1013 06:54:14.156582 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-gpd7c_kube-system(389641a2-f104-4e70-937e-4d07a8816bed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gpd7c_kube-system(389641a2-f104-4e70-937e-4d07a8816bed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fca1fd9a0be933668ff819603f513e3ac19cb757f3e8f1826418cab66fa9d132\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gpd7c" podUID="389641a2-f104-4e70-937e-4d07a8816bed" Oct 13 06:54:14.173240 containerd[1925]: time="2025-10-13T06:54:14.173207387Z" level=error msg="Failed to destroy network for sandbox \"bdaa2c6273f0982cd18c2717523be58265026f07b7c2ab5f31a41849b6c79008\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Oct 13 06:54:14.173629 containerd[1925]: time="2025-10-13T06:54:14.173611550Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-866f8d6d77-5wphw,Uid:55c626e4-fa08-4709-8ae8-a6e16ad60876,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bdaa2c6273f0982cd18c2717523be58265026f07b7c2ab5f31a41849b6c79008\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.173788 kubelet[3292]: E1013 06:54:14.173766 3292 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bdaa2c6273f0982cd18c2717523be58265026f07b7c2ab5f31a41849b6c79008\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.173824 kubelet[3292]: E1013 06:54:14.173806 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bdaa2c6273f0982cd18c2717523be58265026f07b7c2ab5f31a41849b6c79008\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-866f8d6d77-5wphw" Oct 13 06:54:14.173824 kubelet[3292]: E1013 06:54:14.173820 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bdaa2c6273f0982cd18c2717523be58265026f07b7c2ab5f31a41849b6c79008\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-866f8d6d77-5wphw" Oct 13 06:54:14.173863 kubelet[3292]: E1013 06:54:14.173849 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-866f8d6d77-5wphw_calico-system(55c626e4-fa08-4709-8ae8-a6e16ad60876)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-866f8d6d77-5wphw_calico-system(55c626e4-fa08-4709-8ae8-a6e16ad60876)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bdaa2c6273f0982cd18c2717523be58265026f07b7c2ab5f31a41849b6c79008\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-866f8d6d77-5wphw" podUID="55c626e4-fa08-4709-8ae8-a6e16ad60876" Oct 13 06:54:14.174152 containerd[1925]: time="2025-10-13T06:54:14.174127166Z" level=error msg="Failed to destroy network for sandbox \"88313343e5c85708462929f054163a229415f0c633ad4c03519decd28c0fae74\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.174543 containerd[1925]: time="2025-10-13T06:54:14.174527384Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-75cd76566-rw7m9,Uid:270bd17b-94f3-4836-a4f6-8e717be90a35,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"88313343e5c85708462929f054163a229415f0c633ad4c03519decd28c0fae74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.174621 kubelet[3292]: E1013 06:54:14.174606 3292 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88313343e5c85708462929f054163a229415f0c633ad4c03519decd28c0fae74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.174648 kubelet[3292]: E1013 06:54:14.174632 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88313343e5c85708462929f054163a229415f0c633ad4c03519decd28c0fae74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75cd76566-rw7m9" Oct 13 06:54:14.174648 kubelet[3292]: E1013 06:54:14.174643 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88313343e5c85708462929f054163a229415f0c633ad4c03519decd28c0fae74\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75cd76566-rw7m9" Oct 13 06:54:14.174704 kubelet[3292]: E1013 06:54:14.174671 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75cd76566-rw7m9_calico-apiserver(270bd17b-94f3-4836-a4f6-8e717be90a35)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75cd76566-rw7m9_calico-apiserver(270bd17b-94f3-4836-a4f6-8e717be90a35)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88313343e5c85708462929f054163a229415f0c633ad4c03519decd28c0fae74\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-75cd76566-rw7m9" podUID="270bd17b-94f3-4836-a4f6-8e717be90a35" Oct 13 06:54:14.176001 containerd[1925]: time="2025-10-13T06:54:14.175984243Z" level=error msg="Failed to destroy network for sandbox \"9d7d19e8bb7f3d2ef46a156afbf8c610859fc8155ed1c582fb9a753842d9593f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.176417 containerd[1925]: time="2025-10-13T06:54:14.176397316Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bv4dr,Uid:966b92e5-6493-4cbc-a333-cedc1d052bfa,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9d7d19e8bb7f3d2ef46a156afbf8c610859fc8155ed1c582fb9a753842d9593f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.176538 kubelet[3292]: E1013 06:54:14.176517 3292 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d7d19e8bb7f3d2ef46a156afbf8c610859fc8155ed1c582fb9a753842d9593f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.176584 kubelet[3292]: E1013 06:54:14.176553 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d7d19e8bb7f3d2ef46a156afbf8c610859fc8155ed1c582fb9a753842d9593f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bv4dr" Oct 13 06:54:14.176584 kubelet[3292]: E1013 06:54:14.176572 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d7d19e8bb7f3d2ef46a156afbf8c610859fc8155ed1c582fb9a753842d9593f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-bv4dr" Oct 13 06:54:14.176630 kubelet[3292]: E1013 06:54:14.176607 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-bv4dr_calico-system(966b92e5-6493-4cbc-a333-cedc1d052bfa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-bv4dr_calico-system(966b92e5-6493-4cbc-a333-cedc1d052bfa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d7d19e8bb7f3d2ef46a156afbf8c610859fc8155ed1c582fb9a753842d9593f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-bv4dr" podUID="966b92e5-6493-4cbc-a333-cedc1d052bfa" Oct 13 06:54:14.176877 containerd[1925]: time="2025-10-13T06:54:14.176856959Z" level=error msg="Failed to destroy network for sandbox \"12ca249c989593a25eb41a4c738fc34d597ecca32b48aa23d83116322a9a7688\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.177229 containerd[1925]: time="2025-10-13T06:54:14.177210183Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75cd76566-6brsl,Uid:4bb65662-ba1f-4201-a746-530b80972c43,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ca249c989593a25eb41a4c738fc34d597ecca32b48aa23d83116322a9a7688\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.177298 kubelet[3292]: E1013 06:54:14.177287 3292 log.go:32] "RunPodSandbox 
from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ca249c989593a25eb41a4c738fc34d597ecca32b48aa23d83116322a9a7688\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.177323 kubelet[3292]: E1013 06:54:14.177306 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ca249c989593a25eb41a4c738fc34d597ecca32b48aa23d83116322a9a7688\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75cd76566-6brsl" Oct 13 06:54:14.177323 kubelet[3292]: E1013 06:54:14.177316 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12ca249c989593a25eb41a4c738fc34d597ecca32b48aa23d83116322a9a7688\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75cd76566-6brsl" Oct 13 06:54:14.177359 kubelet[3292]: E1013 06:54:14.177335 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75cd76566-6brsl_calico-apiserver(4bb65662-ba1f-4201-a746-530b80972c43)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75cd76566-6brsl_calico-apiserver(4bb65662-ba1f-4201-a746-530b80972c43)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12ca249c989593a25eb41a4c738fc34d597ecca32b48aa23d83116322a9a7688\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-75cd76566-6brsl" podUID="4bb65662-ba1f-4201-a746-530b80972c43" Oct 13 06:54:14.177610 containerd[1925]: time="2025-10-13T06:54:14.177592066Z" level=error msg="Failed to destroy network for sandbox \"39c7c7543e59827ff46f5e9eb955329dc1049583ef20c9cac25dbb9e7fd33d47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.177904 containerd[1925]: time="2025-10-13T06:54:14.177890338Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69694b5d67-pq9tr,Uid:095e7c99-377a-4f2d-96a4-21b09a1d4a94,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c7c7543e59827ff46f5e9eb955329dc1049583ef20c9cac25dbb9e7fd33d47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.177962 kubelet[3292]: E1013 06:54:14.177949 3292 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c7c7543e59827ff46f5e9eb955329dc1049583ef20c9cac25dbb9e7fd33d47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.177982 kubelet[3292]: E1013 06:54:14.177969 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c7c7543e59827ff46f5e9eb955329dc1049583ef20c9cac25dbb9e7fd33d47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-69694b5d67-pq9tr" Oct 13 06:54:14.177982 kubelet[3292]: E1013 06:54:14.177979 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c7c7543e59827ff46f5e9eb955329dc1049583ef20c9cac25dbb9e7fd33d47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-69694b5d67-pq9tr" Oct 13 06:54:14.178017 kubelet[3292]: E1013 06:54:14.177995 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-69694b5d67-pq9tr_calico-system(095e7c99-377a-4f2d-96a4-21b09a1d4a94)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-69694b5d67-pq9tr_calico-system(095e7c99-377a-4f2d-96a4-21b09a1d4a94)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"39c7c7543e59827ff46f5e9eb955329dc1049583ef20c9cac25dbb9e7fd33d47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-69694b5d67-pq9tr" podUID="095e7c99-377a-4f2d-96a4-21b09a1d4a94" Oct 13 06:54:14.409863 systemd[1]: Created slice kubepods-besteffort-pod168acdb3_8fc1_4f19_b797_a3b8ada8a829.slice - libcontainer container kubepods-besteffort-pod168acdb3_8fc1_4f19_b797_a3b8ada8a829.slice. 
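Every sandbox failure in the burst above reduces to the same root cause: the Calico CNI plugin cannot stat /var/lib/calico/nodename, the file the calico/node container writes once it has started and mounted /var/lib/calico/. Until that file exists, every CNI add or delete on this node fails with the error quoted in the messages. A minimal illustrative sketch of that check, assuming nothing beyond the path quoted in the errors (this is not the plugin's actual code):

package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

func main() {
	// Path quoted in every failure above; calico/node creates it after start-up.
	const nodenameFile = "/var/lib/calico/nodename"

	name, err := os.ReadFile(nodenameFile)
	switch {
	case err == nil:
		fmt.Printf("calico/node has initialised this host as %q\n", string(name))
	case errors.Is(err, fs.ErrNotExist):
		// The condition behind "stat /var/lib/calico/nodename: no such file or directory".
		fmt.Println("nodename file missing: calico/node not running yet, or /var/lib/calico/ not mounted")
	default:
		fmt.Printf("unexpected error: %v\n", err)
	}
}

Consistent with this, the failures stop further down in the log once the calico-node container image has been pulled and the container started.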
Oct 13 06:54:14.416043 containerd[1925]: time="2025-10-13T06:54:14.415969407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gbdt9,Uid:168acdb3-8fc1-4f19-b797-a3b8ada8a829,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:14.441202 containerd[1925]: time="2025-10-13T06:54:14.441154972Z" level=error msg="Failed to destroy network for sandbox \"72c5c2273537fcd2b017c94ecb7c148b8a1c09698a1da178c27686c16c1d6ba7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.441624 containerd[1925]: time="2025-10-13T06:54:14.441608006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gbdt9,Uid:168acdb3-8fc1-4f19-b797-a3b8ada8a829,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c5c2273537fcd2b017c94ecb7c148b8a1c09698a1da178c27686c16c1d6ba7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.441828 kubelet[3292]: E1013 06:54:14.441783 3292 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c5c2273537fcd2b017c94ecb7c148b8a1c09698a1da178c27686c16c1d6ba7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Oct 13 06:54:14.441828 kubelet[3292]: E1013 06:54:14.441817 3292 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c5c2273537fcd2b017c94ecb7c148b8a1c09698a1da178c27686c16c1d6ba7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gbdt9" Oct 13 06:54:14.441882 kubelet[3292]: E1013 06:54:14.441831 3292 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72c5c2273537fcd2b017c94ecb7c148b8a1c09698a1da178c27686c16c1d6ba7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gbdt9" Oct 13 06:54:14.441882 kubelet[3292]: E1013 06:54:14.441854 3292 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gbdt9_calico-system(168acdb3-8fc1-4f19-b797-a3b8ada8a829)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gbdt9_calico-system(168acdb3-8fc1-4f19-b797-a3b8ada8a829)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72c5c2273537fcd2b017c94ecb7c148b8a1c09698a1da178c27686c16c1d6ba7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gbdt9" podUID="168acdb3-8fc1-4f19-b797-a3b8ada8a829" Oct 13 06:54:14.486396 containerd[1925]: time="2025-10-13T06:54:14.486312977Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.30.3\"" Oct 13 06:54:15.118341 systemd[1]: run-netns-cni\x2d542226ca\x2d7794\x2d182c\x2d1f51\x2dd854c0455356.mount: Deactivated successfully. Oct 13 06:54:15.118405 systemd[1]: run-netns-cni\x2d28ab4843\x2d72c0\x2ddd90\x2debcd\x2da119d7e930f3.mount: Deactivated successfully. Oct 13 06:54:15.118453 systemd[1]: run-netns-cni\x2d5fd025f8\x2d785f\x2d0793\x2dfdeb\x2dd256f13ef799.mount: Deactivated successfully. Oct 13 06:54:15.118493 systemd[1]: run-netns-cni\x2d830154b3\x2d4e28\x2d7510\x2df6b8\x2def67e1906be9.mount: Deactivated successfully. Oct 13 06:54:15.118532 systemd[1]: run-netns-cni\x2d5838e0ef\x2db13c\x2d7703\x2df313\x2d2dc0fea1f25a.mount: Deactivated successfully. Oct 13 06:54:15.118569 systemd[1]: run-netns-cni\x2d3ad730f1\x2d8a05\x2d6887\x2da157\x2d042fbedf2eaa.mount: Deactivated successfully. Oct 13 06:54:20.060761 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1932116324.mount: Deactivated successfully. Oct 13 06:54:20.070733 containerd[1925]: time="2025-10-13T06:54:20.070683943Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:20.070946 containerd[1925]: time="2025-10-13T06:54:20.070901254Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Oct 13 06:54:20.071248 containerd[1925]: time="2025-10-13T06:54:20.071207070Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:20.071929 containerd[1925]: time="2025-10-13T06:54:20.071889486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:20.072273 containerd[1925]: time="2025-10-13T06:54:20.072232716Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 5.585849411s" Oct 13 06:54:20.072273 containerd[1925]: time="2025-10-13T06:54:20.072247714Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Oct 13 06:54:20.075593 containerd[1925]: time="2025-10-13T06:54:20.075576519Z" level=info msg="CreateContainer within sandbox \"127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Oct 13 06:54:20.086266 containerd[1925]: time="2025-10-13T06:54:20.086224323Z" level=info msg="Container e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:20.090477 containerd[1925]: time="2025-10-13T06:54:20.090433120Z" level=info msg="CreateContainer within sandbox \"127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\"" Oct 13 06:54:20.090728 containerd[1925]: time="2025-10-13T06:54:20.090691134Z" level=info msg="StartContainer for 
\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\"" Oct 13 06:54:20.091445 containerd[1925]: time="2025-10-13T06:54:20.091427813Z" level=info msg="connecting to shim e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac" address="unix:///run/containerd/s/23298c5bff17e46006b89fa64715a827f9477d580fa30fd696cefb0aa6f3d155" protocol=ttrpc version=3 Oct 13 06:54:20.114934 systemd[1]: Started cri-containerd-e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac.scope - libcontainer container e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac. Oct 13 06:54:20.139829 containerd[1925]: time="2025-10-13T06:54:20.139805878Z" level=info msg="StartContainer for \"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" returns successfully" Oct 13 06:54:20.203202 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Oct 13 06:54:20.203256 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Oct 13 06:54:20.331956 kubelet[3292]: I1013 06:54:20.331910 3292 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-ca-bundle\") pod \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\" (UID: \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\") " Oct 13 06:54:20.331956 kubelet[3292]: I1013 06:54:20.331938 3292 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-backend-key-pair\") pod \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\" (UID: \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\") " Oct 13 06:54:20.331956 kubelet[3292]: I1013 06:54:20.331955 3292 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzmtm\" (UniqueName: \"kubernetes.io/projected/095e7c99-377a-4f2d-96a4-21b09a1d4a94-kube-api-access-gzmtm\") pod \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\" (UID: \"095e7c99-377a-4f2d-96a4-21b09a1d4a94\") " Oct 13 06:54:20.332273 kubelet[3292]: I1013 06:54:20.332175 3292 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "095e7c99-377a-4f2d-96a4-21b09a1d4a94" (UID: "095e7c99-377a-4f2d-96a4-21b09a1d4a94"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Oct 13 06:54:20.333842 kubelet[3292]: I1013 06:54:20.333789 3292 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/095e7c99-377a-4f2d-96a4-21b09a1d4a94-kube-api-access-gzmtm" (OuterVolumeSpecName: "kube-api-access-gzmtm") pod "095e7c99-377a-4f2d-96a4-21b09a1d4a94" (UID: "095e7c99-377a-4f2d-96a4-21b09a1d4a94"). InnerVolumeSpecName "kube-api-access-gzmtm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Oct 13 06:54:20.333902 kubelet[3292]: I1013 06:54:20.333847 3292 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "095e7c99-377a-4f2d-96a4-21b09a1d4a94" (UID: "095e7c99-377a-4f2d-96a4-21b09a1d4a94"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Oct 13 06:54:20.409137 systemd[1]: Removed slice kubepods-besteffort-pod095e7c99_377a_4f2d_96a4_21b09a1d4a94.slice - libcontainer container kubepods-besteffort-pod095e7c99_377a_4f2d_96a4_21b09a1d4a94.slice. Oct 13 06:54:20.432427 kubelet[3292]: I1013 06:54:20.432318 3292 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-backend-key-pair\") on node \"ci-4459.1.0-a-3e5fd6a38a\" DevicePath \"\"" Oct 13 06:54:20.432427 kubelet[3292]: I1013 06:54:20.432380 3292 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gzmtm\" (UniqueName: \"kubernetes.io/projected/095e7c99-377a-4f2d-96a4-21b09a1d4a94-kube-api-access-gzmtm\") on node \"ci-4459.1.0-a-3e5fd6a38a\" DevicePath \"\"" Oct 13 06:54:20.432427 kubelet[3292]: I1013 06:54:20.432410 3292 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/095e7c99-377a-4f2d-96a4-21b09a1d4a94-whisker-ca-bundle\") on node \"ci-4459.1.0-a-3e5fd6a38a\" DevicePath \"\"" Oct 13 06:54:20.551925 kubelet[3292]: I1013 06:54:20.551808 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-jw4g9" podStartSLOduration=1.646560207 podStartE2EDuration="17.551770813s" podCreationTimestamp="2025-10-13 06:54:03 +0000 UTC" firstStartedPulling="2025-10-13 06:54:04.167387005 +0000 UTC m=+15.822081052" lastFinishedPulling="2025-10-13 06:54:20.072597611 +0000 UTC m=+31.727291658" observedRunningTime="2025-10-13 06:54:20.550564929 +0000 UTC m=+32.205259052" watchObservedRunningTime="2025-10-13 06:54:20.551770813 +0000 UTC m=+32.206464905" Oct 13 06:54:20.605095 systemd[1]: Created slice kubepods-besteffort-podc81a31db_2c78_422e_8e9d_bbb1b218c4e1.slice - libcontainer container kubepods-besteffort-podc81a31db_2c78_422e_8e9d_bbb1b218c4e1.slice. 
Oct 13 06:54:20.634079 kubelet[3292]: I1013 06:54:20.634029 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c81a31db-2c78-422e-8e9d-bbb1b218c4e1-whisker-backend-key-pair\") pod \"whisker-cdbf5c4f4-n7ppl\" (UID: \"c81a31db-2c78-422e-8e9d-bbb1b218c4e1\") " pod="calico-system/whisker-cdbf5c4f4-n7ppl" Oct 13 06:54:20.634226 kubelet[3292]: I1013 06:54:20.634113 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68gwz\" (UniqueName: \"kubernetes.io/projected/c81a31db-2c78-422e-8e9d-bbb1b218c4e1-kube-api-access-68gwz\") pod \"whisker-cdbf5c4f4-n7ppl\" (UID: \"c81a31db-2c78-422e-8e9d-bbb1b218c4e1\") " pod="calico-system/whisker-cdbf5c4f4-n7ppl" Oct 13 06:54:20.634226 kubelet[3292]: I1013 06:54:20.634147 3292 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c81a31db-2c78-422e-8e9d-bbb1b218c4e1-whisker-ca-bundle\") pod \"whisker-cdbf5c4f4-n7ppl\" (UID: \"c81a31db-2c78-422e-8e9d-bbb1b218c4e1\") " pod="calico-system/whisker-cdbf5c4f4-n7ppl" Oct 13 06:54:20.911100 containerd[1925]: time="2025-10-13T06:54:20.910898403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-cdbf5c4f4-n7ppl,Uid:c81a31db-2c78-422e-8e9d-bbb1b218c4e1,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:20.971717 systemd-networkd[1700]: cali30465d86240: Link UP Oct 13 06:54:20.971900 systemd-networkd[1700]: cali30465d86240: Gained carrier Oct 13 06:54:20.979556 containerd[1925]: 2025-10-13 06:54:20.922 [INFO][4713] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:20.979556 containerd[1925]: 2025-10-13 06:54:20.930 [INFO][4713] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0 whisker-cdbf5c4f4- calico-system c81a31db-2c78-422e-8e9d-bbb1b218c4e1 884 0 2025-10-13 06:54:20 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:cdbf5c4f4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a whisker-cdbf5c4f4-n7ppl eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali30465d86240 [] [] }} ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-" Oct 13 06:54:20.979556 containerd[1925]: 2025-10-13 06:54:20.930 [INFO][4713] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" Oct 13 06:54:20.979556 containerd[1925]: 2025-10-13 06:54:20.943 [INFO][4737] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" HandleID="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.944 [INFO][4737] ipam/ipam_plugin.go 265: Auto assigning 
IP ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" HandleID="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039c320), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"whisker-cdbf5c4f4-n7ppl", "timestamp":"2025-10-13 06:54:20.943894763 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.944 [INFO][4737] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.944 [INFO][4737] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.944 [INFO][4737] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.948 [INFO][4737] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.951 [INFO][4737] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.954 [INFO][4737] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.955 [INFO][4737] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.979779 containerd[1925]: 2025-10-13 06:54:20.957 [INFO][4737] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.980046 containerd[1925]: 2025-10-13 06:54:20.957 [INFO][4737] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.980046 containerd[1925]: 2025-10-13 06:54:20.958 [INFO][4737] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb Oct 13 06:54:20.980046 containerd[1925]: 2025-10-13 06:54:20.961 [INFO][4737] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.980046 containerd[1925]: 2025-10-13 06:54:20.964 [INFO][4737] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.1/26] block=192.168.71.0/26 handle="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.980046 containerd[1925]: 2025-10-13 06:54:20.964 [INFO][4737] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.71.1/26] handle="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:20.980046 containerd[1925]: 2025-10-13 06:54:20.964 [INFO][4737] ipam/ipam_plugin.go 
374: Released host-wide IPAM lock. Oct 13 06:54:20.980046 containerd[1925]: 2025-10-13 06:54:20.964 [INFO][4737] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.1/26] IPv6=[] ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" HandleID="k8s-pod-network.e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" Oct 13 06:54:20.980238 containerd[1925]: 2025-10-13 06:54:20.966 [INFO][4713] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0", GenerateName:"whisker-cdbf5c4f4-", Namespace:"calico-system", SelfLink:"", UID:"c81a31db-2c78-422e-8e9d-bbb1b218c4e1", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"cdbf5c4f4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"whisker-cdbf5c4f4-n7ppl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.71.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali30465d86240", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:20.980238 containerd[1925]: 2025-10-13 06:54:20.966 [INFO][4713] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.1/32] ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" Oct 13 06:54:20.980334 containerd[1925]: 2025-10-13 06:54:20.966 [INFO][4713] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali30465d86240 ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" Oct 13 06:54:20.980334 containerd[1925]: 2025-10-13 06:54:20.972 [INFO][4713] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" Oct 13 06:54:20.980396 containerd[1925]: 2025-10-13 06:54:20.972 [INFO][4713] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" 
Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0", GenerateName:"whisker-cdbf5c4f4-", Namespace:"calico-system", SelfLink:"", UID:"c81a31db-2c78-422e-8e9d-bbb1b218c4e1", ResourceVersion:"884", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"cdbf5c4f4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb", Pod:"whisker-cdbf5c4f4-n7ppl", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.71.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali30465d86240", MAC:"8a:d9:cc:fb:98:d7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:20.980465 containerd[1925]: 2025-10-13 06:54:20.978 [INFO][4713] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" Namespace="calico-system" Pod="whisker-cdbf5c4f4-n7ppl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-whisker--cdbf5c4f4--n7ppl-eth0" Oct 13 06:54:21.006274 containerd[1925]: time="2025-10-13T06:54:21.006222005Z" level=info msg="connecting to shim e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb" address="unix:///run/containerd/s/c761e6009e76cc6ad7ce7fe724627a47e8cc729786d0fe4209cb3e56a1c9f570" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:21.031850 systemd[1]: Started cri-containerd-e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb.scope - libcontainer container e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb. Oct 13 06:54:21.073287 systemd[1]: var-lib-kubelet-pods-095e7c99\x2d377a\x2d4f2d\x2d96a4\x2d21b09a1d4a94-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dgzmtm.mount: Deactivated successfully. Oct 13 06:54:21.073365 systemd[1]: var-lib-kubelet-pods-095e7c99\x2d377a\x2d4f2d\x2d96a4\x2d21b09a1d4a94-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Oct 13 06:54:21.073484 containerd[1925]: time="2025-10-13T06:54:21.073398549Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-cdbf5c4f4-n7ppl,Uid:c81a31db-2c78-422e-8e9d-bbb1b218c4e1,Namespace:calico-system,Attempt:0,} returns sandbox id \"e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb\"" Oct 13 06:54:21.074043 containerd[1925]: time="2025-10-13T06:54:21.074032299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Oct 13 06:54:21.633358 containerd[1925]: time="2025-10-13T06:54:21.633331554Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"d0598055d00366d086fcba47baf95e1771e42cfe894c270c4b85e6a6a1ee44f2\" pid:4945 exit_status:1 exited_at:{seconds:1760338461 nanos:633137964}" Oct 13 06:54:22.395204 kubelet[3292]: I1013 06:54:22.395183 3292 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="095e7c99-377a-4f2d-96a4-21b09a1d4a94" path="/var/lib/kubelet/pods/095e7c99-377a-4f2d-96a4-21b09a1d4a94/volumes" Oct 13 06:54:22.464105 systemd-networkd[1700]: cali30465d86240: Gained IPv6LL Oct 13 06:54:22.571053 containerd[1925]: time="2025-10-13T06:54:22.570995916Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"21b67a23c77df779010e7132f38ebc83e797bfb03e0138775e11ef0ccd928959\" pid:5037 exit_status:1 exited_at:{seconds:1760338462 nanos:570796383}" Oct 13 06:54:23.003672 containerd[1925]: time="2025-10-13T06:54:23.003640656Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:23.003911 containerd[1925]: time="2025-10-13T06:54:23.003898362Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Oct 13 06:54:23.004228 containerd[1925]: time="2025-10-13T06:54:23.004214003Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:23.005149 containerd[1925]: time="2025-10-13T06:54:23.005135667Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:23.005872 containerd[1925]: time="2025-10-13T06:54:23.005859909Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.931812868s" Oct 13 06:54:23.005913 containerd[1925]: time="2025-10-13T06:54:23.005875037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Oct 13 06:54:23.006966 containerd[1925]: time="2025-10-13T06:54:23.006954188Z" level=info msg="CreateContainer within sandbox \"e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Oct 13 06:54:23.009468 containerd[1925]: time="2025-10-13T06:54:23.009457135Z" level=info msg="Container 
053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:23.012342 containerd[1925]: time="2025-10-13T06:54:23.012330615Z" level=info msg="CreateContainer within sandbox \"e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524\"" Oct 13 06:54:23.012594 containerd[1925]: time="2025-10-13T06:54:23.012582136Z" level=info msg="StartContainer for \"053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524\"" Oct 13 06:54:23.013093 containerd[1925]: time="2025-10-13T06:54:23.013081779Z" level=info msg="connecting to shim 053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524" address="unix:///run/containerd/s/c761e6009e76cc6ad7ce7fe724627a47e8cc729786d0fe4209cb3e56a1c9f570" protocol=ttrpc version=3 Oct 13 06:54:23.037977 systemd[1]: Started cri-containerd-053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524.scope - libcontainer container 053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524. Oct 13 06:54:23.064737 containerd[1925]: time="2025-10-13T06:54:23.064713393Z" level=info msg="StartContainer for \"053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524\" returns successfully" Oct 13 06:54:23.065247 containerd[1925]: time="2025-10-13T06:54:23.065235819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Oct 13 06:54:25.402714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3096483982.mount: Deactivated successfully. Oct 13 06:54:25.406997 containerd[1925]: time="2025-10-13T06:54:25.406977195Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:25.407216 containerd[1925]: time="2025-10-13T06:54:25.407203124Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Oct 13 06:54:25.407579 containerd[1925]: time="2025-10-13T06:54:25.407568570Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:25.408666 containerd[1925]: time="2025-10-13T06:54:25.408649423Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:25.408983 containerd[1925]: time="2025-10-13T06:54:25.408939954Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.343689247s" Oct 13 06:54:25.408983 containerd[1925]: time="2025-10-13T06:54:25.408955581Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Oct 13 06:54:25.409952 containerd[1925]: time="2025-10-13T06:54:25.409938777Z" level=info msg="CreateContainer within sandbox \"e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb\" for 
container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Oct 13 06:54:25.422917 containerd[1925]: time="2025-10-13T06:54:25.422877597Z" level=info msg="Container bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:25.425873 containerd[1925]: time="2025-10-13T06:54:25.425832915Z" level=info msg="CreateContainer within sandbox \"e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7\"" Oct 13 06:54:25.426052 containerd[1925]: time="2025-10-13T06:54:25.426007444Z" level=info msg="StartContainer for \"bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7\"" Oct 13 06:54:25.426555 containerd[1925]: time="2025-10-13T06:54:25.426511506Z" level=info msg="connecting to shim bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7" address="unix:///run/containerd/s/c761e6009e76cc6ad7ce7fe724627a47e8cc729786d0fe4209cb3e56a1c9f570" protocol=ttrpc version=3 Oct 13 06:54:25.441948 systemd[1]: Started cri-containerd-bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7.scope - libcontainer container bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7. Oct 13 06:54:25.468710 containerd[1925]: time="2025-10-13T06:54:25.468685032Z" level=info msg="StartContainer for \"bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7\" returns successfully" Oct 13 06:54:25.534969 kubelet[3292]: I1013 06:54:25.534929 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-cdbf5c4f4-n7ppl" podStartSLOduration=1.199492509 podStartE2EDuration="5.534918022s" podCreationTimestamp="2025-10-13 06:54:20 +0000 UTC" firstStartedPulling="2025-10-13 06:54:21.073940182 +0000 UTC m=+32.728634230" lastFinishedPulling="2025-10-13 06:54:25.409365699 +0000 UTC m=+37.064059743" observedRunningTime="2025-10-13 06:54:25.534492307 +0000 UTC m=+37.189186354" watchObservedRunningTime="2025-10-13 06:54:25.534918022 +0000 UTC m=+37.189612067" Oct 13 06:54:27.395684 containerd[1925]: time="2025-10-13T06:54:27.395565724Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-866f8d6d77-5wphw,Uid:55c626e4-fa08-4709-8ae8-a6e16ad60876,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:27.396765 containerd[1925]: time="2025-10-13T06:54:27.395587553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bg2bt,Uid:12f6bdc5-c572-4ad3-9971-792e82e94f2d,Namespace:kube-system,Attempt:0,}" Oct 13 06:54:27.456553 systemd-networkd[1700]: cali09e9341dd8d: Link UP Oct 13 06:54:27.456709 systemd-networkd[1700]: cali09e9341dd8d: Gained carrier Oct 13 06:54:27.461772 containerd[1925]: 2025-10-13 06:54:27.414 [INFO][5352] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:27.461772 containerd[1925]: 2025-10-13 06:54:27.420 [INFO][5352] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0 coredns-668d6bf9bc- kube-system 12f6bdc5-c572-4ad3-9971-792e82e94f2d 813 0 2025-10-13 06:53:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a coredns-668d6bf9bc-bg2bt eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] cali09e9341dd8d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-" Oct 13 06:54:27.461772 containerd[1925]: 2025-10-13 06:54:27.420 [INFO][5352] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" Oct 13 06:54:27.461772 containerd[1925]: 2025-10-13 06:54:27.433 [INFO][5386] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" HandleID="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.433 [INFO][5386] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" HandleID="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003459a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"coredns-668d6bf9bc-bg2bt", "timestamp":"2025-10-13 06:54:27.433581575 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.433 [INFO][5386] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.433 [INFO][5386] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.433 [INFO][5386] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.439 [INFO][5386] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.442 [INFO][5386] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.445 [INFO][5386] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.447 [INFO][5386] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.461916 containerd[1925]: 2025-10-13 06:54:27.448 [INFO][5386] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.462075 containerd[1925]: 2025-10-13 06:54:27.448 [INFO][5386] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.462075 containerd[1925]: 2025-10-13 06:54:27.449 [INFO][5386] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0 Oct 13 06:54:27.462075 containerd[1925]: 2025-10-13 06:54:27.452 [INFO][5386] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.462075 containerd[1925]: 2025-10-13 06:54:27.454 [INFO][5386] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.2/26] block=192.168.71.0/26 handle="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.462075 containerd[1925]: 2025-10-13 06:54:27.454 [INFO][5386] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.71.2/26] handle="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.462075 containerd[1925]: 2025-10-13 06:54:27.454 [INFO][5386] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
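The IPAM sequence for coredns-668d6bf9bc-bg2bt repeats the pattern seen for the whisker pod: acquire the host-wide IPAM lock, confirm the affinity for block 192.168.71.0/26, assign the next free address (192.168.71.2, after 192.168.71.1 went to the whisker endpoint), write the block, release the lock. A deliberately simplified, hypothetical sketch of that serialization, using a 64-slot bitmap per /26 block rather than Calico's real data structures:

package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// blockAllocator is a toy stand-in for a host-affine /26 IPAM block;
// the mutex plays the role of the host-wide IPAM lock seen in the log.
type blockAllocator struct {
	mu    sync.Mutex
	cidr  netip.Prefix
	inUse [64]bool // one slot per address in a /26
}

func (b *blockAllocator) assign() (netip.Addr, bool) {
	b.mu.Lock()         // "About to acquire host-wide IPAM lock."
	defer b.mu.Unlock() // "Released host-wide IPAM lock."

	addr := b.cidr.Addr()
	for i := range b.inUse {
		if i > 0 && !b.inUse[i] { // skip .0, matching the .1, .2, ... sequence in the log
			b.inUse[i] = true
			return addr, true
		}
		addr = addr.Next()
	}
	return netip.Addr{}, false
}

func main() {
	b := &blockAllocator{cidr: netip.MustParsePrefix("192.168.71.0/26")}
	a1, _ := b.assign()
	a2, _ := b.assign()
	fmt.Println(a1, a2) // 192.168.71.1 192.168.71.2, mirroring the two endpoints above
}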
Oct 13 06:54:27.462075 containerd[1925]: 2025-10-13 06:54:27.454 [INFO][5386] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.2/26] IPv6=[] ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" HandleID="k8s-pod-network.486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" Oct 13 06:54:27.462190 containerd[1925]: 2025-10-13 06:54:27.455 [INFO][5352] cni-plugin/k8s.go 418: Populated endpoint ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"12f6bdc5-c572-4ad3-9971-792e82e94f2d", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 53, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"coredns-668d6bf9bc-bg2bt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09e9341dd8d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:27.462190 containerd[1925]: 2025-10-13 06:54:27.455 [INFO][5352] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.2/32] ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" Oct 13 06:54:27.462190 containerd[1925]: 2025-10-13 06:54:27.455 [INFO][5352] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali09e9341dd8d ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" Oct 13 06:54:27.462190 containerd[1925]: 2025-10-13 06:54:27.456 [INFO][5352] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" Oct 13 06:54:27.462190 containerd[1925]: 2025-10-13 06:54:27.456 [INFO][5352] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"12f6bdc5-c572-4ad3-9971-792e82e94f2d", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 53, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0", Pod:"coredns-668d6bf9bc-bg2bt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali09e9341dd8d", MAC:"46:c6:4f:20:61:ae", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:27.462190 containerd[1925]: 2025-10-13 06:54:27.460 [INFO][5352] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" Namespace="kube-system" Pod="coredns-668d6bf9bc-bg2bt" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--bg2bt-eth0" Oct 13 06:54:27.469617 containerd[1925]: time="2025-10-13T06:54:27.469592872Z" level=info msg="connecting to shim 486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0" address="unix:///run/containerd/s/35d5fb9e12c9ecf73af42905bda745e0b1e5b0c205794e5d1889edbe9dc2c159" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:27.491799 systemd[1]: Started cri-containerd-486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0.scope - libcontainer container 486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0. 
Oct 13 06:54:27.525014 containerd[1925]: time="2025-10-13T06:54:27.524995512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bg2bt,Uid:12f6bdc5-c572-4ad3-9971-792e82e94f2d,Namespace:kube-system,Attempt:0,} returns sandbox id \"486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0\"" Oct 13 06:54:27.526084 containerd[1925]: time="2025-10-13T06:54:27.526071201Z" level=info msg="CreateContainer within sandbox \"486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 13 06:54:27.529171 containerd[1925]: time="2025-10-13T06:54:27.529158617Z" level=info msg="Container 8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:27.531260 containerd[1925]: time="2025-10-13T06:54:27.531219315Z" level=info msg="CreateContainer within sandbox \"486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772\"" Oct 13 06:54:27.531355 containerd[1925]: time="2025-10-13T06:54:27.531345698Z" level=info msg="StartContainer for \"8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772\"" Oct 13 06:54:27.531768 containerd[1925]: time="2025-10-13T06:54:27.531728863Z" level=info msg="connecting to shim 8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772" address="unix:///run/containerd/s/35d5fb9e12c9ecf73af42905bda745e0b1e5b0c205794e5d1889edbe9dc2c159" protocol=ttrpc version=3 Oct 13 06:54:27.548815 systemd[1]: Started cri-containerd-8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772.scope - libcontainer container 8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772. 
Oct 13 06:54:27.554220 systemd-networkd[1700]: cali163985708c3: Link UP Oct 13 06:54:27.554324 systemd-networkd[1700]: cali163985708c3: Gained carrier Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.413 [INFO][5342] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.420 [INFO][5342] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0 calico-kube-controllers-866f8d6d77- calico-system 55c626e4-fa08-4709-8ae8-a6e16ad60876 820 0 2025-10-13 06:54:04 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:866f8d6d77 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a calico-kube-controllers-866f8d6d77-5wphw eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali163985708c3 [] [] }} ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.420 [INFO][5342] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.433 [INFO][5388] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" HandleID="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.434 [INFO][5388] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" HandleID="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fbd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"calico-kube-controllers-866f8d6d77-5wphw", "timestamp":"2025-10-13 06:54:27.433961069 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.434 [INFO][5388] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.454 [INFO][5388] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.454 [INFO][5388] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.540 [INFO][5388] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.542 [INFO][5388] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.545 [INFO][5388] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.546 [INFO][5388] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.547 [INFO][5388] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.547 [INFO][5388] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.548 [INFO][5388] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0 Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.549 [INFO][5388] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.552 [INFO][5388] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.3/26] block=192.168.71.0/26 handle="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.552 [INFO][5388] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.71.3/26] handle="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.552 [INFO][5388] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:54:27.560419 containerd[1925]: 2025-10-13 06:54:27.552 [INFO][5388] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.3/26] IPv6=[] ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" HandleID="k8s-pod-network.ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" Oct 13 06:54:27.561045 containerd[1925]: 2025-10-13 06:54:27.553 [INFO][5342] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0", GenerateName:"calico-kube-controllers-866f8d6d77-", Namespace:"calico-system", SelfLink:"", UID:"55c626e4-fa08-4709-8ae8-a6e16ad60876", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"866f8d6d77", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"calico-kube-controllers-866f8d6d77-5wphw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali163985708c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:27.561045 containerd[1925]: 2025-10-13 06:54:27.553 [INFO][5342] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.3/32] ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" Oct 13 06:54:27.561045 containerd[1925]: 2025-10-13 06:54:27.553 [INFO][5342] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali163985708c3 ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" Oct 13 06:54:27.561045 containerd[1925]: 2025-10-13 06:54:27.554 [INFO][5342] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" 
WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" Oct 13 06:54:27.561045 containerd[1925]: 2025-10-13 06:54:27.554 [INFO][5342] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0", GenerateName:"calico-kube-controllers-866f8d6d77-", Namespace:"calico-system", SelfLink:"", UID:"55c626e4-fa08-4709-8ae8-a6e16ad60876", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"866f8d6d77", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0", Pod:"calico-kube-controllers-866f8d6d77-5wphw", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.71.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali163985708c3", MAC:"0a:98:db:13:b6:89", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:27.561045 containerd[1925]: 2025-10-13 06:54:27.559 [INFO][5342] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" Namespace="calico-system" Pod="calico-kube-controllers-866f8d6d77-5wphw" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--kube--controllers--866f8d6d77--5wphw-eth0" Oct 13 06:54:27.563183 containerd[1925]: time="2025-10-13T06:54:27.563161583Z" level=info msg="StartContainer for \"8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772\" returns successfully" Oct 13 06:54:27.570010 containerd[1925]: time="2025-10-13T06:54:27.569977079Z" level=info msg="connecting to shim ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0" address="unix:///run/containerd/s/40ba4b24ae49acb6ad77b761f22f9adeefb5bdfc4ff757aa2bd969fa00d57785" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:27.586827 systemd[1]: Started cri-containerd-ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0.scope - libcontainer container ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0. 
Oct 13 06:54:27.613206 containerd[1925]: time="2025-10-13T06:54:27.613187303Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-866f8d6d77-5wphw,Uid:55c626e4-fa08-4709-8ae8-a6e16ad60876,Namespace:calico-system,Attempt:0,} returns sandbox id \"ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0\"" Oct 13 06:54:27.613853 containerd[1925]: time="2025-10-13T06:54:27.613842443Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Oct 13 06:54:28.394621 containerd[1925]: time="2025-10-13T06:54:28.394549912Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bv4dr,Uid:966b92e5-6493-4cbc-a333-cedc1d052bfa,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:28.394706 containerd[1925]: time="2025-10-13T06:54:28.394549910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75cd76566-6brsl,Uid:4bb65662-ba1f-4201-a746-530b80972c43,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:54:28.395101 containerd[1925]: time="2025-10-13T06:54:28.395084788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75cd76566-rw7m9,Uid:270bd17b-94f3-4836-a4f6-8e717be90a35,Namespace:calico-apiserver,Attempt:0,}" Oct 13 06:54:28.395157 containerd[1925]: time="2025-10-13T06:54:28.395099869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gpd7c,Uid:389641a2-f104-4e70-937e-4d07a8816bed,Namespace:kube-system,Attempt:0,}" Oct 13 06:54:28.468257 systemd-networkd[1700]: cali502c43791d7: Link UP Oct 13 06:54:28.468387 systemd-networkd[1700]: cali502c43791d7: Gained carrier Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.414 [INFO][5608] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.421 [INFO][5608] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0 calico-apiserver-75cd76566- calico-apiserver 4bb65662-ba1f-4201-a746-530b80972c43 822 0 2025-10-13 06:54:01 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75cd76566 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a calico-apiserver-75cd76566-6brsl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali502c43791d7 [] [] }} ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.421 [INFO][5608] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5701] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" HandleID="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" 
Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5701] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" HandleID="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e79e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"calico-apiserver-75cd76566-6brsl", "timestamp":"2025-10-13 06:54:28.434461372 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5701] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5701] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5701] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.440 [INFO][5701] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.443 [INFO][5701] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.446 [INFO][5701] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.448 [INFO][5701] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.449 [INFO][5701] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.449 [INFO][5701] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.451 [INFO][5701] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79 Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.462 [INFO][5701] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.466 [INFO][5701] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.4/26] block=192.168.71.0/26 handle="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.466 [INFO][5701] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: 
[192.168.71.4/26] handle="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.466 [INFO][5701] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Oct 13 06:54:28.473309 containerd[1925]: 2025-10-13 06:54:28.466 [INFO][5701] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.4/26] IPv6=[] ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" HandleID="k8s-pod-network.fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" Oct 13 06:54:28.474172 containerd[1925]: 2025-10-13 06:54:28.467 [INFO][5608] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0", GenerateName:"calico-apiserver-75cd76566-", Namespace:"calico-apiserver", SelfLink:"", UID:"4bb65662-ba1f-4201-a746-530b80972c43", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75cd76566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"calico-apiserver-75cd76566-6brsl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali502c43791d7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.474172 containerd[1925]: 2025-10-13 06:54:28.467 [INFO][5608] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.4/32] ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" Oct 13 06:54:28.474172 containerd[1925]: 2025-10-13 06:54:28.467 [INFO][5608] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali502c43791d7 ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" Oct 13 06:54:28.474172 containerd[1925]: 2025-10-13 06:54:28.468 [INFO][5608] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" Oct 13 06:54:28.474172 containerd[1925]: 2025-10-13 06:54:28.468 [INFO][5608] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0", GenerateName:"calico-apiserver-75cd76566-", Namespace:"calico-apiserver", SelfLink:"", UID:"4bb65662-ba1f-4201-a746-530b80972c43", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75cd76566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79", Pod:"calico-apiserver-75cd76566-6brsl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali502c43791d7", MAC:"96:85:fb:db:7e:1c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.474172 containerd[1925]: 2025-10-13 06:54:28.472 [INFO][5608] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-6brsl" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--6brsl-eth0" Oct 13 06:54:28.480919 containerd[1925]: time="2025-10-13T06:54:28.480864573Z" level=info msg="connecting to shim fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79" address="unix:///run/containerd/s/b983ef786babf624e5d05912609a83b15a2f580802bc5375c92c6389865da583" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:28.502806 systemd[1]: Started cri-containerd-fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79.scope - libcontainer container fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79. 
Oct 13 06:54:28.532939 containerd[1925]: time="2025-10-13T06:54:28.532888568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75cd76566-6brsl,Uid:4bb65662-ba1f-4201-a746-530b80972c43,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79\"" Oct 13 06:54:28.539156 kubelet[3292]: I1013 06:54:28.539118 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-bg2bt" podStartSLOduration=34.539106847 podStartE2EDuration="34.539106847s" podCreationTimestamp="2025-10-13 06:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:54:28.538792427 +0000 UTC m=+40.193486474" watchObservedRunningTime="2025-10-13 06:54:28.539106847 +0000 UTC m=+40.193800892" Oct 13 06:54:28.554064 systemd-networkd[1700]: cali07ef52d40eb: Link UP Oct 13 06:54:28.554173 systemd-networkd[1700]: cali07ef52d40eb: Gained carrier Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.413 [INFO][5606] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.420 [INFO][5606] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0 goldmane-54d579b49d- calico-system 966b92e5-6493-4cbc-a333-cedc1d052bfa 819 0 2025-10-13 06:54:03 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a goldmane-54d579b49d-bv4dr eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali07ef52d40eb [] [] }} ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.420 [INFO][5606] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5692] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" HandleID="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5692] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" HandleID="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00071bd50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"goldmane-54d579b49d-bv4dr", "timestamp":"2025-10-13 06:54:28.434458651 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.434 [INFO][5692] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.466 [INFO][5692] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.466 [INFO][5692] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.539 [INFO][5692] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.543 [INFO][5692] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.545 [INFO][5692] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.546 [INFO][5692] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.547 [INFO][5692] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.547 [INFO][5692] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.548 [INFO][5692] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050 Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.550 [INFO][5692] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.552 [INFO][5692] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.5/26] block=192.168.71.0/26 handle="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.552 [INFO][5692] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.71.5/26] handle="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.552 [INFO][5692] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:54:28.558873 containerd[1925]: 2025-10-13 06:54:28.552 [INFO][5692] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.5/26] IPv6=[] ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" HandleID="k8s-pod-network.d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" Oct 13 06:54:28.559266 containerd[1925]: 2025-10-13 06:54:28.553 [INFO][5606] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"966b92e5-6493-4cbc-a333-cedc1d052bfa", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"goldmane-54d579b49d-bv4dr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.71.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali07ef52d40eb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.559266 containerd[1925]: 2025-10-13 06:54:28.553 [INFO][5606] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.5/32] ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" Oct 13 06:54:28.559266 containerd[1925]: 2025-10-13 06:54:28.553 [INFO][5606] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07ef52d40eb ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" Oct 13 06:54:28.559266 containerd[1925]: 2025-10-13 06:54:28.554 [INFO][5606] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" Oct 13 06:54:28.559266 containerd[1925]: 2025-10-13 06:54:28.554 [INFO][5606] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" 
Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"966b92e5-6493-4cbc-a333-cedc1d052bfa", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050", Pod:"goldmane-54d579b49d-bv4dr", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.71.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali07ef52d40eb", MAC:"7e:b1:24:83:4b:26", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.559266 containerd[1925]: 2025-10-13 06:54:28.558 [INFO][5606] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" Namespace="calico-system" Pod="goldmane-54d579b49d-bv4dr" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-goldmane--54d579b49d--bv4dr-eth0" Oct 13 06:54:28.566154 containerd[1925]: time="2025-10-13T06:54:28.566101692Z" level=info msg="connecting to shim d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050" address="unix:///run/containerd/s/e8f4b4505087dc8e1319531d411cfdec04d228451ac8c3c2556a8a6b9466c913" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:28.586950 systemd[1]: Started cri-containerd-d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050.scope - libcontainer container d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050. 
Oct 13 06:54:28.614545 containerd[1925]: time="2025-10-13T06:54:28.614520354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-bv4dr,Uid:966b92e5-6493-4cbc-a333-cedc1d052bfa,Namespace:calico-system,Attempt:0,} returns sandbox id \"d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050\"" Oct 13 06:54:28.656157 systemd-networkd[1700]: calibe10f51e2fd: Link UP Oct 13 06:54:28.656532 systemd-networkd[1700]: calibe10f51e2fd: Gained carrier Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.415 [INFO][5623] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.421 [INFO][5623] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0 calico-apiserver-75cd76566- calico-apiserver 270bd17b-94f3-4836-a4f6-8e717be90a35 817 0 2025-10-13 06:54:01 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75cd76566 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a calico-apiserver-75cd76566-rw7m9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibe10f51e2fd [] [] }} ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.421 [INFO][5623] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.436 [INFO][5694] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" HandleID="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.436 [INFO][5694] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" HandleID="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ce090), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"calico-apiserver-75cd76566-rw7m9", "timestamp":"2025-10-13 06:54:28.436145765 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.436 [INFO][5694] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.552 [INFO][5694] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.552 [INFO][5694] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.640 [INFO][5694] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.644 [INFO][5694] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.646 [INFO][5694] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.647 [INFO][5694] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.648 [INFO][5694] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.648 [INFO][5694] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.649 [INFO][5694] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7 Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.651 [INFO][5694] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.654 [INFO][5694] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.6/26] block=192.168.71.0/26 handle="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.654 [INFO][5694] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.71.6/26] handle="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.654 [INFO][5694] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:54:28.661313 containerd[1925]: 2025-10-13 06:54:28.654 [INFO][5694] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.6/26] IPv6=[] ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" HandleID="k8s-pod-network.c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" Oct 13 06:54:28.661741 containerd[1925]: 2025-10-13 06:54:28.655 [INFO][5623] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0", GenerateName:"calico-apiserver-75cd76566-", Namespace:"calico-apiserver", SelfLink:"", UID:"270bd17b-94f3-4836-a4f6-8e717be90a35", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75cd76566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"calico-apiserver-75cd76566-rw7m9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe10f51e2fd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.661741 containerd[1925]: 2025-10-13 06:54:28.655 [INFO][5623] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.6/32] ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" Oct 13 06:54:28.661741 containerd[1925]: 2025-10-13 06:54:28.655 [INFO][5623] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibe10f51e2fd ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" Oct 13 06:54:28.661741 containerd[1925]: 2025-10-13 06:54:28.656 [INFO][5623] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" Oct 13 06:54:28.661741 containerd[1925]: 2025-10-13 06:54:28.656 [INFO][5623] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0", GenerateName:"calico-apiserver-75cd76566-", Namespace:"calico-apiserver", SelfLink:"", UID:"270bd17b-94f3-4836-a4f6-8e717be90a35", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75cd76566", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7", Pod:"calico-apiserver-75cd76566-rw7m9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.71.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibe10f51e2fd", MAC:"86:45:2b:83:fd:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.661741 containerd[1925]: 2025-10-13 06:54:28.660 [INFO][5623] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" Namespace="calico-apiserver" Pod="calico-apiserver-75cd76566-rw7m9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-calico--apiserver--75cd76566--rw7m9-eth0" Oct 13 06:54:28.669096 containerd[1925]: time="2025-10-13T06:54:28.669041515Z" level=info msg="connecting to shim c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7" address="unix:///run/containerd/s/164a8c44d0548343e344480497af2f2b6f6a75a6666da2a8a4fb3e7a2e47ae5b" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:28.697099 systemd[1]: Started cri-containerd-c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7.scope - libcontainer container c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7. 
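Note: the endpoint written above gets the host-side interface name calibe10f51e2fd, the fixed cali prefix plus an 11-character suffix, which keeps the name inside the 15-byte Linux interface-name limit. A hedged sketch of deriving such a name from a hash of an endpoint identifier; the exact string Calico hashes is not visible in this log, so the input below is an assumption for illustration only:

package main

import (
	"crypto/sha1"
	"encoding/hex"
	"fmt"
)

// vethName builds a "cali" + 11-hex-character interface name. Linux caps
// interface names at 15 bytes, hence the truncated hash suffix.
func vethName(id string) string {
	sum := sha1.Sum([]byte(id))
	return "cali" + hex.EncodeToString(sum[:])[:11]
}

func main() {
	// Hypothetical input string: the real plugin hashes the workload
	// endpoint's own identifier, which is not spelled out in this log.
	fmt.Println(vethName("calico-apiserver/calico-apiserver-75cd76566-rw7m9"))
}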
Oct 13 06:54:28.750059 containerd[1925]: time="2025-10-13T06:54:28.750038732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75cd76566-rw7m9,Uid:270bd17b-94f3-4836-a4f6-8e717be90a35,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7\"" Oct 13 06:54:28.758983 systemd-networkd[1700]: caliaef526cfe1a: Link UP Oct 13 06:54:28.759166 systemd-networkd[1700]: caliaef526cfe1a: Gained carrier Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.413 [INFO][5634] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.421 [INFO][5634] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0 coredns-668d6bf9bc- kube-system 389641a2-f104-4e70-937e-4d07a8816bed 821 0 2025-10-13 06:53:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a coredns-668d6bf9bc-gpd7c eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliaef526cfe1a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.421 [INFO][5634] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.437 [INFO][5702] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" HandleID="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.437 [INFO][5702] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" HandleID="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e6b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"coredns-668d6bf9bc-gpd7c", "timestamp":"2025-10-13 06:54:28.437076375 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.437 [INFO][5702] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.654 [INFO][5702] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.654 [INFO][5702] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.741 [INFO][5702] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.745 [INFO][5702] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.748 [INFO][5702] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.749 [INFO][5702] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.751 [INFO][5702] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.751 [INFO][5702] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.752 [INFO][5702] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043 Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.754 [INFO][5702] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.757 [INFO][5702] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.7/26] block=192.168.71.0/26 handle="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.757 [INFO][5702] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.71.7/26] handle="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.757 [INFO][5702] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
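Note: as with the apiserver pod above (192.168.71.6), the coredns pod is assigned the next address from the same node-affine block, 192.168.71.7. A quick standard-library check that both claimed addresses fall inside the advertised 192.168.71.0/26 block:

package main

import (
	"fmt"
	"net"
)

func main() {
	// The block this node holds an affinity for, per the IPAM lines above.
	_, block, err := net.ParseCIDR("192.168.71.0/26")
	if err != nil {
		panic(err)
	}
	// Addresses claimed for the two pods handled in this stretch of the log.
	for _, s := range []string{"192.168.71.6", "192.168.71.7"} {
		ip := net.ParseIP(s)
		fmt.Printf("%s in %s: %v\n", ip, block, block.Contains(ip))
	}
}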
Oct 13 06:54:28.764054 containerd[1925]: 2025-10-13 06:54:28.757 [INFO][5702] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.7/26] IPv6=[] ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" HandleID="k8s-pod-network.44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" Oct 13 06:54:28.764443 containerd[1925]: 2025-10-13 06:54:28.758 [INFO][5634] cni-plugin/k8s.go 418: Populated endpoint ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"389641a2-f104-4e70-937e-4d07a8816bed", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 53, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"coredns-668d6bf9bc-gpd7c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliaef526cfe1a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.764443 containerd[1925]: 2025-10-13 06:54:28.758 [INFO][5634] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.7/32] ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" Oct 13 06:54:28.764443 containerd[1925]: 2025-10-13 06:54:28.758 [INFO][5634] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaef526cfe1a ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" Oct 13 06:54:28.764443 containerd[1925]: 2025-10-13 06:54:28.759 [INFO][5634] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" Oct 13 06:54:28.764443 containerd[1925]: 2025-10-13 06:54:28.759 [INFO][5634] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"389641a2-f104-4e70-937e-4d07a8816bed", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 53, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043", Pod:"coredns-668d6bf9bc-gpd7c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.71.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliaef526cfe1a", MAC:"12:33:83:c4:ff:d8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:28.764443 containerd[1925]: 2025-10-13 06:54:28.763 [INFO][5634] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpd7c" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-coredns--668d6bf9bc--gpd7c-eth0" Oct 13 06:54:28.771584 containerd[1925]: time="2025-10-13T06:54:28.771560598Z" level=info msg="connecting to shim 44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043" address="unix:///run/containerd/s/65a9bf297f7dfbddbd5f63c8539ff71faba447425570fb0e7069f7d21f284599" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:28.794091 systemd[1]: Started cri-containerd-44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043.scope - libcontainer container 44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043. 
Oct 13 06:54:28.799940 systemd-networkd[1700]: cali163985708c3: Gained IPv6LL Oct 13 06:54:28.877629 containerd[1925]: time="2025-10-13T06:54:28.877608698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gpd7c,Uid:389641a2-f104-4e70-937e-4d07a8816bed,Namespace:kube-system,Attempt:0,} returns sandbox id \"44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043\"" Oct 13 06:54:28.878622 containerd[1925]: time="2025-10-13T06:54:28.878609909Z" level=info msg="CreateContainer within sandbox \"44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Oct 13 06:54:28.881539 containerd[1925]: time="2025-10-13T06:54:28.881527254Z" level=info msg="Container b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:28.883545 containerd[1925]: time="2025-10-13T06:54:28.883508673Z" level=info msg="CreateContainer within sandbox \"44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff\"" Oct 13 06:54:28.883716 containerd[1925]: time="2025-10-13T06:54:28.883705081Z" level=info msg="StartContainer for \"b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff\"" Oct 13 06:54:28.884067 containerd[1925]: time="2025-10-13T06:54:28.884056212Z" level=info msg="connecting to shim b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff" address="unix:///run/containerd/s/65a9bf297f7dfbddbd5f63c8539ff71faba447425570fb0e7069f7d21f284599" protocol=ttrpc version=3 Oct 13 06:54:28.901976 systemd[1]: Started cri-containerd-b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff.scope - libcontainer container b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff. 
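Note: each "connecting to shim" line above names a unix socket under /run/containerd/s/ and the ttrpc protocol. The sketch below shows only the transport step, dialing that socket path with the Go standard library; the ttrpc framing containerd actually speaks on top is not reproduced here, and running this anywhere but on the node itself is expected to fail:

package main

import (
	"fmt"
	"net"
	"strings"
	"time"
)

func main() {
	// Shim address as logged by containerd; the unix:// scheme is stripped
	// to get the filesystem socket path before dialing.
	addr := "unix:///run/containerd/s/65a9bf297f7dfbddbd5f63c8539ff71faba447425570fb0e7069f7d21f284599"
	path := strings.TrimPrefix(addr, "unix://")

	conn, err := net.DialTimeout("unix", path, 2*time.Second)
	if err != nil {
		fmt.Println("dial failed (expected anywhere but on that node):", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to shim socket at", path)
}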
Oct 13 06:54:28.915108 containerd[1925]: time="2025-10-13T06:54:28.915053092Z" level=info msg="StartContainer for \"b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff\" returns successfully" Oct 13 06:54:28.928745 systemd-networkd[1700]: cali09e9341dd8d: Gained IPv6LL Oct 13 06:54:29.395372 containerd[1925]: time="2025-10-13T06:54:29.395258713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gbdt9,Uid:168acdb3-8fc1-4f19-b797-a3b8ada8a829,Namespace:calico-system,Attempt:0,}" Oct 13 06:54:29.404857 kubelet[3292]: I1013 06:54:29.404813 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:54:29.448772 systemd-networkd[1700]: calieeee1a42096: Link UP Oct 13 06:54:29.448895 systemd-networkd[1700]: calieeee1a42096: Gained carrier Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.415 [INFO][6063] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.420 [INFO][6063] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0 csi-node-driver- calico-system 168acdb3-8fc1-4f19-b797-a3b8ada8a829 704 0 2025-10-13 06:54:04 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459.1.0-a-3e5fd6a38a csi-node-driver-gbdt9 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calieeee1a42096 [] [] }} ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.420 [INFO][6063] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.432 [INFO][6084] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" HandleID="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.432 [INFO][6084] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" HandleID="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a5840), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459.1.0-a-3e5fd6a38a", "pod":"csi-node-driver-gbdt9", "timestamp":"2025-10-13 06:54:29.43263593 +0000 UTC"}, Hostname:"ci-4459.1.0-a-3e5fd6a38a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.432 [INFO][6084] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.432 [INFO][6084] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.432 [INFO][6084] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459.1.0-a-3e5fd6a38a' Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.436 [INFO][6084] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.438 [INFO][6084] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.439 [INFO][6084] ipam/ipam.go 511: Trying affinity for 192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.440 [INFO][6084] ipam/ipam.go 158: Attempting to load block cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.441 [INFO][6084] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.71.0/26 host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.441 [INFO][6084] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.71.0/26 handle="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.442 [INFO][6084] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.443 [INFO][6084] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.71.0/26 handle="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.446 [INFO][6084] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.71.8/26] block=192.168.71.0/26 handle="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.446 [INFO][6084] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.71.8/26] handle="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" host="ci-4459.1.0-a-3e5fd6a38a" Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.446 [INFO][6084] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Oct 13 06:54:29.453337 containerd[1925]: 2025-10-13 06:54:29.446 [INFO][6084] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.71.8/26] IPv6=[] ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" HandleID="k8s-pod-network.0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Workload="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" Oct 13 06:54:29.453739 containerd[1925]: 2025-10-13 06:54:29.447 [INFO][6063] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"168acdb3-8fc1-4f19-b797-a3b8ada8a829", ResourceVersion:"704", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"", Pod:"csi-node-driver-gbdt9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieeee1a42096", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:29.453739 containerd[1925]: 2025-10-13 06:54:29.447 [INFO][6063] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.71.8/32] ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" Oct 13 06:54:29.453739 containerd[1925]: 2025-10-13 06:54:29.447 [INFO][6063] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieeee1a42096 ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" Oct 13 06:54:29.453739 containerd[1925]: 2025-10-13 06:54:29.448 [INFO][6063] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" Oct 13 06:54:29.453739 containerd[1925]: 2025-10-13 06:54:29.449 [INFO][6063] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"168acdb3-8fc1-4f19-b797-a3b8ada8a829", ResourceVersion:"704", Generation:0, CreationTimestamp:time.Date(2025, time.October, 13, 6, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459.1.0-a-3e5fd6a38a", ContainerID:"0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a", Pod:"csi-node-driver-gbdt9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.71.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calieeee1a42096", MAC:"c2:41:0d:ad:bf:3c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Oct 13 06:54:29.453739 containerd[1925]: 2025-10-13 06:54:29.452 [INFO][6063] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" Namespace="calico-system" Pod="csi-node-driver-gbdt9" WorkloadEndpoint="ci--4459.1.0--a--3e5fd6a38a-k8s-csi--node--driver--gbdt9-eth0" Oct 13 06:54:29.460869 containerd[1925]: time="2025-10-13T06:54:29.460848804Z" level=info msg="connecting to shim 0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a" address="unix:///run/containerd/s/c0d26a27f1f1982423180b4d52f171e60e352237dd1bb31fa31ac27615ca1786" namespace=k8s.io protocol=ttrpc version=3 Oct 13 06:54:29.490103 systemd[1]: Started cri-containerd-0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a.scope - libcontainer container 0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a. 
Oct 13 06:54:29.504052 systemd-networkd[1700]: cali502c43791d7: Gained IPv6LL Oct 13 06:54:29.546310 containerd[1925]: time="2025-10-13T06:54:29.546285418Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gbdt9,Uid:168acdb3-8fc1-4f19-b797-a3b8ada8a829,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a\"" Oct 13 06:54:29.552132 kubelet[3292]: I1013 06:54:29.552102 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-gpd7c" podStartSLOduration=35.552089336 podStartE2EDuration="35.552089336s" podCreationTimestamp="2025-10-13 06:53:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 06:54:29.551764847 +0000 UTC m=+41.206458895" watchObservedRunningTime="2025-10-13 06:54:29.552089336 +0000 UTC m=+41.206783381" Oct 13 06:54:29.695757 systemd-networkd[1700]: calibe10f51e2fd: Gained IPv6LL Oct 13 06:54:29.889014 systemd-networkd[1700]: vxlan.calico: Link UP Oct 13 06:54:29.889018 systemd-networkd[1700]: vxlan.calico: Gained carrier Oct 13 06:54:30.080840 systemd-networkd[1700]: cali07ef52d40eb: Gained IPv6LL Oct 13 06:54:30.655787 systemd-networkd[1700]: caliaef526cfe1a: Gained IPv6LL Oct 13 06:54:30.752079 containerd[1925]: time="2025-10-13T06:54:30.752026889Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:30.752330 containerd[1925]: time="2025-10-13T06:54:30.752165986Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Oct 13 06:54:30.752600 containerd[1925]: time="2025-10-13T06:54:30.752587658Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:30.753482 containerd[1925]: time="2025-10-13T06:54:30.753439562Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:30.753862 containerd[1925]: time="2025-10-13T06:54:30.753821579Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.139964505s" Oct 13 06:54:30.753862 containerd[1925]: time="2025-10-13T06:54:30.753837210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Oct 13 06:54:30.754250 containerd[1925]: time="2025-10-13T06:54:30.754240287Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Oct 13 06:54:30.757212 containerd[1925]: time="2025-10-13T06:54:30.757171263Z" level=info msg="CreateContainer within sandbox \"ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Oct 13 06:54:30.760258 containerd[1925]: 
time="2025-10-13T06:54:30.760212533Z" level=info msg="Container 9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:30.763023 containerd[1925]: time="2025-10-13T06:54:30.763010119Z" level=info msg="CreateContainer within sandbox \"ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\"" Oct 13 06:54:30.763242 containerd[1925]: time="2025-10-13T06:54:30.763230634Z" level=info msg="StartContainer for \"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\"" Oct 13 06:54:30.763746 containerd[1925]: time="2025-10-13T06:54:30.763733888Z" level=info msg="connecting to shim 9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5" address="unix:///run/containerd/s/40ba4b24ae49acb6ad77b761f22f9adeefb5bdfc4ff757aa2bd969fa00d57785" protocol=ttrpc version=3 Oct 13 06:54:30.783223 systemd[1]: Started cri-containerd-9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5.scope - libcontainer container 9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5. Oct 13 06:54:30.865267 containerd[1925]: time="2025-10-13T06:54:30.865241469Z" level=info msg="StartContainer for \"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" returns successfully" Oct 13 06:54:31.231962 systemd-networkd[1700]: calieeee1a42096: Gained IPv6LL Oct 13 06:54:31.360020 systemd-networkd[1700]: vxlan.calico: Gained IPv6LL Oct 13 06:54:31.556214 kubelet[3292]: I1013 06:54:31.556080 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-866f8d6d77-5wphw" podStartSLOduration=24.415620131 podStartE2EDuration="27.556066929s" podCreationTimestamp="2025-10-13 06:54:04 +0000 UTC" firstStartedPulling="2025-10-13 06:54:27.613743507 +0000 UTC m=+39.268437555" lastFinishedPulling="2025-10-13 06:54:30.754190309 +0000 UTC m=+42.408884353" observedRunningTime="2025-10-13 06:54:31.55590248 +0000 UTC m=+43.210596528" watchObservedRunningTime="2025-10-13 06:54:31.556066929 +0000 UTC m=+43.210760978" Oct 13 06:54:32.549901 kubelet[3292]: I1013 06:54:32.549886 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:54:34.396118 containerd[1925]: time="2025-10-13T06:54:34.396062223Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:34.396318 containerd[1925]: time="2025-10-13T06:54:34.396185038Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Oct 13 06:54:34.396661 containerd[1925]: time="2025-10-13T06:54:34.396619932Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:34.397506 containerd[1925]: time="2025-10-13T06:54:34.397466696Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:34.397921 containerd[1925]: time="2025-10-13T06:54:34.397879266Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id 
\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.643624881s" Oct 13 06:54:34.397921 containerd[1925]: time="2025-10-13T06:54:34.397893721Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Oct 13 06:54:34.398328 containerd[1925]: time="2025-10-13T06:54:34.398278424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Oct 13 06:54:34.398775 containerd[1925]: time="2025-10-13T06:54:34.398756875Z" level=info msg="CreateContainer within sandbox \"fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Oct 13 06:54:34.401358 containerd[1925]: time="2025-10-13T06:54:34.401320055Z" level=info msg="Container 70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:34.403896 containerd[1925]: time="2025-10-13T06:54:34.403854365Z" level=info msg="CreateContainer within sandbox \"fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a\"" Oct 13 06:54:34.404088 containerd[1925]: time="2025-10-13T06:54:34.404043202Z" level=info msg="StartContainer for \"70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a\"" Oct 13 06:54:34.404559 containerd[1925]: time="2025-10-13T06:54:34.404522546Z" level=info msg="connecting to shim 70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a" address="unix:///run/containerd/s/b983ef786babf624e5d05912609a83b15a2f580802bc5375c92c6389865da583" protocol=ttrpc version=3 Oct 13 06:54:34.418913 systemd[1]: Started cri-containerd-70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a.scope - libcontainer container 70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a. Oct 13 06:54:34.444505 containerd[1925]: time="2025-10-13T06:54:34.444484607Z" level=info msg="StartContainer for \"70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a\" returns successfully" Oct 13 06:54:34.559885 kubelet[3292]: I1013 06:54:34.559854 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-75cd76566-6brsl" podStartSLOduration=27.694943657 podStartE2EDuration="33.559844073s" podCreationTimestamp="2025-10-13 06:54:01 +0000 UTC" firstStartedPulling="2025-10-13 06:54:28.533299769 +0000 UTC m=+40.187993815" lastFinishedPulling="2025-10-13 06:54:34.398200183 +0000 UTC m=+46.052894231" observedRunningTime="2025-10-13 06:54:34.559481614 +0000 UTC m=+46.214175667" watchObservedRunningTime="2025-10-13 06:54:34.559844073 +0000 UTC m=+46.214538118" Oct 13 06:54:37.124131 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount247633660.mount: Deactivated successfully. 
Oct 13 06:54:37.324002 containerd[1925]: time="2025-10-13T06:54:37.323948529Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:37.324241 containerd[1925]: time="2025-10-13T06:54:37.324156929Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Oct 13 06:54:37.324520 containerd[1925]: time="2025-10-13T06:54:37.324480759Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:37.325399 containerd[1925]: time="2025-10-13T06:54:37.325359568Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:37.326140 containerd[1925]: time="2025-10-13T06:54:37.326099116Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 2.927806607s" Oct 13 06:54:37.326140 containerd[1925]: time="2025-10-13T06:54:37.326114484Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Oct 13 06:54:37.326567 containerd[1925]: time="2025-10-13T06:54:37.326552114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Oct 13 06:54:37.327026 containerd[1925]: time="2025-10-13T06:54:37.327014395Z" level=info msg="CreateContainer within sandbox \"d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Oct 13 06:54:37.329748 containerd[1925]: time="2025-10-13T06:54:37.329711640Z" level=info msg="Container 3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:37.332337 containerd[1925]: time="2025-10-13T06:54:37.332324458Z" level=info msg="CreateContainer within sandbox \"d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\"" Oct 13 06:54:37.332510 containerd[1925]: time="2025-10-13T06:54:37.332499045Z" level=info msg="StartContainer for \"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\"" Oct 13 06:54:37.333048 containerd[1925]: time="2025-10-13T06:54:37.333034487Z" level=info msg="connecting to shim 3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a" address="unix:///run/containerd/s/e8f4b4505087dc8e1319531d411cfdec04d228451ac8c3c2556a8a6b9466c913" protocol=ttrpc version=3 Oct 13 06:54:37.355794 systemd[1]: Started cri-containerd-3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a.scope - libcontainer container 3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a. 
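Note: the goldmane pull reports both the bytes read (66357526) and the wall time (2.927806607s), which is enough to estimate the effective throughput of that registry fetch; a short sketch of the arithmetic:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Figures copied from the goldmane pull lines above.
	const bytesRead = 66357526
	elapsed, err := time.ParseDuration("2.927806607s")
	if err != nil {
		panic(err)
	}
	fmt.Printf("~%.1f MB/s effective pull throughput\n",
		float64(bytesRead)/elapsed.Seconds()/1e6) // roughly 22.7 MB/s
}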
Oct 13 06:54:37.386030 containerd[1925]: time="2025-10-13T06:54:37.385922979Z" level=info msg="StartContainer for \"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" returns successfully" Oct 13 06:54:37.825701 containerd[1925]: time="2025-10-13T06:54:37.825677960Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:37.825885 containerd[1925]: time="2025-10-13T06:54:37.825871233Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Oct 13 06:54:37.826979 containerd[1925]: time="2025-10-13T06:54:37.826935736Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 500.368738ms" Oct 13 06:54:37.826979 containerd[1925]: time="2025-10-13T06:54:37.826952291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Oct 13 06:54:37.827387 containerd[1925]: time="2025-10-13T06:54:37.827375744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Oct 13 06:54:37.827878 containerd[1925]: time="2025-10-13T06:54:37.827864533Z" level=info msg="CreateContainer within sandbox \"c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Oct 13 06:54:37.830310 containerd[1925]: time="2025-10-13T06:54:37.830297359Z" level=info msg="Container c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:37.832880 containerd[1925]: time="2025-10-13T06:54:37.832865927Z" level=info msg="CreateContainer within sandbox \"c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7\"" Oct 13 06:54:37.833076 containerd[1925]: time="2025-10-13T06:54:37.833028393Z" level=info msg="StartContainer for \"c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7\"" Oct 13 06:54:37.833529 containerd[1925]: time="2025-10-13T06:54:37.833518568Z" level=info msg="connecting to shim c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7" address="unix:///run/containerd/s/164a8c44d0548343e344480497af2f2b6f6a75a6666da2a8a4fb3e7a2e47ae5b" protocol=ttrpc version=3 Oct 13 06:54:37.855981 systemd[1]: Started cri-containerd-c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7.scope - libcontainer container c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7. 
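Note: the second pull of apiserver:v3.30.3 above is an ImageUpdate event with only 77 bytes read and finishes in 500.368738ms, compared with 3.643624881s for the first pull at 06:54:34. The obvious reading, though the log does not say so explicitly, is that the layers were already present locally and only the manifest was re-checked. Comparing the two durations:

package main

import (
	"fmt"
	"time"
)

func main() {
	first, _ := time.ParseDuration("3.643624881s")  // first pull of apiserver:v3.30.3 (06:54:34)
	second, _ := time.ParseDuration("500.368738ms") // second pull of the same image (06:54:37)
	fmt.Printf("second pull took %.0f%% of the time of the first\n",
		100*second.Seconds()/first.Seconds()) // about 14%
}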
Oct 13 06:54:37.908897 containerd[1925]: time="2025-10-13T06:54:37.908852524Z" level=info msg="StartContainer for \"c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7\" returns successfully" Oct 13 06:54:38.570243 kubelet[3292]: I1013 06:54:38.570194 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:54:38.585730 kubelet[3292]: I1013 06:54:38.585614 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-bv4dr" podStartSLOduration=26.874217992 podStartE2EDuration="35.58558236s" podCreationTimestamp="2025-10-13 06:54:03 +0000 UTC" firstStartedPulling="2025-10-13 06:54:28.615124883 +0000 UTC m=+40.269818932" lastFinishedPulling="2025-10-13 06:54:37.326489253 +0000 UTC m=+48.981183300" observedRunningTime="2025-10-13 06:54:37.586333718 +0000 UTC m=+49.241027858" watchObservedRunningTime="2025-10-13 06:54:38.58558236 +0000 UTC m=+50.240276438" Oct 13 06:54:38.985694 kubelet[3292]: I1013 06:54:38.985576 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:54:39.058166 containerd[1925]: time="2025-10-13T06:54:39.058143163Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"6ba0b93be5d3214f19309b9df6a9ff906ebfad85bfbb0c00d1d873ef861d174f\" pid:6603 exited_at:{seconds:1760338479 nanos:58011775}" Oct 13 06:54:39.067475 kubelet[3292]: I1013 06:54:39.067442 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-75cd76566-rw7m9" podStartSLOduration=28.990682199 podStartE2EDuration="38.067431947s" podCreationTimestamp="2025-10-13 06:54:01 +0000 UTC" firstStartedPulling="2025-10-13 06:54:28.750578638 +0000 UTC m=+40.405272685" lastFinishedPulling="2025-10-13 06:54:37.827328385 +0000 UTC m=+49.482022433" observedRunningTime="2025-10-13 06:54:38.585240206 +0000 UTC m=+50.239934294" watchObservedRunningTime="2025-10-13 06:54:39.067431947 +0000 UTC m=+50.722125991" Oct 13 06:54:39.086321 containerd[1925]: time="2025-10-13T06:54:39.086298937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"8262011243ef12b2698d04c0dc3f4849e214fd494f3cad85959ac50515191f14\" pid:6624 exited_at:{seconds:1760338479 nanos:86214633}" Oct 13 06:54:39.393103 containerd[1925]: time="2025-10-13T06:54:39.393048604Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:39.393270 containerd[1925]: time="2025-10-13T06:54:39.393228029Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Oct 13 06:54:39.393632 containerd[1925]: time="2025-10-13T06:54:39.393593723Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:39.394625 containerd[1925]: time="2025-10-13T06:54:39.394582176Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:39.394844 containerd[1925]: time="2025-10-13T06:54:39.394803436Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id 
\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.567414474s" Oct 13 06:54:39.394844 containerd[1925]: time="2025-10-13T06:54:39.394819253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Oct 13 06:54:39.395681 containerd[1925]: time="2025-10-13T06:54:39.395668822Z" level=info msg="CreateContainer within sandbox \"0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Oct 13 06:54:39.399606 containerd[1925]: time="2025-10-13T06:54:39.399562131Z" level=info msg="Container f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:39.403290 containerd[1925]: time="2025-10-13T06:54:39.403246667Z" level=info msg="CreateContainer within sandbox \"0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748\"" Oct 13 06:54:39.403493 containerd[1925]: time="2025-10-13T06:54:39.403453226Z" level=info msg="StartContainer for \"f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748\"" Oct 13 06:54:39.404241 containerd[1925]: time="2025-10-13T06:54:39.404199753Z" level=info msg="connecting to shim f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748" address="unix:///run/containerd/s/c0d26a27f1f1982423180b4d52f171e60e352237dd1bb31fa31ac27615ca1786" protocol=ttrpc version=3 Oct 13 06:54:39.420794 systemd[1]: Started cri-containerd-f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748.scope - libcontainer container f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748. 
Oct 13 06:54:39.439236 containerd[1925]: time="2025-10-13T06:54:39.439216210Z" level=info msg="StartContainer for \"f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748\" returns successfully" Oct 13 06:54:39.439637 containerd[1925]: time="2025-10-13T06:54:39.439625142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Oct 13 06:54:39.578005 kubelet[3292]: I1013 06:54:39.577896 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:54:41.028096 containerd[1925]: time="2025-10-13T06:54:41.028071257Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:41.028372 containerd[1925]: time="2025-10-13T06:54:41.028249122Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Oct 13 06:54:41.028687 containerd[1925]: time="2025-10-13T06:54:41.028674136Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:41.029475 containerd[1925]: time="2025-10-13T06:54:41.029461646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Oct 13 06:54:41.029848 containerd[1925]: time="2025-10-13T06:54:41.029836600Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.590195948s" Oct 13 06:54:41.029880 containerd[1925]: time="2025-10-13T06:54:41.029851955Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Oct 13 06:54:41.030810 containerd[1925]: time="2025-10-13T06:54:41.030796120Z" level=info msg="CreateContainer within sandbox \"0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Oct 13 06:54:41.033641 containerd[1925]: time="2025-10-13T06:54:41.033629341Z" level=info msg="Container 5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e: CDI devices from CRI Config.CDIDevices: []" Oct 13 06:54:41.038259 containerd[1925]: time="2025-10-13T06:54:41.038218058Z" level=info msg="CreateContainer within sandbox \"0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e\"" Oct 13 06:54:41.038496 containerd[1925]: time="2025-10-13T06:54:41.038485072Z" level=info msg="StartContainer for \"5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e\"" Oct 13 06:54:41.039245 containerd[1925]: time="2025-10-13T06:54:41.039234924Z" level=info msg="connecting to shim 5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e" 
address="unix:///run/containerd/s/c0d26a27f1f1982423180b4d52f171e60e352237dd1bb31fa31ac27615ca1786" protocol=ttrpc version=3 Oct 13 06:54:41.058830 systemd[1]: Started cri-containerd-5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e.scope - libcontainer container 5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e. Oct 13 06:54:41.076912 containerd[1925]: time="2025-10-13T06:54:41.076859042Z" level=info msg="StartContainer for \"5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e\" returns successfully" Oct 13 06:54:41.433723 kubelet[3292]: I1013 06:54:41.433633 3292 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Oct 13 06:54:41.434540 kubelet[3292]: I1013 06:54:41.433744 3292 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Oct 13 06:54:41.631914 kubelet[3292]: I1013 06:54:41.631781 3292 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-gbdt9" podStartSLOduration=26.148245788 podStartE2EDuration="37.631726201s" podCreationTimestamp="2025-10-13 06:54:04 +0000 UTC" firstStartedPulling="2025-10-13 06:54:29.546733594 +0000 UTC m=+41.201427641" lastFinishedPulling="2025-10-13 06:54:41.030214009 +0000 UTC m=+52.684908054" observedRunningTime="2025-10-13 06:54:41.6304872 +0000 UTC m=+53.285181320" watchObservedRunningTime="2025-10-13 06:54:41.631726201 +0000 UTC m=+53.286420300" Oct 13 06:54:43.066597 kubelet[3292]: I1013 06:54:43.066483 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:54:43.122795 containerd[1925]: time="2025-10-13T06:54:43.122767209Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"14efe54b970b35c81a47778bafdaa457254e16834cd481834300316ba609b319\" pid:6725 exit_status:1 exited_at:{seconds:1760338483 nanos:122531290}" Oct 13 06:54:43.171363 containerd[1925]: time="2025-10-13T06:54:43.171314907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"f448bb4307b86b0f4eb8351ea6a2869d42ac1ea359d23e97c42a832007fdeb7f\" pid:6756 exit_status:1 exited_at:{seconds:1760338483 nanos:171152070}" Oct 13 06:54:52.580858 containerd[1925]: time="2025-10-13T06:54:52.580776112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"e3cf718f3d9d461fd62ec739980690ee97d5f7356ca3742363a74390ac10b2c3\" pid:6801 exited_at:{seconds:1760338492 nanos:580613514}" Oct 13 06:54:56.331427 containerd[1925]: time="2025-10-13T06:54:56.331393120Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"90340942f2616b3e04361a1759867620b744cc799ecdaf63c9093af89a80cfe9\" pid:6838 exited_at:{seconds:1760338496 nanos:331191402}" Oct 13 06:55:05.293600 containerd[1925]: time="2025-10-13T06:55:05.293575579Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"11de241c51409dfc6997c93c166e3a73fc7a134f98db0113c0ba55c1af6fa51a\" pid:6862 exited_at:{seconds:1760338505 nanos:293389518}" Oct 13 06:55:09.151447 containerd[1925]: 
time="2025-10-13T06:55:09.151374302Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"e36c8b1c3fbc03abd6a9801e7ed6de4b3fff2c4876da627d4f0dd4b5aa831736\" pid:6894 exited_at:{seconds:1760338509 nanos:150920981}" Oct 13 06:55:09.862741 kubelet[3292]: I1013 06:55:09.862642 3292 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 06:55:13.205184 containerd[1925]: time="2025-10-13T06:55:13.205117329Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"daba1a91aa27691c9f8c054cc33d978b1f110c3b2207750f2ee5ae5032b2c407\" pid:6928 exited_at:{seconds:1760338513 nanos:204933070}" Oct 13 06:55:22.614377 containerd[1925]: time="2025-10-13T06:55:22.614343977Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"4f7972b16f97657bed8bf13c80ef95a199094a79073708ee4d3c1e2a280d8ce2\" pid:6962 exited_at:{seconds:1760338522 nanos:614098286}" Oct 13 06:55:39.123053 containerd[1925]: time="2025-10-13T06:55:39.123022403Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"5738e3bb6925524841ecf5c64a59b37c6af99e9e4e1b3ac2618cc693d42d46c7\" pid:6999 exited_at:{seconds:1760338539 nanos:122886042}" Oct 13 06:55:43.247868 containerd[1925]: time="2025-10-13T06:55:43.247842656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"70da3b668beb0ae649ffd090ca32ed0e341aad954b71df99df9edf313b89cc27\" pid:7025 exited_at:{seconds:1760338543 nanos:247638454}" Oct 13 06:55:52.590924 containerd[1925]: time="2025-10-13T06:55:52.590903291Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"169e2e6fcb05933564b64933386faa5dd72379888200371585c5ff376c390d13\" pid:7069 exited_at:{seconds:1760338552 nanos:590651348}" Oct 13 06:55:56.323823 containerd[1925]: time="2025-10-13T06:55:56.323800476Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"f78e978f9146a53c9423ed6b4d2b6b6f92a8cbc1db95cd9c69d4608d9c721f67\" pid:7104 exited_at:{seconds:1760338556 nanos:323683845}" Oct 13 06:56:05.270578 containerd[1925]: time="2025-10-13T06:56:05.270553246Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"b4e8a18e87e5655ae971bccae69de4c903d571900e44c73c04fbc8bce4c48d94\" pid:7134 exited_at:{seconds:1760338565 nanos:270392540}" Oct 13 06:56:09.136380 containerd[1925]: time="2025-10-13T06:56:09.136353636Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"a09a35b51236ee057a246e293fc8c3d49aa0e70bcc12d252ca550e59d3968e2a\" pid:7182 exited_at:{seconds:1760338569 nanos:136200081}" Oct 13 06:56:13.250552 containerd[1925]: time="2025-10-13T06:56:13.250526290Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"d6ef6d02a225bc4925a61d5a2df875a61a084d92a29dda686ba05f129c83be9b\" pid:7205 exited_at:{seconds:1760338573 nanos:250354414}" Oct 13 06:56:22.577308 
containerd[1925]: time="2025-10-13T06:56:22.577280981Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"237cd13904cb23e8a4853bbff8e73d013e03de8c0ed35dae928bed806107243c\" pid:7238 exited_at:{seconds:1760338582 nanos:577098759}" Oct 13 06:56:39.155195 containerd[1925]: time="2025-10-13T06:56:39.155157764Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"6b6a7c5fa338f536f4adae668b198dbc7d32ee340757e91438cf8e9744e6bcfa\" pid:7275 exited_at:{seconds:1760338599 nanos:154948625}" Oct 13 06:56:43.184011 containerd[1925]: time="2025-10-13T06:56:43.183983907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"891f85d5f1901e1366b14aa307d841574eb87da93a1b5204d4b05dda42070560\" pid:7298 exited_at:{seconds:1760338603 nanos:183833165}" Oct 13 06:56:52.580247 containerd[1925]: time="2025-10-13T06:56:52.580202613Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"46baa4007bed9adc4dbfbb0815d08fe189b4c6c527053c0afc2d8cf8508573f2\" pid:7330 exited_at:{seconds:1760338612 nanos:579993410}" Oct 13 06:56:56.321611 containerd[1925]: time="2025-10-13T06:56:56.321578743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"c49a0ebaf753dbe3ad5dcb4d3634abc819e919b675f83f6666d8a98a17011c6a\" pid:7368 exited_at:{seconds:1760338616 nanos:321477041}" Oct 13 06:57:05.267036 containerd[1925]: time="2025-10-13T06:57:05.267013323Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"dbbc7209f18e9785ae11ce149947f31d5abf741a90ed058c52bd2721860a3630\" pid:7390 exited_at:{seconds:1760338625 nanos:266727789}" Oct 13 06:57:09.111165 containerd[1925]: time="2025-10-13T06:57:09.111131000Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"6588b274a131547c531fc3e41229e5a5d171db17b66a6d4b89651f0a15753eb2\" pid:7422 exited_at:{seconds:1760338629 nanos:110964106}" Oct 13 06:57:13.247974 containerd[1925]: time="2025-10-13T06:57:13.247947366Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"f24456eeefa2078aeef3d70af0bfc619590771f0d27d8d072c31813cb518a9f1\" pid:7453 exited_at:{seconds:1760338633 nanos:247742734}" Oct 13 06:57:22.580187 containerd[1925]: time="2025-10-13T06:57:22.580164509Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"eace8a2f805dbc1ad226c6362e76a0a71dc55db45580e24a30d904fb6ac3138a\" pid:7486 exited_at:{seconds:1760338642 nanos:579918622}" Oct 13 06:57:39.105776 containerd[1925]: time="2025-10-13T06:57:39.105747859Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"ada9f6a45ed810f5e9de4bec3e4436aa84eaeaafb25c211b5ed1624467d043d1\" pid:7548 exited_at:{seconds:1760338659 nanos:105602058}" Oct 13 06:57:43.242504 containerd[1925]: time="2025-10-13T06:57:43.242471004Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"6fa02fb8086210cc598379a70e0591c78611451bdd38b5dd305df30e46255ce1\" pid:7568 exited_at:{seconds:1760338663 nanos:242279457}" Oct 13 06:57:52.584571 containerd[1925]: time="2025-10-13T06:57:52.584505185Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"ad189b2a8b2470d802fe5b406725f50275bb10b697275487198e9386d0bf0444\" pid:7602 exited_at:{seconds:1760338672 nanos:584302318}" Oct 13 06:57:56.318490 containerd[1925]: time="2025-10-13T06:57:56.318464103Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"843f9cff5198d4515eb9f2f49ed4c23e22a446dc941ad756064bb62d7d3afba2\" pid:7642 exited_at:{seconds:1760338676 nanos:318352411}" Oct 13 06:58:05.284126 containerd[1925]: time="2025-10-13T06:58:05.284003794Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"351df8e82b2404ce3c687a42efdb7c3017e6466bc2d1127dd622ecb5ed72efbf\" pid:7663 exited_at:{seconds:1760338685 nanos:283313418}" Oct 13 06:58:09.106536 containerd[1925]: time="2025-10-13T06:58:09.106513117Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"9d6a9da3c7c68cf12010638a39c7bacd4bb36807340d74a7f5d7555fe31bbf5b\" pid:7696 exited_at:{seconds:1760338689 nanos:106393552}" Oct 13 06:58:13.186450 containerd[1925]: time="2025-10-13T06:58:13.186425066Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"a0b76c3c4f46f2ac09f629410b945077696baea73b62d85fa70ea842597b79d8\" pid:7718 exited_at:{seconds:1760338693 nanos:186181954}" Oct 13 06:58:22.633575 containerd[1925]: time="2025-10-13T06:58:22.633547202Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"566c8670219124c1f42f35033ac37eb0b5d26047f061714f0ad9db7f54c417a5\" pid:7751 exited_at:{seconds:1760338702 nanos:633354692}" Oct 13 06:58:39.110062 containerd[1925]: time="2025-10-13T06:58:39.110010893Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"95c2d6fbf6841d62044c29a0e01a9ed19b9a8fb60d7c5450d51206e961147397\" pid:7791 exited_at:{seconds:1760338719 nanos:109849968}" Oct 13 06:58:43.189603 containerd[1925]: time="2025-10-13T06:58:43.189478641Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"0cd02e2965c2774c684a26704b063112ed9897bd1fed73ba1d82a85222e32588\" pid:7813 exited_at:{seconds:1760338723 nanos:189076913}" Oct 13 06:58:44.712703 containerd[1925]: time="2025-10-13T06:58:44.712453302Z" level=warning msg="container event discarded" container=f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1 type=CONTAINER_CREATED_EVENT Oct 13 06:58:44.712703 containerd[1925]: time="2025-10-13T06:58:44.712617146Z" level=warning msg="container event discarded" container=f078d8e1d1f60349c59982011462b6193230d410f810f0e9f2a4b484101798c1 type=CONTAINER_STARTED_EVENT Oct 13 06:58:44.729144 containerd[1925]: time="2025-10-13T06:58:44.728980741Z" level=warning msg="container event discarded" 
container=56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17 type=CONTAINER_CREATED_EVENT Oct 13 06:58:44.729144 containerd[1925]: time="2025-10-13T06:58:44.729083489Z" level=warning msg="container event discarded" container=1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf type=CONTAINER_CREATED_EVENT Oct 13 06:58:44.729144 containerd[1925]: time="2025-10-13T06:58:44.729112208Z" level=warning msg="container event discarded" container=1577cd7c0c6924e3ef62ca735e45ee404c4ca9f6d71251bcf1f5ac5738bb22bf type=CONTAINER_STARTED_EVENT Oct 13 06:58:44.729144 containerd[1925]: time="2025-10-13T06:58:44.729131524Z" level=warning msg="container event discarded" container=6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689 type=CONTAINER_CREATED_EVENT Oct 13 06:58:44.729144 containerd[1925]: time="2025-10-13T06:58:44.729151995Z" level=warning msg="container event discarded" container=6d7b123152fa7bedd13d2acb58f9e99f0a128ba6a1528bde2743e5ea9c971689 type=CONTAINER_STARTED_EVENT Oct 13 06:58:44.741847 containerd[1925]: time="2025-10-13T06:58:44.741640695Z" level=warning msg="container event discarded" container=833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058 type=CONTAINER_CREATED_EVENT Oct 13 06:58:44.741847 containerd[1925]: time="2025-10-13T06:58:44.741771327Z" level=warning msg="container event discarded" container=0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01 type=CONTAINER_CREATED_EVENT Oct 13 06:58:44.774339 containerd[1925]: time="2025-10-13T06:58:44.774175142Z" level=warning msg="container event discarded" container=56d304896c43effacbee1bc1854375a10a57d3d7571f4b00a0e6d5a02dd1cd17 type=CONTAINER_STARTED_EVENT Oct 13 06:58:44.774339 containerd[1925]: time="2025-10-13T06:58:44.774285915Z" level=warning msg="container event discarded" container=833501b402e40ce24410ade0ea1062811dcba0271c0c13d72c7ee087d395a058 type=CONTAINER_STARTED_EVENT Oct 13 06:58:44.774339 containerd[1925]: time="2025-10-13T06:58:44.774314408Z" level=warning msg="container event discarded" container=0a68adffc76fb355ac68faa0b983ff9c644c42719cb3883de5e2ad077a857a01 type=CONTAINER_STARTED_EVENT Oct 13 06:58:52.591578 containerd[1925]: time="2025-10-13T06:58:52.591548749Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"6f7cc8dd958df0f0612934c0127a253d5fb26cc591139624caaa3284ad383807\" pid:7849 exited_at:{seconds:1760338732 nanos:591322748}" Oct 13 06:58:54.700421 containerd[1925]: time="2025-10-13T06:58:54.700277923Z" level=warning msg="container event discarded" container=2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6 type=CONTAINER_CREATED_EVENT Oct 13 06:58:54.700421 containerd[1925]: time="2025-10-13T06:58:54.700390636Z" level=warning msg="container event discarded" container=2b7bcf33c1480c5675226c0c130f9643910606b093211ddda38e99e09f64e1b6 type=CONTAINER_STARTED_EVENT Oct 13 06:58:54.700421 containerd[1925]: time="2025-10-13T06:58:54.700420627Z" level=warning msg="container event discarded" container=f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7 type=CONTAINER_CREATED_EVENT Oct 13 06:58:54.741960 containerd[1925]: time="2025-10-13T06:58:54.741827897Z" level=warning msg="container event discarded" container=f92ca9f44dd65d4a2c32a11bb0423f6894a526acc8d9317a86992360cde34bb7 type=CONTAINER_STARTED_EVENT Oct 13 06:58:54.920027 containerd[1925]: time="2025-10-13T06:58:54.919911166Z" level=warning msg="container event discarded" 
container=d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f type=CONTAINER_CREATED_EVENT Oct 13 06:58:54.920027 containerd[1925]: time="2025-10-13T06:58:54.920009876Z" level=warning msg="container event discarded" container=d30ce034e5d6d395ab8f0234a2aba2fa4937927d8204a1ab9d7feeb79be0bc8f type=CONTAINER_STARTED_EVENT Oct 13 06:58:56.328542 containerd[1925]: time="2025-10-13T06:58:56.328513318Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"e0090496afb6a4dee4ca7a18b449cb31148219700b5ac92943246b93030e6e81\" pid:7886 exited_at:{seconds:1760338736 nanos:328386352}" Oct 13 06:58:56.790812 containerd[1925]: time="2025-10-13T06:58:56.790532057Z" level=warning msg="container event discarded" container=45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d type=CONTAINER_CREATED_EVENT Oct 13 06:58:56.826248 containerd[1925]: time="2025-10-13T06:58:56.826077619Z" level=warning msg="container event discarded" container=45562ee5a5ee996d55b30b3ad8d940f165be472980611b029e1d2dc1ca5ab60d type=CONTAINER_STARTED_EVENT Oct 13 06:59:03.869103 containerd[1925]: time="2025-10-13T06:59:03.868820088Z" level=warning msg="container event discarded" container=7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c type=CONTAINER_CREATED_EVENT Oct 13 06:59:03.869103 containerd[1925]: time="2025-10-13T06:59:03.868978040Z" level=warning msg="container event discarded" container=7d3ab56939d85fcf5f392a0c1b2cfd3b8eb9be515ab170b153a63bcdb1e8bc3c type=CONTAINER_STARTED_EVENT Oct 13 06:59:04.177347 containerd[1925]: time="2025-10-13T06:59:04.177045677Z" level=warning msg="container event discarded" container=127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74 type=CONTAINER_CREATED_EVENT Oct 13 06:59:04.177347 containerd[1925]: time="2025-10-13T06:59:04.177154258Z" level=warning msg="container event discarded" container=127254e3572bcb0d801ffd0d5665bd7e5f5398e9270ea4dc08d83a2150684a74 type=CONTAINER_STARTED_EVENT Oct 13 06:59:05.265563 containerd[1925]: time="2025-10-13T06:59:05.265536921Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"0379b251e4ac149cfca41e739f104aad9275ea007ad8e0251fc0ff182b484062\" pid:7908 exited_at:{seconds:1760338745 nanos:265340954}" Oct 13 06:59:06.164483 containerd[1925]: time="2025-10-13T06:59:06.164369968Z" level=warning msg="container event discarded" container=72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6 type=CONTAINER_CREATED_EVENT Oct 13 06:59:06.208990 containerd[1925]: time="2025-10-13T06:59:06.208877500Z" level=warning msg="container event discarded" container=72940bc9253f543a1d93d271c0998c94925bd48e4cd04d16581ca86bedfa3fc6 type=CONTAINER_STARTED_EVENT Oct 13 06:59:07.984368 containerd[1925]: time="2025-10-13T06:59:07.984242898Z" level=warning msg="container event discarded" container=3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788 type=CONTAINER_CREATED_EVENT Oct 13 06:59:08.021903 containerd[1925]: time="2025-10-13T06:59:08.021787410Z" level=warning msg="container event discarded" container=3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788 type=CONTAINER_STARTED_EVENT Oct 13 06:59:08.897371 containerd[1925]: time="2025-10-13T06:59:08.897253381Z" level=warning msg="container event discarded" container=3fc0f38cf311f71dba2a6c5b6ca491ca35c1887810bef7ae80ca2c71cf5fc788 type=CONTAINER_STOPPED_EVENT Oct 13 
06:59:09.107780 containerd[1925]: time="2025-10-13T06:59:09.107749213Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"622972fc8e43f078985e195f9bf56d21181b019c20b4681dbc30f323ebe17933\" pid:7940 exited_at:{seconds:1760338749 nanos:107573682}" Oct 13 06:59:13.130896 containerd[1925]: time="2025-10-13T06:59:13.130718171Z" level=warning msg="container event discarded" container=9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4 type=CONTAINER_CREATED_EVENT Oct 13 06:59:13.168146 containerd[1925]: time="2025-10-13T06:59:13.168105475Z" level=warning msg="container event discarded" container=9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4 type=CONTAINER_STARTED_EVENT Oct 13 06:59:13.190635 containerd[1925]: time="2025-10-13T06:59:13.190610380Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"df77e573eb514eadaa7abc616a40682c3d069c47e8ae4fc9e5b8063d8f4f9ad4\" pid:7968 exited_at:{seconds:1760338753 nanos:190428766}" Oct 13 06:59:14.080006 containerd[1925]: time="2025-10-13T06:59:14.079751260Z" level=warning msg="container event discarded" container=9cac848becb0097c76be8c532521607c4277d346fc11127960fa15682d4936b4 type=CONTAINER_STOPPED_EVENT Oct 13 06:59:20.100226 containerd[1925]: time="2025-10-13T06:59:20.100071176Z" level=warning msg="container event discarded" container=e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac type=CONTAINER_CREATED_EVENT Oct 13 06:59:20.149701 containerd[1925]: time="2025-10-13T06:59:20.149530363Z" level=warning msg="container event discarded" container=e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac type=CONTAINER_STARTED_EVENT Oct 13 06:59:21.084401 containerd[1925]: time="2025-10-13T06:59:21.084247233Z" level=warning msg="container event discarded" container=e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb type=CONTAINER_CREATED_EVENT Oct 13 06:59:21.084401 containerd[1925]: time="2025-10-13T06:59:21.084342303Z" level=warning msg="container event discarded" container=e84bb2e9a85ae922f429bc6a8ca34da81490428956877bccd6f694a1e8b9f9fb type=CONTAINER_STARTED_EVENT Oct 13 06:59:22.577833 containerd[1925]: time="2025-10-13T06:59:22.577781753Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"d6be9d5696e58c42f7908414ba183b957523796bb5ba8e44576e32e431a5cc35\" pid:8017 exited_at:{seconds:1760338762 nanos:577592964}" Oct 13 06:59:23.022630 containerd[1925]: time="2025-10-13T06:59:23.022355446Z" level=warning msg="container event discarded" container=053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524 type=CONTAINER_CREATED_EVENT Oct 13 06:59:23.074891 containerd[1925]: time="2025-10-13T06:59:23.074787558Z" level=warning msg="container event discarded" container=053ca434a587443433d7ce53e5a47245c1093ab319cb1a9c11705cac19f6f524 type=CONTAINER_STARTED_EVENT Oct 13 06:59:25.436269 containerd[1925]: time="2025-10-13T06:59:25.436106613Z" level=warning msg="container event discarded" container=bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7 type=CONTAINER_CREATED_EVENT Oct 13 06:59:25.478757 containerd[1925]: time="2025-10-13T06:59:25.478587987Z" level=warning msg="container event discarded" container=bee85c99fb024a0687618395bc4989ed144b2b17c35a57dbd76207350262bdc7 type=CONTAINER_STARTED_EVENT Oct 13 
06:59:27.535909 containerd[1925]: time="2025-10-13T06:59:27.535681830Z" level=warning msg="container event discarded" container=486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0 type=CONTAINER_CREATED_EVENT Oct 13 06:59:27.535909 containerd[1925]: time="2025-10-13T06:59:27.535831397Z" level=warning msg="container event discarded" container=486ddbb25baa095f1b192fd67c5d9bf9551bc260cf5236d4e724df73b8e044d0 type=CONTAINER_STARTED_EVENT Oct 13 06:59:27.535909 containerd[1925]: time="2025-10-13T06:59:27.535867914Z" level=warning msg="container event discarded" container=8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772 type=CONTAINER_CREATED_EVENT Oct 13 06:59:27.573214 containerd[1925]: time="2025-10-13T06:59:27.573137629Z" level=warning msg="container event discarded" container=8e2921a732bf29409cbb8cd5fe84dcc5edeaeb02ffb10bf938f23f7b4508b772 type=CONTAINER_STARTED_EVENT Oct 13 06:59:27.623696 containerd[1925]: time="2025-10-13T06:59:27.623502557Z" level=warning msg="container event discarded" container=ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0 type=CONTAINER_CREATED_EVENT Oct 13 06:59:27.623696 containerd[1925]: time="2025-10-13T06:59:27.623602035Z" level=warning msg="container event discarded" container=ef3c22da83599379696212e23ee5028ea31eac7dbfd34d5c4cc2212495867ae0 type=CONTAINER_STARTED_EVENT Oct 13 06:59:28.544071 containerd[1925]: time="2025-10-13T06:59:28.543967835Z" level=warning msg="container event discarded" container=fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79 type=CONTAINER_CREATED_EVENT Oct 13 06:59:28.544071 containerd[1925]: time="2025-10-13T06:59:28.544058109Z" level=warning msg="container event discarded" container=fc0d10e6166a46de7eabadcd17993bbe2f2e2b9a7b45611558f7edfa76877a79 type=CONTAINER_STARTED_EVENT Oct 13 06:59:28.625696 containerd[1925]: time="2025-10-13T06:59:28.625492679Z" level=warning msg="container event discarded" container=d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050 type=CONTAINER_CREATED_EVENT Oct 13 06:59:28.625696 containerd[1925]: time="2025-10-13T06:59:28.625615407Z" level=warning msg="container event discarded" container=d2e0bef7b29318091e408e6242a888ec0530f9b8c36ef1dd385564a4fb2c4050 type=CONTAINER_STARTED_EVENT Oct 13 06:59:28.761019 containerd[1925]: time="2025-10-13T06:59:28.760889279Z" level=warning msg="container event discarded" container=c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7 type=CONTAINER_CREATED_EVENT Oct 13 06:59:28.761019 containerd[1925]: time="2025-10-13T06:59:28.761005754Z" level=warning msg="container event discarded" container=c50cc921a2078b0d45cd63d945923e1e3c436e11b65f20c037ad62d29caa50b7 type=CONTAINER_STARTED_EVENT Oct 13 06:59:28.887935 containerd[1925]: time="2025-10-13T06:59:28.887876246Z" level=warning msg="container event discarded" container=44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043 type=CONTAINER_CREATED_EVENT Oct 13 06:59:28.887935 containerd[1925]: time="2025-10-13T06:59:28.887899699Z" level=warning msg="container event discarded" container=44c023decd6ed809d6566101533c24b7794156abb13ddeccd0b7407d484b2043 type=CONTAINER_STARTED_EVENT Oct 13 06:59:28.887935 containerd[1925]: time="2025-10-13T06:59:28.887906262Z" level=warning msg="container event discarded" container=b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff type=CONTAINER_CREATED_EVENT Oct 13 06:59:28.925452 containerd[1925]: time="2025-10-13T06:59:28.925364116Z" level=warning msg="container event discarded" 
container=b38b424f154db2a2915f22ff748cf8dc9f45575f3e56f25e555f2332e5118bff type=CONTAINER_STARTED_EVENT Oct 13 06:59:29.556610 containerd[1925]: time="2025-10-13T06:59:29.556526406Z" level=warning msg="container event discarded" container=0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a type=CONTAINER_CREATED_EVENT Oct 13 06:59:29.556610 containerd[1925]: time="2025-10-13T06:59:29.556573041Z" level=warning msg="container event discarded" container=0b4f1e5260236445f0f2cfc37fea14a25fb80fdf62bfe28ce6960773b652e73a type=CONTAINER_STARTED_EVENT Oct 13 06:59:30.773543 containerd[1925]: time="2025-10-13T06:59:30.773372839Z" level=warning msg="container event discarded" container=9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5 type=CONTAINER_CREATED_EVENT Oct 13 06:59:30.875681 containerd[1925]: time="2025-10-13T06:59:30.875521584Z" level=warning msg="container event discarded" container=9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5 type=CONTAINER_STARTED_EVENT Oct 13 06:59:34.413999 containerd[1925]: time="2025-10-13T06:59:34.413850439Z" level=warning msg="container event discarded" container=70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a type=CONTAINER_CREATED_EVENT Oct 13 06:59:34.454638 containerd[1925]: time="2025-10-13T06:59:34.454497810Z" level=warning msg="container event discarded" container=70923c8ef5cf279fbdc9f1f154519fc62cbdecfc373f1960317c7fa5b7f4079a type=CONTAINER_STARTED_EVENT Oct 13 06:59:37.343162 containerd[1925]: time="2025-10-13T06:59:37.343006138Z" level=warning msg="container event discarded" container=3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a type=CONTAINER_CREATED_EVENT Oct 13 06:59:37.396618 containerd[1925]: time="2025-10-13T06:59:37.396453795Z" level=warning msg="container event discarded" container=3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a type=CONTAINER_STARTED_EVENT Oct 13 06:59:37.843270 containerd[1925]: time="2025-10-13T06:59:37.843085372Z" level=warning msg="container event discarded" container=c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7 type=CONTAINER_CREATED_EVENT Oct 13 06:59:37.918780 containerd[1925]: time="2025-10-13T06:59:37.918588523Z" level=warning msg="container event discarded" container=c6353dad14e7de23acaf7ed24bc9df71241458f945e19359909a839e8a286fd7 type=CONTAINER_STARTED_EVENT Oct 13 06:59:39.147468 containerd[1925]: time="2025-10-13T06:59:39.147434195Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"a2567a96fe8b58fcde618a9a460e5f45351f463c6059d48996904cc29b3650ce\" pid:8054 exited_at:{seconds:1760338779 nanos:147254529}" Oct 13 06:59:39.413103 containerd[1925]: time="2025-10-13T06:59:39.412818620Z" level=warning msg="container event discarded" container=f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748 type=CONTAINER_CREATED_EVENT Oct 13 06:59:39.449392 containerd[1925]: time="2025-10-13T06:59:39.449247852Z" level=warning msg="container event discarded" container=f3233eac1882bf90dbb1ad5e5912f389f922636a4a6a5647c649ee1c5ade3748 type=CONTAINER_STARTED_EVENT Oct 13 06:59:41.048575 containerd[1925]: time="2025-10-13T06:59:41.048366038Z" level=warning msg="container event discarded" container=5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e type=CONTAINER_CREATED_EVENT Oct 13 06:59:41.087075 containerd[1925]: time="2025-10-13T06:59:41.086905011Z" level=warning msg="container event discarded" 
container=5557499f304a496a0a66c6973f2b65fb9916133af3931b60a8d7b7bbfa30947e type=CONTAINER_STARTED_EVENT Oct 13 06:59:43.185504 containerd[1925]: time="2025-10-13T06:59:43.185479781Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"70f6ad6a45ee51d12e34cdbe957129eed5dda2f333041fac9e5761d5e7f944ff\" pid:8078 exited_at:{seconds:1760338783 nanos:185176975}" Oct 13 06:59:52.584091 containerd[1925]: time="2025-10-13T06:59:52.584026722Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"8e2e4aa794d7f068f25c40b974a2040f62d5fbb93fa761536ea2c434c9543895\" pid:8121 exited_at:{seconds:1760338792 nanos:583792414}" Oct 13 06:59:56.323976 containerd[1925]: time="2025-10-13T06:59:56.323946533Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"8fdb3468021897cc13d9d961be75b803ba99cd1e96e02ac97eae8bf554182cf1\" pid:8157 exited_at:{seconds:1760338796 nanos:323807073}" Oct 13 07:00:05.259592 containerd[1925]: time="2025-10-13T07:00:05.259567270Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"2463d2b4a3e160e98c7acf368b6709bfd81c429f52c9352bde7d70b310959c76\" pid:8179 exited_at:{seconds:1760338805 nanos:259367658}" Oct 13 07:00:09.107036 containerd[1925]: time="2025-10-13T07:00:09.107013678Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"942a96eb93ab74f5d0e1f5c0195e9b8339ba881bfc7325570e5158979fe170c8\" pid:8213 exited_at:{seconds:1760338809 nanos:106896197}" Oct 13 07:00:13.186445 containerd[1925]: time="2025-10-13T07:00:13.186419399Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"4e38f1f5f5d10c1515bd5b0121e4d8093237c9b65760c2e564a72431c28a673c\" pid:8234 exited_at:{seconds:1760338813 nanos:186252768}" Oct 13 07:00:21.266162 systemd[1]: Started sshd@9-139.178.94.25:22-147.75.109.163:58800.service - OpenSSH per-connection server daemon (147.75.109.163:58800). Oct 13 07:00:21.371664 sshd[8260]: Accepted publickey for core from 147.75.109.163 port 58800 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:21.372511 sshd-session[8260]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:21.376138 systemd-logind[1915]: New session 12 of user core. Oct 13 07:00:21.392835 systemd[1]: Started session-12.scope - Session 12 of User core. Oct 13 07:00:21.507670 sshd[8263]: Connection closed by 147.75.109.163 port 58800 Oct 13 07:00:21.507838 sshd-session[8260]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:21.509625 systemd[1]: sshd@9-139.178.94.25:22-147.75.109.163:58800.service: Deactivated successfully. Oct 13 07:00:21.510652 systemd[1]: session-12.scope: Deactivated successfully. Oct 13 07:00:21.511432 systemd-logind[1915]: Session 12 logged out. Waiting for processes to exit. Oct 13 07:00:21.512018 systemd-logind[1915]: Removed session 12. 
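
Editorial note (not part of the journal): the containerd entries above, including the ghcr.io/flatcar/calico/node-driver-registrar image pull, the CreateContainer/StartContainer calls, the recurring "TaskExit event in podsandbox handler" lines, and the later "container event discarded" warnings, all appear to reflect traffic on containerd's event bus as the CRI plugin handles it. As a minimal, non-authoritative sketch of how that same stream can be observed with the public containerd Go client, the following program subscribes to the event feed; the socket path /run/containerd/containerd.sock and the "k8s.io" namespace are assumptions based on a default Kubernetes node setup, and the import path assumes a containerd 1.x client library.

package main

import (
	"context"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Assumption: the default containerd socket used on a Kubernetes node.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Assumption: the CRI integration keeps its containers in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Subscribe returns a channel of event envelopes plus an error channel.
	// Topics look like "/tasks/exit", "/images/create", or "/containers/create",
	// which correspond to the event types logged in the journal above.
	envelopes, errs := client.Subscribe(ctx)
	for {
		select {
		case e := <-envelopes:
			log.Printf("ns=%s topic=%s", e.Namespace, e.Topic)
		case err := <-errs:
			log.Fatal(err)
		}
	}
}

One detail worth noting when reading these entries: the exited_at fields in the TaskExit lines are Unix epoch timestamps. For example, seconds:1760338783 in the entry just above corresponds to 2025-10-13 06:59:43 UTC, which matches the wall-clock prefix of that same line.
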
Oct 13 07:00:22.615581 containerd[1925]: time="2025-10-13T07:00:22.615550778Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"85818cf9724c7e154c1cce90502595953c63ef9604cb8757b4d48c01e12670f9\" pid:8303 exited_at:{seconds:1760338822 nanos:615321573}" Oct 13 07:00:26.536403 systemd[1]: Started sshd@10-139.178.94.25:22-147.75.109.163:46012.service - OpenSSH per-connection server daemon (147.75.109.163:46012). Oct 13 07:00:26.643792 sshd[8330]: Accepted publickey for core from 147.75.109.163 port 46012 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:26.644828 sshd-session[8330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:26.648405 systemd-logind[1915]: New session 13 of user core. Oct 13 07:00:26.662919 systemd[1]: Started session-13.scope - Session 13 of User core. Oct 13 07:00:26.801051 sshd[8333]: Connection closed by 147.75.109.163 port 46012 Oct 13 07:00:26.801187 sshd-session[8330]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:26.803160 systemd[1]: sshd@10-139.178.94.25:22-147.75.109.163:46012.service: Deactivated successfully. Oct 13 07:00:26.804240 systemd[1]: session-13.scope: Deactivated successfully. Oct 13 07:00:26.805018 systemd-logind[1915]: Session 13 logged out. Waiting for processes to exit. Oct 13 07:00:26.805590 systemd-logind[1915]: Removed session 13. Oct 13 07:00:31.822132 systemd[1]: Started sshd@11-139.178.94.25:22-147.75.109.163:35012.service - OpenSSH per-connection server daemon (147.75.109.163:35012). Oct 13 07:00:31.869047 sshd[8360]: Accepted publickey for core from 147.75.109.163 port 35012 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:31.869775 sshd-session[8360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:31.872466 systemd-logind[1915]: New session 14 of user core. Oct 13 07:00:31.884824 systemd[1]: Started session-14.scope - Session 14 of User core. Oct 13 07:00:31.972588 sshd[8363]: Connection closed by 147.75.109.163 port 35012 Oct 13 07:00:31.972779 sshd-session[8360]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:31.974623 systemd[1]: sshd@11-139.178.94.25:22-147.75.109.163:35012.service: Deactivated successfully. Oct 13 07:00:31.975635 systemd[1]: session-14.scope: Deactivated successfully. Oct 13 07:00:31.976379 systemd-logind[1915]: Session 14 logged out. Waiting for processes to exit. Oct 13 07:00:31.976964 systemd-logind[1915]: Removed session 14. Oct 13 07:00:36.998006 systemd[1]: Started sshd@12-139.178.94.25:22-147.75.109.163:35016.service - OpenSSH per-connection server daemon (147.75.109.163:35016). Oct 13 07:00:37.091152 sshd[8387]: Accepted publickey for core from 147.75.109.163 port 35016 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:37.091837 sshd-session[8387]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:37.094584 systemd-logind[1915]: New session 15 of user core. Oct 13 07:00:37.111825 systemd[1]: Started session-15.scope - Session 15 of User core. Oct 13 07:00:37.237827 sshd[8390]: Connection closed by 147.75.109.163 port 35016 Oct 13 07:00:37.237982 sshd-session[8387]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:37.239693 systemd[1]: sshd@12-139.178.94.25:22-147.75.109.163:35016.service: Deactivated successfully. 
Oct 13 07:00:37.240801 systemd[1]: session-15.scope: Deactivated successfully. Oct 13 07:00:37.241487 systemd-logind[1915]: Session 15 logged out. Waiting for processes to exit. Oct 13 07:00:37.242159 systemd-logind[1915]: Removed session 15. Oct 13 07:00:39.155872 containerd[1925]: time="2025-10-13T07:00:39.155839751Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"a5a14531bb912eacd451442a508368b6d8fc83a540f0dcb8871221a31ca7f12f\" pid:8426 exited_at:{seconds:1760338839 nanos:155627302}" Oct 13 07:00:42.263380 systemd[1]: Started sshd@13-139.178.94.25:22-147.75.109.163:35896.service - OpenSSH per-connection server daemon (147.75.109.163:35896). Oct 13 07:00:42.308881 sshd[8438]: Accepted publickey for core from 147.75.109.163 port 35896 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:42.309511 sshd-session[8438]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:42.312310 systemd-logind[1915]: New session 16 of user core. Oct 13 07:00:42.332760 systemd[1]: Started session-16.scope - Session 16 of User core. Oct 13 07:00:42.419841 sshd[8441]: Connection closed by 147.75.109.163 port 35896 Oct 13 07:00:42.420024 sshd-session[8438]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:42.435947 systemd[1]: sshd@13-139.178.94.25:22-147.75.109.163:35896.service: Deactivated successfully. Oct 13 07:00:42.436930 systemd[1]: session-16.scope: Deactivated successfully. Oct 13 07:00:42.437427 systemd-logind[1915]: Session 16 logged out. Waiting for processes to exit. Oct 13 07:00:42.438743 systemd[1]: Started sshd@14-139.178.94.25:22-147.75.109.163:35908.service - OpenSSH per-connection server daemon (147.75.109.163:35908). Oct 13 07:00:42.439221 systemd-logind[1915]: Removed session 16. Oct 13 07:00:42.483685 sshd[8467]: Accepted publickey for core from 147.75.109.163 port 35908 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:42.484519 sshd-session[8467]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:42.488004 systemd-logind[1915]: New session 17 of user core. Oct 13 07:00:42.511862 systemd[1]: Started session-17.scope - Session 17 of User core. Oct 13 07:00:42.616961 sshd[8470]: Connection closed by 147.75.109.163 port 35908 Oct 13 07:00:42.617186 sshd-session[8467]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:42.639938 systemd[1]: sshd@14-139.178.94.25:22-147.75.109.163:35908.service: Deactivated successfully. Oct 13 07:00:42.641349 systemd[1]: session-17.scope: Deactivated successfully. Oct 13 07:00:42.642030 systemd-logind[1915]: Session 17 logged out. Waiting for processes to exit. Oct 13 07:00:42.643922 systemd[1]: Started sshd@15-139.178.94.25:22-147.75.109.163:35916.service - OpenSSH per-connection server daemon (147.75.109.163:35916). Oct 13 07:00:42.644637 systemd-logind[1915]: Removed session 17. Oct 13 07:00:42.719678 sshd[8493]: Accepted publickey for core from 147.75.109.163 port 35916 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:42.720897 sshd-session[8493]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:42.725733 systemd-logind[1915]: New session 18 of user core. Oct 13 07:00:42.743894 systemd[1]: Started session-18.scope - Session 18 of User core. 
Oct 13 07:00:42.875577 sshd[8498]: Connection closed by 147.75.109.163 port 35916 Oct 13 07:00:42.875738 sshd-session[8493]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:42.877555 systemd[1]: sshd@15-139.178.94.25:22-147.75.109.163:35916.service: Deactivated successfully. Oct 13 07:00:42.878563 systemd[1]: session-18.scope: Deactivated successfully. Oct 13 07:00:42.879304 systemd-logind[1915]: Session 18 logged out. Waiting for processes to exit. Oct 13 07:00:42.879946 systemd-logind[1915]: Removed session 18. Oct 13 07:00:43.193334 containerd[1925]: time="2025-10-13T07:00:43.193210943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"f7839058f7e89c6d1452c4683b5fb9cc217c2867a2eaa5c71e92596a694b7da6\" pid:8534 exited_at:{seconds:1760338843 nanos:192956286}" Oct 13 07:00:47.899292 systemd[1]: Started sshd@16-139.178.94.25:22-147.75.109.163:35924.service - OpenSSH per-connection server daemon (147.75.109.163:35924). Oct 13 07:00:47.975755 sshd[8555]: Accepted publickey for core from 147.75.109.163 port 35924 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:47.979010 sshd-session[8555]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:47.991237 systemd-logind[1915]: New session 19 of user core. Oct 13 07:00:48.005022 systemd[1]: Started session-19.scope - Session 19 of User core. Oct 13 07:00:48.138819 sshd[8558]: Connection closed by 147.75.109.163 port 35924 Oct 13 07:00:48.138991 sshd-session[8555]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:48.141289 systemd[1]: sshd@16-139.178.94.25:22-147.75.109.163:35924.service: Deactivated successfully. Oct 13 07:00:48.142378 systemd[1]: session-19.scope: Deactivated successfully. Oct 13 07:00:48.142939 systemd-logind[1915]: Session 19 logged out. Waiting for processes to exit. Oct 13 07:00:48.143738 systemd-logind[1915]: Removed session 19. Oct 13 07:00:52.576100 containerd[1925]: time="2025-10-13T07:00:52.576052348Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"4d93d5d1423b671532a097fde3ad8aceb57c411a1d267656aff659c085577d31\" pid:8602 exited_at:{seconds:1760338852 nanos:575868243}" Oct 13 07:00:53.161364 systemd[1]: Started sshd@17-139.178.94.25:22-147.75.109.163:56840.service - OpenSSH per-connection server daemon (147.75.109.163:56840). Oct 13 07:00:53.282688 sshd[8628]: Accepted publickey for core from 147.75.109.163 port 56840 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:53.283683 sshd-session[8628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:53.287607 systemd-logind[1915]: New session 20 of user core. Oct 13 07:00:53.308927 systemd[1]: Started session-20.scope - Session 20 of User core. Oct 13 07:00:53.398049 sshd[8631]: Connection closed by 147.75.109.163 port 56840 Oct 13 07:00:53.398205 sshd-session[8628]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:53.399897 systemd[1]: sshd@17-139.178.94.25:22-147.75.109.163:56840.service: Deactivated successfully. Oct 13 07:00:53.400891 systemd[1]: session-20.scope: Deactivated successfully. Oct 13 07:00:53.401714 systemd-logind[1915]: Session 20 logged out. Waiting for processes to exit. Oct 13 07:00:53.402307 systemd-logind[1915]: Removed session 20. 
Oct 13 07:00:56.365774 containerd[1925]: time="2025-10-13T07:00:56.365743542Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"ad7529acbc404dca1b0d22ab99986f1d0dd6d1282669224b3da720b300509691\" pid:8686 exited_at:{seconds:1760338856 nanos:365516717}" Oct 13 07:00:58.413902 systemd[1]: Started sshd@18-139.178.94.25:22-147.75.109.163:56854.service - OpenSSH per-connection server daemon (147.75.109.163:56854). Oct 13 07:00:58.457844 sshd[8698]: Accepted publickey for core from 147.75.109.163 port 56854 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:00:58.458778 sshd-session[8698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:00:58.462017 systemd-logind[1915]: New session 21 of user core. Oct 13 07:00:58.476096 systemd[1]: Started session-21.scope - Session 21 of User core. Oct 13 07:00:58.634969 sshd[8701]: Connection closed by 147.75.109.163 port 56854 Oct 13 07:00:58.635167 sshd-session[8698]: pam_unix(sshd:session): session closed for user core Oct 13 07:00:58.637078 systemd[1]: sshd@18-139.178.94.25:22-147.75.109.163:56854.service: Deactivated successfully. Oct 13 07:00:58.638232 systemd[1]: session-21.scope: Deactivated successfully. Oct 13 07:00:58.639260 systemd-logind[1915]: Session 21 logged out. Waiting for processes to exit. Oct 13 07:00:58.639918 systemd-logind[1915]: Removed session 21. Oct 13 07:01:03.652564 systemd[1]: Started sshd@19-139.178.94.25:22-147.75.109.163:46182.service - OpenSSH per-connection server daemon (147.75.109.163:46182). Oct 13 07:01:03.701332 sshd[8726]: Accepted publickey for core from 147.75.109.163 port 46182 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:03.701909 sshd-session[8726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:03.704331 systemd-logind[1915]: New session 22 of user core. Oct 13 07:01:03.721918 systemd[1]: Started session-22.scope - Session 22 of User core. Oct 13 07:01:03.807837 sshd[8729]: Connection closed by 147.75.109.163 port 46182 Oct 13 07:01:03.808016 sshd-session[8726]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:03.831565 systemd[1]: sshd@19-139.178.94.25:22-147.75.109.163:46182.service: Deactivated successfully. Oct 13 07:01:03.835823 systemd[1]: session-22.scope: Deactivated successfully. Oct 13 07:01:03.838043 systemd-logind[1915]: Session 22 logged out. Waiting for processes to exit. Oct 13 07:01:03.844432 systemd[1]: Started sshd@20-139.178.94.25:22-147.75.109.163:46188.service - OpenSSH per-connection server daemon (147.75.109.163:46188). Oct 13 07:01:03.846416 systemd-logind[1915]: Removed session 22. Oct 13 07:01:03.937530 sshd[8754]: Accepted publickey for core from 147.75.109.163 port 46188 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:03.938218 sshd-session[8754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:03.941191 systemd-logind[1915]: New session 23 of user core. Oct 13 07:01:03.964923 systemd[1]: Started session-23.scope - Session 23 of User core. Oct 13 07:01:04.253129 sshd[8757]: Connection closed by 147.75.109.163 port 46188 Oct 13 07:01:04.253350 sshd-session[8754]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:04.264281 systemd[1]: sshd@20-139.178.94.25:22-147.75.109.163:46188.service: Deactivated successfully. 
Oct 13 07:01:04.265767 systemd[1]: session-23.scope: Deactivated successfully. Oct 13 07:01:04.266585 systemd-logind[1915]: Session 23 logged out. Waiting for processes to exit. Oct 13 07:01:04.268699 systemd[1]: Started sshd@21-139.178.94.25:22-147.75.109.163:46204.service - OpenSSH per-connection server daemon (147.75.109.163:46204). Oct 13 07:01:04.269337 systemd-logind[1915]: Removed session 23. Oct 13 07:01:04.336230 sshd[8780]: Accepted publickey for core from 147.75.109.163 port 46204 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:04.337440 sshd-session[8780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:04.342570 systemd-logind[1915]: New session 24 of user core. Oct 13 07:01:04.365960 systemd[1]: Started session-24.scope - Session 24 of User core. Oct 13 07:01:05.050622 sshd[8783]: Connection closed by 147.75.109.163 port 46204 Oct 13 07:01:05.050921 sshd-session[8780]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:05.065690 systemd[1]: sshd@21-139.178.94.25:22-147.75.109.163:46204.service: Deactivated successfully. Oct 13 07:01:05.067032 systemd[1]: session-24.scope: Deactivated successfully. Oct 13 07:01:05.067597 systemd-logind[1915]: Session 24 logged out. Waiting for processes to exit. Oct 13 07:01:05.069361 systemd[1]: Started sshd@22-139.178.94.25:22-147.75.109.163:46218.service - OpenSSH per-connection server daemon (147.75.109.163:46218). Oct 13 07:01:05.069880 systemd-logind[1915]: Removed session 24. Oct 13 07:01:05.116750 sshd[8814]: Accepted publickey for core from 147.75.109.163 port 46218 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:05.117563 sshd-session[8814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:05.120956 systemd-logind[1915]: New session 25 of user core. Oct 13 07:01:05.133890 systemd[1]: Started session-25.scope - Session 25 of User core. Oct 13 07:01:05.290091 containerd[1925]: time="2025-10-13T07:01:05.290053339Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"5370fdc53bcd4eea5b956880be751b1b527af258263785e8e366602e0be8476e\" pid:8845 exited_at:{seconds:1760338865 nanos:289877512}" Oct 13 07:01:05.330904 sshd[8818]: Connection closed by 147.75.109.163 port 46218 Oct 13 07:01:05.331083 sshd-session[8814]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:05.345838 systemd[1]: sshd@22-139.178.94.25:22-147.75.109.163:46218.service: Deactivated successfully. Oct 13 07:01:05.346827 systemd[1]: session-25.scope: Deactivated successfully. Oct 13 07:01:05.347361 systemd-logind[1915]: Session 25 logged out. Waiting for processes to exit. Oct 13 07:01:05.348617 systemd[1]: Started sshd@23-139.178.94.25:22-147.75.109.163:46230.service - OpenSSH per-connection server daemon (147.75.109.163:46230). Oct 13 07:01:05.349020 systemd-logind[1915]: Removed session 25. Oct 13 07:01:05.390669 sshd[8876]: Accepted publickey for core from 147.75.109.163 port 46230 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:05.391416 sshd-session[8876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:05.394664 systemd-logind[1915]: New session 26 of user core. Oct 13 07:01:05.411175 systemd[1]: Started session-26.scope - Session 26 of User core. 
Oct 13 07:01:05.558622 sshd[8880]: Connection closed by 147.75.109.163 port 46230 Oct 13 07:01:05.558890 sshd-session[8876]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:05.561299 systemd[1]: sshd@23-139.178.94.25:22-147.75.109.163:46230.service: Deactivated successfully. Oct 13 07:01:05.562416 systemd[1]: session-26.scope: Deactivated successfully. Oct 13 07:01:05.563013 systemd-logind[1915]: Session 26 logged out. Waiting for processes to exit. Oct 13 07:01:05.563607 systemd-logind[1915]: Removed session 26. Oct 13 07:01:09.101983 containerd[1925]: time="2025-10-13T07:01:09.101961966Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9de29d827864c049047c7a02c29e5b8fa77008393a11fd425d43bd7c93658da5\" id:\"17f568ab7409f6fda5c33ca027a73916efe0fda3efb3ffed5fd793b99be0cf28\" pid:8916 exited_at:{seconds:1760338869 nanos:101868638}" Oct 13 07:01:10.573345 systemd[1]: Started sshd@24-139.178.94.25:22-147.75.109.163:46234.service - OpenSSH per-connection server daemon (147.75.109.163:46234). Oct 13 07:01:10.662368 sshd[8927]: Accepted publickey for core from 147.75.109.163 port 46234 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:10.664236 sshd-session[8927]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:10.670909 systemd-logind[1915]: New session 27 of user core. Oct 13 07:01:10.685067 systemd[1]: Started session-27.scope - Session 27 of User core. Oct 13 07:01:10.783515 sshd[8930]: Connection closed by 147.75.109.163 port 46234 Oct 13 07:01:10.783728 sshd-session[8927]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:10.785673 systemd[1]: sshd@24-139.178.94.25:22-147.75.109.163:46234.service: Deactivated successfully. Oct 13 07:01:10.786820 systemd[1]: session-27.scope: Deactivated successfully. Oct 13 07:01:10.787580 systemd-logind[1915]: Session 27 logged out. Waiting for processes to exit. Oct 13 07:01:10.788203 systemd-logind[1915]: Removed session 27. Oct 13 07:01:13.192235 containerd[1925]: time="2025-10-13T07:01:13.192210352Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3f21e1e6ebe4c44652fff7ea93d47a2d1306cdeacfb736c0aa5e92a45bad025a\" id:\"b27326ae9d96e34a4a54eb07c1729218dc44e1adbf6d2c68353cb36266c435b5\" pid:8966 exited_at:{seconds:1760338873 nanos:192045324}" Oct 13 07:01:15.804237 systemd[1]: Started sshd@25-139.178.94.25:22-147.75.109.163:47900.service - OpenSSH per-connection server daemon (147.75.109.163:47900). Oct 13 07:01:15.861874 sshd[8989]: Accepted publickey for core from 147.75.109.163 port 47900 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:15.862610 sshd-session[8989]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:15.865481 systemd-logind[1915]: New session 28 of user core. Oct 13 07:01:15.883926 systemd[1]: Started session-28.scope - Session 28 of User core. Oct 13 07:01:15.968673 sshd[8992]: Connection closed by 147.75.109.163 port 47900 Oct 13 07:01:15.968855 sshd-session[8989]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:15.970858 systemd[1]: sshd@25-139.178.94.25:22-147.75.109.163:47900.service: Deactivated successfully. Oct 13 07:01:15.971788 systemd[1]: session-28.scope: Deactivated successfully. Oct 13 07:01:15.972286 systemd-logind[1915]: Session 28 logged out. Waiting for processes to exit. Oct 13 07:01:15.972850 systemd-logind[1915]: Removed session 28. 
Oct 13 07:01:20.417816 systemd-timesyncd[1877]: Contacted time server 50.218.103.254:123 (0.flatcar.pool.ntp.org). Oct 13 07:01:20.984952 systemd[1]: Started sshd@26-139.178.94.25:22-147.75.109.163:47916.service - OpenSSH per-connection server daemon (147.75.109.163:47916). Oct 13 07:01:21.037129 sshd[9017]: Accepted publickey for core from 147.75.109.163 port 47916 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:21.037987 sshd-session[9017]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:21.040969 systemd-logind[1915]: New session 29 of user core. Oct 13 07:01:21.049903 systemd[1]: Started session-29.scope - Session 29 of User core. Oct 13 07:01:21.134500 sshd[9020]: Connection closed by 147.75.109.163 port 47916 Oct 13 07:01:21.134715 sshd-session[9017]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:21.136521 systemd[1]: sshd@26-139.178.94.25:22-147.75.109.163:47916.service: Deactivated successfully. Oct 13 07:01:21.137520 systemd[1]: session-29.scope: Deactivated successfully. Oct 13 07:01:21.138255 systemd-logind[1915]: Session 29 logged out. Waiting for processes to exit. Oct 13 07:01:21.138894 systemd-logind[1915]: Removed session 29. Oct 13 07:01:22.582073 containerd[1925]: time="2025-10-13T07:01:22.582031927Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e23b7a59bfff307205d7eeb541cf0a92278cfb91f3e26883273d1e021be628ac\" id:\"16839a12afc714dbcf252b7af0b45f533749de6bdb2d50ddb1b23d7dc7d0778a\" pid:9055 exited_at:{seconds:1760338882 nanos:581802833}" Oct 13 07:01:26.148613 systemd[1]: Started sshd@27-139.178.94.25:22-147.75.109.163:50532.service - OpenSSH per-connection server daemon (147.75.109.163:50532). Oct 13 07:01:26.182792 sshd[9081]: Accepted publickey for core from 147.75.109.163 port 50532 ssh2: RSA SHA256:lNdrIynqbel7rjCycawM5qnMkHHZ4OL4/jrt2P4buCw Oct 13 07:01:26.183467 sshd-session[9081]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Oct 13 07:01:26.186100 systemd-logind[1915]: New session 30 of user core. Oct 13 07:01:26.194836 systemd[1]: Started session-30.scope - Session 30 of User core. Oct 13 07:01:26.274499 sshd[9084]: Connection closed by 147.75.109.163 port 50532 Oct 13 07:01:26.275970 sshd-session[9081]: pam_unix(sshd:session): session closed for user core Oct 13 07:01:26.278956 systemd[1]: sshd@27-139.178.94.25:22-147.75.109.163:50532.service: Deactivated successfully. Oct 13 07:01:26.280586 systemd[1]: session-30.scope: Deactivated successfully. Oct 13 07:01:26.283592 systemd-logind[1915]: Session 30 logged out. Waiting for processes to exit. Oct 13 07:01:26.284521 systemd-logind[1915]: Removed session 30.
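
Editorial note (not part of the journal): the tail of this section is dominated by SSH session churn, with sshd and systemd-logind opening and closing sessions 12 through 30 within seconds of each other, consistent with repeated automated logins from 147.75.109.163. As a rough, non-authoritative sketch of how these entries could be followed programmatically on a systemd host, the following Go program uses the go-systemd sdjournal bindings (which require cgo and libsystemd); the SYSLOG_IDENTIFIER values sshd and sshd-session are assumptions taken from the identifiers visible in the log above.

package main

import (
	"fmt"
	"log"
	"time"

	"github.com/coreos/go-systemd/v22/sdjournal"
)

func main() {
	j, err := sdjournal.NewJournal()
	if err != nil {
		log.Fatal(err)
	}
	defer j.Close()

	// Match entries from sshd and sshd-session, the identifiers seen in the log above.
	if err := j.AddMatch("SYSLOG_IDENTIFIER=sshd"); err != nil {
		log.Fatal(err)
	}
	if err := j.AddDisjunction(); err != nil {
		log.Fatal(err)
	}
	if err := j.AddMatch("SYSLOG_IDENTIFIER=sshd-session"); err != nil {
		log.Fatal(err)
	}

	// Start at the end of the journal and follow new entries, roughly like
	// a filtered "journalctl -f".
	if err := j.SeekTail(); err != nil {
		log.Fatal(err)
	}
	for {
		n, err := j.Next()
		if err != nil {
			log.Fatal(err)
		}
		if n == 0 {
			j.Wait(time.Second) // block briefly for new entries, then retry
			continue
		}
		entry, err := j.GetEntry()
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(entry.Fields["MESSAGE"])
	}
}
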