Sep 9 05:42:38.596906 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 03:39:34 -00 2025
Sep 9 05:42:38.596967 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 05:42:38.596983 kernel: BIOS-provided physical RAM map:
Sep 9 05:42:38.596995 kernel: BIOS-e820: [mem 0x0000000000000000-0x0000000000000fff] reserved
Sep 9 05:42:38.597007 kernel: BIOS-e820: [mem 0x0000000000001000-0x0000000000054fff] usable
Sep 9 05:42:38.597019 kernel: BIOS-e820: [mem 0x0000000000055000-0x000000000005ffff] reserved
Sep 9 05:42:38.597040 kernel: BIOS-e820: [mem 0x0000000000060000-0x0000000000097fff] usable
Sep 9 05:42:38.597054 kernel: BIOS-e820: [mem 0x0000000000098000-0x000000000009ffff] reserved
Sep 9 05:42:38.597068 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bd329fff] usable
Sep 9 05:42:38.597081 kernel: BIOS-e820: [mem 0x00000000bd32a000-0x00000000bd331fff] ACPI data
Sep 9 05:42:38.597096 kernel: BIOS-e820: [mem 0x00000000bd332000-0x00000000bf8ecfff] usable
Sep 9 05:42:38.597110 kernel: BIOS-e820: [mem 0x00000000bf8ed000-0x00000000bfb6cfff] reserved
Sep 9 05:42:38.597124 kernel: BIOS-e820: [mem 0x00000000bfb6d000-0x00000000bfb7efff] ACPI data
Sep 9 05:42:38.597140 kernel: BIOS-e820: [mem 0x00000000bfb7f000-0x00000000bfbfefff] ACPI NVS
Sep 9 05:42:38.597162 kernel: BIOS-e820: [mem 0x00000000bfbff000-0x00000000bffdffff] usable
Sep 9 05:42:38.597179 kernel: BIOS-e820: [mem 0x00000000bffe0000-0x00000000bfffffff] reserved
Sep 9 05:42:38.597195 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000021fffffff] usable
Sep 9 05:42:38.597212 kernel: NX (Execute Disable) protection: active
Sep 9 05:42:38.597226 kernel: APIC: Static calls initialized
Sep 9 05:42:38.597241 kernel: efi: EFI v2.7 by EDK II
Sep 9 05:42:38.597256 kernel: efi: TPMFinalLog=0xbfbf7000 ACPI=0xbfb7e000 ACPI 2.0=0xbfb7e014 SMBIOS=0xbf9e8000 RNG=0xbfb73018 TPMEventLog=0xbd32a018
Sep 9 05:42:38.597271 kernel: random: crng init done
Sep 9 05:42:38.597290 kernel: secureboot: Secure boot disabled
Sep 9 05:42:38.597305 kernel: SMBIOS 2.4 present.
Sep 9 05:42:38.597330 kernel: DMI: Google Google Compute Engine/Google Compute Engine, BIOS Google 08/14/2025
Sep 9 05:42:38.597348 kernel: DMI: Memory slots populated: 1/1
Sep 9 05:42:38.597365 kernel: Hypervisor detected: KVM
Sep 9 05:42:38.597380 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 9 05:42:38.597395 kernel: kvm-clock: using sched offset of 14527267781 cycles
Sep 9 05:42:38.597411 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 05:42:38.597426 kernel: tsc: Detected 2299.998 MHz processor
Sep 9 05:42:38.597446 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 9 05:42:38.597466 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 9 05:42:38.597481 kernel: last_pfn = 0x220000 max_arch_pfn = 0x400000000
Sep 9 05:42:38.597498 kernel: MTRR map: 3 entries (2 fixed + 1 variable; max 18), built from 8 variable MTRRs
Sep 9 05:42:38.597513 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 9 05:42:38.597528 kernel: last_pfn = 0xbffe0 max_arch_pfn = 0x400000000
Sep 9 05:42:38.597544 kernel: Using GB pages for direct mapping
Sep 9 05:42:38.597560 kernel: ACPI: Early table checksum verification disabled
Sep 9 05:42:38.597577 kernel: ACPI: RSDP 0x00000000BFB7E014 000024 (v02 Google)
Sep 9 05:42:38.597603 kernel: ACPI: XSDT 0x00000000BFB7D0E8 00005C (v01 Google GOOGFACP 00000001 01000013)
Sep 9 05:42:38.597621 kernel: ACPI: FACP 0x00000000BFB78000 0000F4 (v02 Google GOOGFACP 00000001 GOOG 00000001)
Sep 9 05:42:38.597638 kernel: ACPI: DSDT 0x00000000BFB79000 001A64 (v01 Google GOOGDSDT 00000001 GOOG 00000001)
Sep 9 05:42:38.597654 kernel: ACPI: FACS 0x00000000BFBF2000 000040
Sep 9 05:42:38.597671 kernel: ACPI: SSDT 0x00000000BFB7C000 000316 (v02 GOOGLE Tpm2Tabl 00001000 INTL 20250404)
Sep 9 05:42:38.597687 kernel: ACPI: TPM2 0x00000000BFB7B000 000034 (v04 GOOGLE 00000001 GOOG 00000001)
Sep 9 05:42:38.597706 kernel: ACPI: SRAT 0x00000000BFB77000 0000C8 (v03 Google GOOGSRAT 00000001 GOOG 00000001)
Sep 9 05:42:38.597723 kernel: ACPI: APIC 0x00000000BFB76000 000076 (v05 Google GOOGAPIC 00000001 GOOG 00000001)
Sep 9 05:42:38.597740 kernel: ACPI: SSDT 0x00000000BFB75000 000980 (v01 Google GOOGSSDT 00000001 GOOG 00000001)
Sep 9 05:42:38.597758 kernel: ACPI: WAET 0x00000000BFB74000 000028 (v01 Google GOOGWAET 00000001 GOOG 00000001)
Sep 9 05:42:38.597775 kernel: ACPI: Reserving FACP table memory at [mem 0xbfb78000-0xbfb780f3]
Sep 9 05:42:38.597793 kernel: ACPI: Reserving DSDT table memory at [mem 0xbfb79000-0xbfb7aa63]
Sep 9 05:42:38.597811 kernel: ACPI: Reserving FACS table memory at [mem 0xbfbf2000-0xbfbf203f]
Sep 9 05:42:38.597828 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb7c000-0xbfb7c315]
Sep 9 05:42:38.597842 kernel: ACPI: Reserving TPM2 table memory at [mem 0xbfb7b000-0xbfb7b033]
Sep 9 05:42:38.597862 kernel: ACPI: Reserving SRAT table memory at [mem 0xbfb77000-0xbfb770c7]
Sep 9 05:42:38.597878 kernel: ACPI: Reserving APIC table memory at [mem 0xbfb76000-0xbfb76075]
Sep 9 05:42:38.597896 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb75000-0xbfb7597f]
Sep 9 05:42:38.597911 kernel: ACPI: Reserving WAET table memory at [mem 0xbfb74000-0xbfb74027]
Sep 9 05:42:38.597927 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 9 05:42:38.597966 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0xbfffffff]
Sep 9 05:42:38.597984 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x21fffffff]
Sep 9 05:42:38.598001 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0xbfffffff] -> [mem 0x00001000-0xbfffffff]
Sep 9 05:42:38.598019 kernel: NUMA: Node 0 [mem 0x00001000-0xbfffffff] + [mem 0x100000000-0x21fffffff] -> [mem 0x00001000-0x21fffffff]
Sep 9 05:42:38.598042 kernel: NODE_DATA(0) allocated [mem 0x21fff6dc0-0x21fffdfff]
Sep 9 05:42:38.598059 kernel: Zone ranges:
Sep 9 05:42:38.598074 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 9 05:42:38.598091 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Sep 9 05:42:38.598107 kernel: Normal [mem 0x0000000100000000-0x000000021fffffff]
Sep 9 05:42:38.598124 kernel: Device empty
Sep 9 05:42:38.598142 kernel: Movable zone start for each node
Sep 9 05:42:38.598158 kernel: Early memory node ranges
Sep 9 05:42:38.598175 kernel: node 0: [mem 0x0000000000001000-0x0000000000054fff]
Sep 9 05:42:38.598196 kernel: node 0: [mem 0x0000000000060000-0x0000000000097fff]
Sep 9 05:42:38.598214 kernel: node 0: [mem 0x0000000000100000-0x00000000bd329fff]
Sep 9 05:42:38.598231 kernel: node 0: [mem 0x00000000bd332000-0x00000000bf8ecfff]
Sep 9 05:42:38.598247 kernel: node 0: [mem 0x00000000bfbff000-0x00000000bffdffff]
Sep 9 05:42:38.598263 kernel: node 0: [mem 0x0000000100000000-0x000000021fffffff]
Sep 9 05:42:38.598279 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000021fffffff]
Sep 9 05:42:38.598296 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 9 05:42:38.598321 kernel: On node 0, zone DMA: 11 pages in unavailable ranges
Sep 9 05:42:38.598339 kernel: On node 0, zone DMA: 104 pages in unavailable ranges
Sep 9 05:42:38.598357 kernel: On node 0, zone DMA32: 8 pages in unavailable ranges
Sep 9 05:42:38.598379 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Sep 9 05:42:38.598396 kernel: On node 0, zone Normal: 32 pages in unavailable ranges
Sep 9 05:42:38.598413 kernel: ACPI: PM-Timer IO Port: 0xb008
Sep 9 05:42:38.598430 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 9 05:42:38.598447 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 9 05:42:38.598464 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 9 05:42:38.598486 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 9 05:42:38.598504 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 9 05:42:38.598522 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 9 05:42:38.598544 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 9 05:42:38.598561 kernel: CPU topo: Max. logical packages: 1
Sep 9 05:42:38.598578 kernel: CPU topo: Max. logical dies: 1
Sep 9 05:42:38.598594 kernel: CPU topo: Max. dies per package: 1
Sep 9 05:42:38.598611 kernel: CPU topo: Max. threads per core: 2
Sep 9 05:42:38.598628 kernel: CPU topo: Num. cores per package: 1
Sep 9 05:42:38.598645 kernel: CPU topo: Num. threads per package: 2
Sep 9 05:42:38.598661 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 9 05:42:38.598679 kernel: [mem 0xc0000000-0xffffffff] available for PCI devices
Sep 9 05:42:38.598701 kernel: Booting paravirtualized kernel on KVM
Sep 9 05:42:38.598719 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 9 05:42:38.598736 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 9 05:42:38.598753 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 9 05:42:38.598770 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 9 05:42:38.598792 kernel: pcpu-alloc: [0] 0 1
Sep 9 05:42:38.598809 kernel: kvm-guest: PV spinlocks enabled
Sep 9 05:42:38.598826 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 9 05:42:38.598844 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 05:42:38.598867 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 05:42:38.598884 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 9 05:42:38.598901 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 05:42:38.598918 kernel: Fallback order for Node 0: 0
Sep 9 05:42:38.598936 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1965138
Sep 9 05:42:38.598977 kernel: Policy zone: Normal
Sep 9 05:42:38.598995 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 05:42:38.599013 kernel: software IO TLB: area num 2.
Sep 9 05:42:38.599048 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 9 05:42:38.599066 kernel: Kernel/User page tables isolation: enabled
Sep 9 05:42:38.599084 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 9 05:42:38.599107 kernel: ftrace: allocated 157 pages with 5 groups
Sep 9 05:42:38.599125 kernel: Dynamic Preempt: voluntary
Sep 9 05:42:38.599144 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 05:42:38.599163 kernel: rcu: RCU event tracing is enabled.
Sep 9 05:42:38.599182 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 9 05:42:38.599205 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 05:42:38.599223 kernel: Rude variant of Tasks RCU enabled.
Sep 9 05:42:38.599242 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 05:42:38.599260 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 05:42:38.599279 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 9 05:42:38.599298 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:42:38.599325 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:42:38.599344 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:42:38.599363 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 9 05:42:38.599385 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 05:42:38.599403 kernel: Console: colour dummy device 80x25
Sep 9 05:42:38.599422 kernel: printk: legacy console [ttyS0] enabled
Sep 9 05:42:38.599440 kernel: ACPI: Core revision 20240827
Sep 9 05:42:38.599456 kernel: APIC: Switch to symmetric I/O mode setup
Sep 9 05:42:38.599473 kernel: x2apic enabled
Sep 9 05:42:38.599491 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 9 05:42:38.599509 kernel: ..TIMER: vector=0x30 apic1=0 pin1=0 apic2=-1 pin2=-1
Sep 9 05:42:38.599526 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Sep 9 05:42:38.599548 kernel: Calibrating delay loop (skipped) preset value.. 4599.99 BogoMIPS (lpj=2299998)
Sep 9 05:42:38.599565 kernel: Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 1024
Sep 9 05:42:38.599584 kernel: Last level dTLB entries: 4KB 1024, 2MB 1024, 4MB 1024, 1GB 4
Sep 9 05:42:38.599603 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 9 05:42:38.599621 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 9 05:42:38.599639 kernel: Spectre V2 : Mitigation: IBRS
Sep 9 05:42:38.599657 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 9 05:42:38.599675 kernel: RETBleed: Mitigation: IBRS
Sep 9 05:42:38.599694 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 9 05:42:38.599716 kernel: Spectre V2 : User space: Mitigation: STIBP via prctl
Sep 9 05:42:38.599734 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 9 05:42:38.599751 kernel: MDS: Mitigation: Clear CPU buffers
Sep 9 05:42:38.599768 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 9 05:42:38.599785 kernel: active return thunk: its_return_thunk
Sep 9 05:42:38.599802 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 9 05:42:38.599818 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 9 05:42:38.599837 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 9 05:42:38.599858 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 9 05:42:38.599876 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 9 05:42:38.599893 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Sep 9 05:42:38.599910 kernel: Freeing SMP alternatives memory: 32K
Sep 9 05:42:38.599928 kernel: pid_max: default: 32768 minimum: 301
Sep 9 05:42:38.599965 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 05:42:38.599982 kernel: landlock: Up and running.
Sep 9 05:42:38.599999 kernel: SELinux: Initializing.
Sep 9 05:42:38.600016 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 9 05:42:38.600038 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 9 05:42:38.600057 kernel: smpboot: CPU0: Intel(R) Xeon(R) CPU @ 2.30GHz (family: 0x6, model: 0x3f, stepping: 0x0)
Sep 9 05:42:38.600075 kernel: Performance Events: unsupported p6 CPU model 63 no PMU driver, software events only.
Sep 9 05:42:38.600093 kernel: signal: max sigframe size: 1776
Sep 9 05:42:38.600111 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 05:42:38.600131 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 05:42:38.600149 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 05:42:38.600168 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 9 05:42:38.600186 kernel: smp: Bringing up secondary CPUs ...
Sep 9 05:42:38.600209 kernel: smpboot: x86: Booting SMP configuration:
Sep 9 05:42:38.600227 kernel: .... node #0, CPUs: #1
Sep 9 05:42:38.600247 kernel: Transient Scheduler Attacks: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details.
Sep 9 05:42:38.600266 kernel: Transient Scheduler Attacks: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Sep 9 05:42:38.600284 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 05:42:38.600303 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS)
Sep 9 05:42:38.600332 kernel: Memory: 7564264K/7860552K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 290716K reserved, 0K cma-reserved)
Sep 9 05:42:38.600351 kernel: devtmpfs: initialized
Sep 9 05:42:38.600373 kernel: x86/mm: Memory block size: 128MB
Sep 9 05:42:38.600392 kernel: ACPI: PM: Registering ACPI NVS region [mem 0xbfb7f000-0xbfbfefff] (524288 bytes)
Sep 9 05:42:38.600412 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 05:42:38.600430 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 9 05:42:38.600448 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 05:42:38.600465 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 05:42:38.600482 kernel: audit: initializing netlink subsys (disabled)
Sep 9 05:42:38.600501 kernel: audit: type=2000 audit(1757396554.253:1): state=initialized audit_enabled=0 res=1
Sep 9 05:42:38.600518 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 05:42:38.600539 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 9 05:42:38.600557 kernel: cpuidle: using governor menu
Sep 9 05:42:38.600574 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 05:42:38.600590 kernel: dca service started, version 1.12.1
Sep 9 05:42:38.600607 kernel: PCI: Using configuration type 1 for base access
Sep 9 05:42:38.600624 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 9 05:42:38.600641 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 05:42:38.600658 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 05:42:38.600675 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 05:42:38.600696 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 05:42:38.600714 kernel: ACPI: Added _OSI(Module Device)
Sep 9 05:42:38.600732 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 05:42:38.600751 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 05:42:38.600770 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded
Sep 9 05:42:38.600787 kernel: ACPI: Interpreter enabled
Sep 9 05:42:38.600805 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 9 05:42:38.600822 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 9 05:42:38.600839 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 9 05:42:38.600861 kernel: PCI: Ignoring E820 reservations for host bridge windows
Sep 9 05:42:38.600879 kernel: ACPI: Enabled 16 GPEs in block 00 to 0F
Sep 9 05:42:38.600897 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 05:42:38.601170 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 05:42:38.601373 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Sep 9 05:42:38.601554 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Sep 9 05:42:38.601578 kernel: PCI host bridge to bus 0000:00
Sep 9 05:42:38.601775 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 9 05:42:38.601973 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 9 05:42:38.602146 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 9 05:42:38.602303 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfefff window]
Sep 9 05:42:38.602474 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 05:42:38.602672 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 9 05:42:38.602870 kernel: pci 0000:00:01.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Sep 9 05:42:38.603097 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 9 05:42:38.603291 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI
Sep 9 05:42:38.603506 kernel: pci 0000:00:03.0: [1af4:1004] type 00 class 0x000000 conventional PCI endpoint
Sep 9 05:42:38.603700 kernel: pci 0000:00:03.0: BAR 0 [io 0xc040-0xc07f]
Sep 9 05:42:38.603894 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc0001000-0xc000107f]
Sep 9 05:42:38.606193 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 9 05:42:38.606412 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc03f]
Sep 9 05:42:38.606594 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc0000000-0xc000007f]
Sep 9 05:42:38.606785 kernel: pci 0000:00:05.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 9 05:42:38.606981 kernel: pci 0000:00:05.0: BAR 0 [io 0xc080-0xc09f]
Sep 9 05:42:38.607157 kernel: pci 0000:00:05.0: BAR 1 [mem 0xc0002000-0xc000203f]
Sep 9 05:42:38.607179 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 9 05:42:38.607198 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 9 05:42:38.607220 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 9 05:42:38.607238 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 9 05:42:38.607255 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 9 05:42:38.607273 kernel: iommu: Default domain type: Translated
Sep 9 05:42:38.607291 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 9 05:42:38.607308 kernel: efivars: Registered efivars operations
Sep 9 05:42:38.607333 kernel: PCI: Using ACPI for IRQ routing
Sep 9 05:42:38.607351 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 9 05:42:38.607368 kernel: e820: reserve RAM buffer [mem 0x00055000-0x0005ffff]
Sep 9 05:42:38.607389 kernel: e820: reserve RAM buffer [mem 0x00098000-0x0009ffff]
Sep 9 05:42:38.607406 kernel: e820: reserve RAM buffer [mem 0xbd32a000-0xbfffffff]
Sep 9 05:42:38.607424 kernel: e820: reserve RAM buffer [mem 0xbf8ed000-0xbfffffff]
Sep 9 05:42:38.607440 kernel: e820: reserve RAM buffer [mem 0xbffe0000-0xbfffffff]
Sep 9 05:42:38.607457 kernel: vgaarb: loaded
Sep 9 05:42:38.607475 kernel: clocksource: Switched to clocksource kvm-clock
Sep 9 05:42:38.607493 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 05:42:38.607511 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 05:42:38.607529 kernel: pnp: PnP ACPI init
Sep 9 05:42:38.607546 kernel: pnp: PnP ACPI: found 7 devices
Sep 9 05:42:38.607568 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 9 05:42:38.607586 kernel: NET: Registered PF_INET protocol family
Sep 9 05:42:38.607603 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 9 05:42:38.607621 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 9 05:42:38.607639 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 05:42:38.607656 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 9 05:42:38.607674 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Sep 9 05:42:38.607692 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 9 05:42:38.607709 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 9 05:42:38.607731 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 9 05:42:38.607748 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 05:42:38.607766 kernel: NET: Registered PF_XDP protocol family
Sep 9 05:42:38.607932 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 9 05:42:38.610171 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 9 05:42:38.610349 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 9 05:42:38.610510 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfefff window]
Sep 9 05:42:38.610696 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 9 05:42:38.610726 kernel: PCI: CLS 0 bytes, default 64
Sep 9 05:42:38.610745 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 9 05:42:38.610762 kernel: software IO TLB: mapped [mem 0x00000000b7f7f000-0x00000000bbf7f000] (64MB)
Sep 9 05:42:38.610781 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Sep 9 05:42:38.610799 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Sep 9 05:42:38.610818 kernel: clocksource: Switched to clocksource tsc
Sep 9 05:42:38.610836 kernel: Initialise system trusted keyrings
Sep 9 05:42:38.610853 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Sep 9 05:42:38.610875 kernel: Key type asymmetric registered
Sep 9 05:42:38.610892 kernel: Asymmetric key parser 'x509' registered
Sep 9 05:42:38.610910 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 9 05:42:38.610928 kernel: io scheduler mq-deadline registered
Sep 9 05:42:38.610961 kernel: io scheduler kyber registered
Sep 9 05:42:38.610979 kernel: io scheduler bfq registered
Sep 9 05:42:38.610997 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 9 05:42:38.611016 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 9 05:42:38.611198 kernel: virtio-pci 0000:00:03.0: virtio_pci: leaving for legacy driver
Sep 9 05:42:38.611225 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 10
Sep 9 05:42:38.611412 kernel: virtio-pci 0000:00:04.0: virtio_pci: leaving for legacy driver
Sep 9 05:42:38.611435 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 9 05:42:38.611614 kernel: virtio-pci 0000:00:05.0: virtio_pci: leaving for legacy driver
Sep 9 05:42:38.611638 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 9 05:42:38.611655 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 9 05:42:38.611674 kernel: 00:04: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Sep 9 05:42:38.611692 kernel: 00:05: ttyS2 at I/O 0x3e8 (irq = 6, base_baud = 115200) is a 16550A
Sep 9 05:42:38.611715 kernel: 00:06: ttyS3 at I/O 0x2e8 (irq = 7, base_baud = 115200) is a 16550A
Sep 9 05:42:38.611901 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x9009, rev-id 0)
Sep 9 05:42:38.611926 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 9 05:42:38.611959 kernel: i8042: Warning: Keylock active
Sep 9 05:42:38.611977 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 9 05:42:38.611995 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 9 05:42:38.612172 kernel: rtc_cmos 00:00: RTC can wake from S4
Sep 9 05:42:38.612348 kernel: rtc_cmos 00:00: registered as rtc0
Sep 9 05:42:38.612517 kernel: rtc_cmos 00:00: setting system clock to 2025-09-09T05:42:37 UTC (1757396557)
Sep 9 05:42:38.612680 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram
Sep 9 05:42:38.612702 kernel: intel_pstate: CPU model not supported
Sep 9 05:42:38.612720 kernel: pstore: Using crash dump compression: deflate
Sep 9 05:42:38.612738 kernel: pstore: Registered efi_pstore as persistent store backend
Sep 9 05:42:38.612756 kernel: NET: Registered PF_INET6 protocol family
Sep 9 05:42:38.612773 kernel: Segment Routing with IPv6
Sep 9 05:42:38.612791 kernel: In-situ OAM (IOAM) with IPv6
Sep 9 05:42:38.612813 kernel: NET: Registered PF_PACKET protocol family
Sep 9 05:42:38.612831 kernel: Key type dns_resolver registered
Sep 9 05:42:38.612849 kernel: IPI shorthand broadcast: enabled
Sep 9 05:42:38.612867 kernel: sched_clock: Marking stable (3466004026, 148088223)->(3652730595, -38638346)
Sep 9 05:42:38.612885 kernel: registered taskstats version 1
Sep 9 05:42:38.612902 kernel: Loading compiled-in X.509 certificates
Sep 9 05:42:38.612920 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 884b9ad6a330f59ae6e6488b20a5491e41ff24a3'
Sep 9 05:42:38.612937 kernel: Demotion targets for Node 0: null
Sep 9 05:42:38.614995 kernel: Key type .fscrypt registered
Sep 9 05:42:38.615016 kernel: Key type fscrypt-provisioning registered
Sep 9 05:42:38.615042 kernel: ima: Allocated hash algorithm: sha1
Sep 9 05:42:38.615061 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 9 05:42:38.615080 kernel: ima: No architecture policies found
Sep 9 05:42:38.615099 kernel: clk: Disabling unused clocks
Sep 9 05:42:38.615118 kernel: Warning: unable to open an initial console.
Sep 9 05:42:38.615137 kernel: Freeing unused kernel image (initmem) memory: 54076K
Sep 9 05:42:38.615156 kernel: Write protecting the kernel read-only data: 24576k
Sep 9 05:42:38.615174 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K
Sep 9 05:42:38.615197 kernel: Run /init as init process
Sep 9 05:42:38.615215 kernel: with arguments:
Sep 9 05:42:38.615234 kernel: /init
Sep 9 05:42:38.615252 kernel: with environment:
Sep 9 05:42:38.615270 kernel: HOME=/
Sep 9 05:42:38.615289 kernel: TERM=linux
Sep 9 05:42:38.615307 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 9 05:42:38.615336 systemd[1]: Successfully made /usr/ read-only.
Sep 9 05:42:38.615364 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 05:42:38.615384 systemd[1]: Detected virtualization google.
Sep 9 05:42:38.615403 systemd[1]: Detected architecture x86-64.
Sep 9 05:42:38.615422 systemd[1]: Running in initrd.
Sep 9 05:42:38.615441 systemd[1]: No hostname configured, using default hostname.
Sep 9 05:42:38.615462 systemd[1]: Hostname set to .
Sep 9 05:42:38.615479 systemd[1]: Initializing machine ID from random generator.
Sep 9 05:42:38.615498 systemd[1]: Queued start job for default target initrd.target.
Sep 9 05:42:38.615539 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:42:38.615566 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:42:38.615588 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 05:42:38.615608 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 05:42:38.615629 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 05:42:38.615655 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 05:42:38.615677 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 05:42:38.615698 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 05:42:38.615719 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:42:38.615739 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:42:38.615759 systemd[1]: Reached target paths.target - Path Units.
Sep 9 05:42:38.615779 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 05:42:38.615803 systemd[1]: Reached target swap.target - Swaps.
Sep 9 05:42:38.615824 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 05:42:38.615843 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 05:42:38.615864 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 05:42:38.615884 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 05:42:38.615905 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 05:42:38.615925 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:42:38.615963 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:42:38.615984 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:42:38.616008 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 05:42:38.616027 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 05:42:38.616048 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 05:42:38.616068 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 05:42:38.616089 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 05:42:38.616110 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 05:42:38.616131 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 05:42:38.616151 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 05:42:38.616213 systemd-journald[207]: Collecting audit messages is disabled.
Sep 9 05:42:38.616259 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:42:38.616281 systemd-journald[207]: Journal started Sep 9 05:42:38.616333 systemd-journald[207]: Runtime Journal (/run/log/journal/ae6198f069224e43b0f9a317f661b07a) is 8M, max 148.9M, 140.9M free. Sep 9 05:42:38.640185 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:42:38.639669 systemd-modules-load[209]: Inserted module 'overlay' Sep 9 05:42:38.641499 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 05:42:38.642200 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:42:38.642644 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 05:42:38.650138 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:42:38.654153 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:42:38.678827 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:42:38.767129 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 9 05:42:38.767177 kernel: Bridge firewalling registered Sep 9 05:42:38.686020 systemd-tmpfiles[220]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 05:42:38.714975 systemd-modules-load[209]: Inserted module 'br_netfilter' Sep 9 05:42:38.777706 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:42:38.799487 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:42:38.806545 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 9 05:42:38.825759 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Sep 9 05:42:38.852157 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:42:38.882565 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:42:38.899205 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:42:38.905494 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 05:42:38.909125 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:42:38.922350 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:42:38.944583 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 05:42:38.961873 systemd-resolved[237]: Positive Trust Anchors: Sep 9 05:42:38.961883 systemd-resolved[237]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:42:38.961927 systemd-resolved[237]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:42:38.965637 systemd-resolved[237]: Defaulting to hostname 'linux'. Sep 9 05:42:38.967027 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:42:38.979181 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 9 05:42:39.077121 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104 Sep 9 05:42:39.184992 kernel: SCSI subsystem initialized Sep 9 05:42:39.201997 kernel: Loading iSCSI transport class v2.0-870. Sep 9 05:42:39.217989 kernel: iscsi: registered transport (tcp) Sep 9 05:42:39.250350 kernel: iscsi: registered transport (qla4xxx) Sep 9 05:42:39.250447 kernel: QLogic iSCSI HBA Driver Sep 9 05:42:39.274533 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:42:39.318609 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:42:39.342044 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:42:39.416041 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 05:42:39.418062 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 05:42:39.502986 kernel: raid6: avx2x4 gen() 18109 MB/s Sep 9 05:42:39.523995 kernel: raid6: avx2x2 gen() 18117 MB/s Sep 9 05:42:39.549991 kernel: raid6: avx2x1 gen() 13623 MB/s Sep 9 05:42:39.550083 kernel: raid6: using algorithm avx2x2 gen() 18117 MB/s Sep 9 05:42:39.577133 kernel: raid6: .... xor() 18306 MB/s, rmw enabled Sep 9 05:42:39.577237 kernel: raid6: using avx2x2 recovery algorithm Sep 9 05:42:39.605982 kernel: xor: automatically using best checksumming function avx Sep 9 05:42:39.794980 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 05:42:39.803506 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Sep 9 05:42:39.814045 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:42:39.851600 systemd-udevd[454]: Using default interface naming scheme 'v255'. Sep 9 05:42:39.860491 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:42:39.884371 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 05:42:39.923236 dracut-pre-trigger[463]: rd.md=0: removing MD RAID activation Sep 9 05:42:39.956392 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 9 05:42:39.976797 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:42:40.082559 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:42:40.095248 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 05:42:40.206973 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 05:42:40.252895 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Sep 9 05:42:40.272973 kernel: AES CTR mode by8 optimization enabled Sep 9 05:42:40.309886 kernel: virtio_scsi virtio0: 1/0/0 default/read/poll queues Sep 9 05:42:40.366977 kernel: scsi host0: Virtio SCSI HBA Sep 9 05:42:40.392228 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:42:40.401116 kernel: scsi 0:0:1:0: Direct-Access Google PersistentDisk 1 PQ: 0 ANSI: 6 Sep 9 05:42:40.392706 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 9 05:42:40.447139 kernel: sd 0:0:1:0: [sda] 25165824 512-byte logical blocks: (12.9 GB/12.0 GiB) Sep 9 05:42:40.447434 kernel: sd 0:0:1:0: [sda] 4096-byte physical blocks Sep 9 05:42:40.447591 kernel: sd 0:0:1:0: [sda] Write Protect is off Sep 9 05:42:40.447737 kernel: sd 0:0:1:0: [sda] Mode Sense: 1f 00 00 08 Sep 9 05:42:40.447880 kernel: sd 0:0:1:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 9 05:42:40.411866 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:42:40.475623 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 05:42:40.475660 kernel: GPT:17805311 != 25165823 Sep 9 05:42:40.475684 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 05:42:40.475706 kernel: GPT:17805311 != 25165823 Sep 9 05:42:40.475727 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 05:42:40.475749 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 05:42:40.475772 kernel: sd 0:0:1:0: [sda] Attached SCSI disk Sep 9 05:42:40.504482 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:42:40.513802 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:42:40.587973 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:42:40.598413 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 05:42:40.638039 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - PersistentDisk EFI-SYSTEM. Sep 9 05:42:40.660005 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - PersistentDisk ROOT. Sep 9 05:42:40.680188 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - PersistentDisk USR-A. Sep 9 05:42:40.680436 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - PersistentDisk USR-A. 
Sep 9 05:42:40.722952 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM. Sep 9 05:42:40.734242 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:42:40.734343 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:42:40.763098 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:42:40.783239 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 9 05:42:40.792240 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 05:42:40.835265 disk-uuid[606]: Primary Header is updated. Sep 9 05:42:40.835265 disk-uuid[606]: Secondary Entries is updated. Sep 9 05:42:40.835265 disk-uuid[606]: Secondary Header is updated. Sep 9 05:42:40.858073 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 05:42:40.859190 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:42:41.890763 disk-uuid[607]: The operation has completed successfully. Sep 9 05:42:41.898101 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 05:42:41.961322 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 05:42:41.961486 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 05:42:42.014279 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 05:42:42.045586 sh[628]: Success Sep 9 05:42:42.082833 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 05:42:42.082921 kernel: device-mapper: uevent: version 1.0.3 Sep 9 05:42:42.083000 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 05:42:42.108979 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Sep 9 05:42:42.183127 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
Sep 9 05:42:42.186407 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 05:42:42.220128 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 9 05:42:42.247985 kernel: BTRFS: device fsid 9ca60a92-6b53-4529-adc0-1f4392d2ad56 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (640) Sep 9 05:42:42.266444 kernel: BTRFS info (device dm-0): first mount of filesystem 9ca60a92-6b53-4529-adc0-1f4392d2ad56 Sep 9 05:42:42.266530 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:42:42.299277 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 9 05:42:42.299372 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 05:42:42.299397 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 05:42:42.309549 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 05:42:42.317761 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:42:42.324381 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 05:42:42.325417 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 05:42:42.351104 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 9 05:42:42.410112 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (663) Sep 9 05:42:42.427725 kernel: BTRFS info (device sda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:42:42.427797 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:42:42.445248 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 9 05:42:42.445334 kernel: BTRFS info (device sda6): turning on async discard Sep 9 05:42:42.445358 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 05:42:42.466038 kernel: BTRFS info (device sda6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:42:42.466722 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 9 05:42:42.486207 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 05:42:42.554199 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:42:42.557241 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:42:42.700285 systemd-networkd[809]: lo: Link UP Sep 9 05:42:42.700303 systemd-networkd[809]: lo: Gained carrier Sep 9 05:42:42.708562 systemd-networkd[809]: Enumeration completed Sep 9 05:42:42.708712 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:42:42.709310 systemd-networkd[809]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:42:42.746743 ignition[762]: Ignition 2.22.0 Sep 9 05:42:42.709319 systemd-networkd[809]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 9 05:42:42.746752 ignition[762]: Stage: fetch-offline Sep 9 05:42:42.719168 systemd-networkd[809]: eth0: Link UP Sep 9 05:42:42.746784 ignition[762]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:42:42.721360 systemd-networkd[809]: eth0: Gained carrier Sep 9 05:42:42.746794 ignition[762]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 9 05:42:42.721381 systemd-networkd[809]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:42:42.746906 ignition[762]: parsed url from cmdline: "" Sep 9 05:42:42.732301 systemd[1]: Reached target network.target - Network. Sep 9 05:42:42.746910 ignition[762]: no config URL provided Sep 9 05:42:42.734042 systemd-networkd[809]: eth0: Overlong DHCP hostname received, shortened from 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023.c.flatcar-212911.internal' to 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:42:42.746917 ignition[762]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:42:42.734063 systemd-networkd[809]: eth0: DHCPv4 address 10.128.0.19/32, gateway 10.128.0.1 acquired from 169.254.169.254 Sep 9 05:42:42.746925 ignition[762]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:42:42.755411 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:42:42.746933 ignition[762]: failed to fetch config: resource requires networking Sep 9 05:42:42.783928 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Sep 9 05:42:42.747178 ignition[762]: Ignition finished successfully Sep 9 05:42:42.854162 unknown[818]: fetched base config from "system" Sep 9 05:42:42.843861 ignition[818]: Ignition 2.22.0 Sep 9 05:42:42.854175 unknown[818]: fetched base config from "system" Sep 9 05:42:42.843869 ignition[818]: Stage: fetch Sep 9 05:42:42.854187 unknown[818]: fetched user config from "gcp" Sep 9 05:42:42.844435 ignition[818]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:42:42.857474 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 9 05:42:42.844448 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 9 05:42:42.868366 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 9 05:42:42.844546 ignition[818]: parsed url from cmdline: "" Sep 9 05:42:42.915429 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 9 05:42:42.844551 ignition[818]: no config URL provided Sep 9 05:42:42.939237 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 9 05:42:42.844557 ignition[818]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:42:43.000553 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 9 05:42:42.844566 ignition[818]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:42:43.012779 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 9 05:42:42.844588 ignition[818]: GET http://169.254.169.254/computeMetadata/v1/instance/attributes/user-data: attempt #1 Sep 9 05:42:43.028112 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 9 05:42:42.848934 ignition[818]: GET result: OK Sep 9 05:42:43.044189 systemd[1]: Reached target local-fs.target - Local File Systems. 
Sep 9 05:42:42.849061 ignition[818]: parsing config with SHA512: 33d92d5277faf78bc941fc827f4def2f0529dfa56e957c3b4bad1db211004c416cb612c3ccaf71624a9e7360edc6328d4ea6dc979438af5a5b8101b3026ef8d4 Sep 9 05:42:43.059127 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:42:42.854786 ignition[818]: fetch: fetch complete Sep 9 05:42:43.075127 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:42:42.854793 ignition[818]: fetch: fetch passed Sep 9 05:42:43.092544 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 9 05:42:42.854842 ignition[818]: Ignition finished successfully Sep 9 05:42:42.912774 ignition[824]: Ignition 2.22.0 Sep 9 05:42:42.912782 ignition[824]: Stage: kargs Sep 9 05:42:42.912974 ignition[824]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:42:42.912986 ignition[824]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 9 05:42:42.913783 ignition[824]: kargs: kargs passed Sep 9 05:42:42.913839 ignition[824]: Ignition finished successfully Sep 9 05:42:42.997526 ignition[829]: Ignition 2.22.0 Sep 9 05:42:42.997535 ignition[829]: Stage: disks Sep 9 05:42:42.997722 ignition[829]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:42:42.997738 ignition[829]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 9 05:42:42.999052 ignition[829]: disks: disks passed Sep 9 05:42:42.999125 ignition[829]: Ignition finished successfully Sep 9 05:42:43.152957 systemd-fsck[839]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 9 05:42:43.303919 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 9 05:42:43.325782 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 9 05:42:43.514518 kernel: EXT4-fs (sda9): mounted filesystem d2d7815e-fa16-4396-ab9d-ac540c1d8856 r/w with ordered data mode. Quota mode: none. Sep 9 05:42:43.514407 systemd[1]: Mounted sysroot.mount - /sysroot. 
Sep 9 05:42:43.522802 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 9 05:42:43.540254 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 9 05:42:43.549150 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 9 05:42:43.573679 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Sep 9 05:42:43.641585 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (847) Sep 9 05:42:43.641630 kernel: BTRFS info (device sda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:42:43.641653 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:42:43.641676 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 9 05:42:43.641696 kernel: BTRFS info (device sda6): turning on async discard Sep 9 05:42:43.641727 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 05:42:43.573769 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 9 05:42:43.573815 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:42:43.586168 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 9 05:42:43.668049 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 05:42:43.685512 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Sep 9 05:42:43.831809 initrd-setup-root[871]: cut: /sysroot/etc/passwd: No such file or directory Sep 9 05:42:43.841088 initrd-setup-root[878]: cut: /sysroot/etc/group: No such file or directory Sep 9 05:42:43.851094 initrd-setup-root[885]: cut: /sysroot/etc/shadow: No such file or directory Sep 9 05:42:43.860103 initrd-setup-root[892]: cut: /sysroot/etc/gshadow: No such file or directory Sep 9 05:42:43.988368 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 9 05:42:44.007715 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 9 05:42:44.016227 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 9 05:42:44.046729 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 9 05:42:44.062148 kernel: BTRFS info (device sda6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:42:44.082410 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 9 05:42:44.090479 systemd-networkd[809]: eth0: Gained IPv6LL Sep 9 05:42:44.104241 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 9 05:42:44.114105 ignition[959]: INFO : Ignition 2.22.0 Sep 9 05:42:44.114105 ignition[959]: INFO : Stage: mount Sep 9 05:42:44.114105 ignition[959]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:42:44.114105 ignition[959]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 9 05:42:44.114105 ignition[959]: INFO : mount: mount passed Sep 9 05:42:44.114105 ignition[959]: INFO : Ignition finished successfully Sep 9 05:42:44.124550 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 9 05:42:44.516482 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 9 05:42:44.558992 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (973) Sep 9 05:42:44.577048 kernel: BTRFS info (device sda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:42:44.577128 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:42:44.593143 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 9 05:42:44.593279 kernel: BTRFS info (device sda6): turning on async discard Sep 9 05:42:44.593304 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 05:42:44.601848 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 9 05:42:44.650333 ignition[990]: INFO : Ignition 2.22.0 Sep 9 05:42:44.650333 ignition[990]: INFO : Stage: files Sep 9 05:42:44.663101 ignition[990]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 9 05:42:44.663101 ignition[990]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 9 05:42:44.663101 ignition[990]: DEBUG : files: compiled without relabeling support, skipping Sep 9 05:42:44.663101 ignition[990]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 9 05:42:44.663101 ignition[990]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 9 05:42:44.663101 ignition[990]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 9 05:42:44.663101 ignition[990]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 9 05:42:44.663101 ignition[990]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 9 05:42:44.660804 unknown[990]: wrote ssh authorized keys file for user: core Sep 9 05:42:44.757102 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 9 05:42:44.757102 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET 
https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 9 05:42:44.789085 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 9 05:42:45.452882 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 05:42:45.469107 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 9 05:42:46.071645 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 9 05:42:46.498112 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 9 05:42:46.498112 ignition[990]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 9 05:42:46.533101 ignition[990]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:42:46.533101 ignition[990]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 9 05:42:46.533101 ignition[990]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 9 05:42:46.533101 ignition[990]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 9 05:42:46.533101 ignition[990]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 9 05:42:46.533101 ignition[990]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:42:46.533101 ignition[990]: INFO : files: createResultFile: 
createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 9 05:42:46.533101 ignition[990]: INFO : files: files passed Sep 9 05:42:46.533101 ignition[990]: INFO : Ignition finished successfully Sep 9 05:42:46.505438 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 9 05:42:46.516710 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 9 05:42:46.557218 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 9 05:42:46.572578 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 9 05:42:46.726165 initrd-setup-root-after-ignition[1019]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:42:46.726165 initrd-setup-root-after-ignition[1019]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:42:46.572708 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 9 05:42:46.771114 initrd-setup-root-after-ignition[1023]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 9 05:42:46.657117 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 9 05:42:46.671245 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 9 05:42:46.691021 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 9 05:42:46.746640 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 9 05:42:46.746764 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 9 05:42:46.762861 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 9 05:42:46.780199 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 9 05:42:46.802209 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. 
Sep 9 05:42:46.803392 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 9 05:42:46.899234 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:42:46.920272 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 9 05:42:46.962482 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:42:46.982237 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:42:46.982653 systemd[1]: Stopped target timers.target - Timer Units. Sep 9 05:42:47.001430 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 9 05:42:47.001646 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 9 05:42:47.032435 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 9 05:42:47.042432 systemd[1]: Stopped target basic.target - Basic System. Sep 9 05:42:47.058496 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 9 05:42:47.072453 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 9 05:42:47.089425 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 9 05:42:47.107421 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:42:47.124441 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 9 05:42:47.141475 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:42:47.157565 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 9 05:42:47.177500 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 9 05:42:47.193470 systemd[1]: Stopped target swap.target - Swaps. Sep 9 05:42:47.222221 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 9 05:42:47.222659 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. 
Sep 9 05:42:47.247323 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:42:47.247695 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:42:47.264370 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 05:42:47.264565 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:42:47.283416 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 05:42:47.283610 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 05:42:47.320464 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 05:42:47.320702 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 05:42:47.328499 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 05:42:47.328678 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 05:42:47.347820 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 05:42:47.373239 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 05:42:47.406722 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 05:42:47.432162 ignition[1044]: INFO : Ignition 2.22.0
Sep 9 05:42:47.432162 ignition[1044]: INFO : Stage: umount
Sep 9 05:42:47.432162 ignition[1044]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 05:42:47.432162 ignition[1044]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Sep 9 05:42:47.432162 ignition[1044]: INFO : umount: umount passed
Sep 9 05:42:47.432162 ignition[1044]: INFO : Ignition finished successfully
Sep 9 05:42:47.407131 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 05:42:47.443472 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 05:42:47.443769 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 05:42:47.461742 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 05:42:47.463130 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 05:42:47.463253 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 05:42:47.474676 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 05:42:47.474813 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 05:42:47.495019 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 05:42:47.495186 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 05:42:47.514275 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 05:42:47.514343 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 05:42:47.519311 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 05:42:47.519377 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 05:42:47.535348 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 05:42:47.535419 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 05:42:47.561272 systemd[1]: Stopped target network.target - Network.
Sep 9 05:42:47.577183 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 05:42:47.577273 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 05:42:47.586392 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 05:42:47.602294 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 05:42:47.608029 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:42:47.616250 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 05:42:47.633283 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 05:42:47.647331 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 05:42:47.647860 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 05:42:47.670243 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 05:42:47.670324 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 05:42:47.677315 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 05:42:47.677393 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 05:42:47.694311 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 05:42:47.694378 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 05:42:47.710305 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 05:42:47.710382 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 05:42:47.726523 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 05:42:47.751241 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 05:42:47.769654 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 05:42:47.769782 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 05:42:47.779358 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 05:42:47.779616 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 05:42:47.779770 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 05:42:47.804740 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 05:42:47.806591 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 05:42:47.810300 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 05:42:47.810356 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:42:47.828493 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 05:42:47.846285 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 05:42:47.846375 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 05:42:47.860389 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 05:42:47.860485 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 05:42:47.878507 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 05:42:47.878579 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 05:42:47.912312 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 05:42:47.912385 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 05:42:47.938406 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 05:42:47.956400 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 05:42:48.378109 systemd-journald[207]: Received SIGTERM from PID 1 (systemd).
Sep 9 05:42:47.956491 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 05:42:47.957924 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 05:42:47.958169 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 05:42:47.972487 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 05:42:47.972596 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:42:47.988349 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 05:42:47.988399 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:42:48.013232 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 05:42:48.013319 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 05:42:48.040474 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 05:42:48.040561 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 05:42:48.065356 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 05:42:48.065452 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 05:42:48.093458 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 05:42:48.109058 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 05:42:48.109181 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 05:42:48.127462 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 05:42:48.127536 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 05:42:48.165396 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 9 05:42:48.165470 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 05:42:48.185324 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 05:42:48.185396 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 05:42:48.204200 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 05:42:48.204297 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 05:42:48.224829 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 9 05:42:48.224902 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 9 05:42:48.224976 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 9 05:42:48.225029 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 05:42:48.225554 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 05:42:48.225669 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 05:42:48.241534 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 05:42:48.241654 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 05:42:48.261330 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 05:42:48.270363 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 05:42:48.318677 systemd[1]: Switching root.
Sep 9 05:42:48.715082 systemd-journald[207]: Journal stopped
Sep 9 05:42:51.129720 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 05:42:51.129775 kernel: SELinux: policy capability open_perms=1
Sep 9 05:42:51.129796 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 05:42:51.129814 kernel: SELinux: policy capability always_check_network=0
Sep 9 05:42:51.129830 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 05:42:51.129849 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 05:42:51.129874 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 05:42:51.129892 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 05:42:51.129910 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 05:42:51.129929 kernel: audit: type=1403 audit(1757396568.993:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 05:42:51.130088 systemd[1]: Successfully loaded SELinux policy in 119.581ms.
Sep 9 05:42:51.130115 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.957ms.
Sep 9 05:42:51.130144 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 05:42:51.130170 systemd[1]: Detected virtualization google.
Sep 9 05:42:51.130191 systemd[1]: Detected architecture x86-64.
Sep 9 05:42:51.130211 systemd[1]: Detected first boot.
Sep 9 05:42:51.130231 systemd[1]: Initializing machine ID from random generator.
Sep 9 05:42:51.130251 zram_generator::config[1087]: No configuration found.
Sep 9 05:42:51.130276 kernel: Guest personality initialized and is inactive
Sep 9 05:42:51.130294 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 9 05:42:51.130312 kernel: Initialized host personality
Sep 9 05:42:51.130330 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 05:42:51.130352 systemd[1]: Populated /etc with preset unit settings.
Sep 9 05:42:51.130373 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 05:42:51.130392 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 05:42:51.130416 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 05:42:51.130437 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 05:42:51.130457 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 05:42:51.130478 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 05:42:51.130499 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 05:42:51.132993 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 05:42:51.133032 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 05:42:51.133063 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 05:42:51.133088 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 05:42:51.133110 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 05:42:51.133139 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:42:51.133160 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:42:51.133181 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 05:42:51.133203 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 05:42:51.133226 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 05:42:51.133255 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 05:42:51.133281 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 9 05:42:51.133304 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:42:51.133326 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:42:51.133347 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 05:42:51.133372 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 05:42:51.133394 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 05:42:51.133416 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 05:42:51.133441 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 05:42:51.133463 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 05:42:51.133486 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 05:42:51.133508 systemd[1]: Reached target swap.target - Swaps.
Sep 9 05:42:51.133529 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 05:42:51.133551 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 05:42:51.133573 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 05:42:51.133600 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:42:51.133622 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:42:51.133644 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:42:51.133666 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 05:42:51.133688 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 05:42:51.133711 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 05:42:51.133737 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 05:42:51.133758 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:42:51.133781 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 05:42:51.133802 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 05:42:51.133824 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 05:42:51.133849 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 05:42:51.133871 systemd[1]: Reached target machines.target - Containers.
Sep 9 05:42:51.133893 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 05:42:51.133919 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:42:51.133977 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 05:42:51.134001 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 05:42:51.134023 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 05:42:51.134044 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 05:42:51.134067 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 05:42:51.134089 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 05:42:51.134111 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 05:42:51.134140 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 05:42:51.134168 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 05:42:51.134190 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 05:42:51.134212 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 05:42:51.134234 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 05:42:51.134256 kernel: ACPI: bus type drm_connector registered
Sep 9 05:42:51.134277 kernel: fuse: init (API version 7.41)
Sep 9 05:42:51.134298 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:42:51.134321 kernel: loop: module loaded
Sep 9 05:42:51.134345 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 05:42:51.134367 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 05:42:51.134389 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 05:42:51.134452 systemd-journald[1175]: Collecting audit messages is disabled.
Sep 9 05:42:51.134503 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 05:42:51.134525 systemd-journald[1175]: Journal started
Sep 9 05:42:51.134567 systemd-journald[1175]: Runtime Journal (/run/log/journal/1cad6920f86a4f71999264a146493835) is 8M, max 148.9M, 140.9M free.
Sep 9 05:42:49.938542 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 05:42:49.962746 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 9 05:42:49.963468 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 05:42:51.176985 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 05:42:51.196993 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 05:42:51.213700 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 05:42:51.213784 systemd[1]: Stopped verity-setup.service.
Sep 9 05:42:51.242971 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:42:51.253995 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 05:42:51.264688 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 05:42:51.274377 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 05:42:51.285361 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 05:42:51.294324 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 05:42:51.303304 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 05:42:51.314307 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 05:42:51.323579 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 05:42:51.334487 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 05:42:51.345439 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 05:42:51.345736 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 05:42:51.356435 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 05:42:51.356712 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 05:42:51.367468 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 05:42:51.367750 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 05:42:51.376435 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 05:42:51.376733 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 05:42:51.387420 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 05:42:51.387702 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 05:42:51.398414 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 05:42:51.398689 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 05:42:51.407447 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 05:42:51.418442 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 05:42:51.429495 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 05:42:51.440459 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 05:42:51.452450 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 05:42:51.475493 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 05:42:51.487799 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 05:42:51.505074 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 05:42:51.514196 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 05:42:51.514424 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 05:42:51.524373 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 05:42:51.535463 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 05:42:51.544280 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:42:51.551063 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 05:42:51.562113 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 05:42:51.573135 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 05:42:51.581069 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 05:42:51.590132 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 05:42:51.593268 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 05:42:51.602883 systemd-journald[1175]: Time spent on flushing to /var/log/journal/1cad6920f86a4f71999264a146493835 is 141.209ms for 960 entries.
Sep 9 05:42:51.602883 systemd-journald[1175]: System Journal (/var/log/journal/1cad6920f86a4f71999264a146493835) is 8M, max 584.8M, 576.8M free.
Sep 9 05:42:51.795384 systemd-journald[1175]: Received client request to flush runtime journal.
Sep 9 05:42:51.795479 kernel: loop0: detected capacity change from 0 to 50720
Sep 9 05:42:51.796343 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 05:42:51.613787 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 05:42:51.628745 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 05:42:51.651307 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 05:42:51.661342 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 05:42:51.689536 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 05:42:51.707786 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 05:42:51.721242 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 05:42:51.731640 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 05:42:51.800731 systemd-tmpfiles[1212]: ACLs are not supported, ignoring.
Sep 9 05:42:51.800760 systemd-tmpfiles[1212]: ACLs are not supported, ignoring.
Sep 9 05:42:51.802249 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 05:42:51.814798 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 05:42:51.816500 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 05:42:51.824020 kernel: loop1: detected capacity change from 0 to 128016
Sep 9 05:42:51.832627 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 05:42:51.854157 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 05:42:51.895811 kernel: loop2: detected capacity change from 0 to 110984
Sep 9 05:42:51.925160 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 05:42:51.944676 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 05:42:51.994986 kernel: loop3: detected capacity change from 0 to 224512
Sep 9 05:42:52.023532 systemd-tmpfiles[1230]: ACLs are not supported, ignoring.
Sep 9 05:42:52.023570 systemd-tmpfiles[1230]: ACLs are not supported, ignoring.
Sep 9 05:42:52.034539 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 05:42:52.100024 kernel: loop4: detected capacity change from 0 to 50720
Sep 9 05:42:52.143991 kernel: loop5: detected capacity change from 0 to 128016
Sep 9 05:42:52.186812 kernel: loop6: detected capacity change from 0 to 110984
Sep 9 05:42:52.229998 kernel: loop7: detected capacity change from 0 to 224512
Sep 9 05:42:52.271915 (sd-merge)[1235]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-gce'.
Sep 9 05:42:52.272917 (sd-merge)[1235]: Merged extensions into '/usr'.
Sep 9 05:42:52.288519 systemd[1]: Reload requested from client PID 1210 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 05:42:52.288928 systemd[1]: Reloading...
Sep 9 05:42:52.465301 zram_generator::config[1260]: No configuration found.
Sep 9 05:42:52.653003 ldconfig[1205]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 05:42:52.933662 systemd[1]: Reloading finished in 643 ms.
Sep 9 05:42:52.954423 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 05:42:52.966072 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 05:42:52.988578 systemd[1]: Starting ensure-sysext.service...
Sep 9 05:42:53.002385 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 05:42:53.037086 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 05:42:53.048201 systemd[1]: Reload requested from client PID 1301 ('systemctl') (unit ensure-sysext.service)...
Sep 9 05:42:53.048233 systemd[1]: Reloading...
Sep 9 05:42:53.048411 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 05:42:53.048470 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 05:42:53.048996 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 05:42:53.049509 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 05:42:53.051283 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 05:42:53.051856 systemd-tmpfiles[1302]: ACLs are not supported, ignoring.
Sep 9 05:42:53.051996 systemd-tmpfiles[1302]: ACLs are not supported, ignoring.
Sep 9 05:42:53.060038 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 05:42:53.060060 systemd-tmpfiles[1302]: Skipping /boot
Sep 9 05:42:53.075930 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 05:42:53.076028 systemd-tmpfiles[1302]: Skipping /boot
Sep 9 05:42:53.131998 zram_generator::config[1325]: No configuration found.
Sep 9 05:42:53.381253 systemd[1]: Reloading finished in 331 ms.
Sep 9 05:42:53.410615 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 05:42:53.429427 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 05:42:53.445404 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 05:42:53.459146 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 05:42:53.479014 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 05:42:53.490353 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 05:42:53.502295 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 05:42:53.519932 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:42:53.520471 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:42:53.525066 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 05:42:53.537586 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 05:42:53.556561 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 05:42:53.566213 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:42:53.566442 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:42:53.570765 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 05:42:53.580051 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:42:53.584588 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 05:42:53.588703 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 05:42:53.597828 augenrules[1399]: No rules
Sep 9 05:42:53.600094 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 05:42:53.600797 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 05:42:53.611729 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 9 05:42:53.623630 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:42:53.624228 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:42:53.624720 systemd-udevd[1382]: Using default interface naming scheme 'v255'. Sep 9 05:42:53.635933 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:42:53.636268 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:42:53.665207 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:42:53.665649 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:42:53.669894 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:42:53.683312 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:42:53.697483 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:42:53.706203 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:42:53.706881 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:42:53.712220 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 9 05:42:53.721096 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:42:53.728715 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
Sep 9 05:42:53.745889 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:42:53.760250 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 9 05:42:53.773099 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:42:53.773675 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:42:53.783751 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:42:53.785339 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:42:53.797034 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:42:53.797505 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:42:53.808136 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 9 05:42:53.828917 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 9 05:42:53.892027 systemd[1]: Finished ensure-sysext.service. Sep 9 05:42:53.901471 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:42:53.906739 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:42:53.914331 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 9 05:42:53.922239 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 9 05:42:53.936251 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 9 05:42:53.949264 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 9 05:42:53.966159 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 9 05:42:53.979233 systemd[1]: Starting setup-oem.service - Setup OEM... 
Sep 9 05:42:53.988149 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 9 05:42:53.988222 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 9 05:42:54.001919 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:42:54.011125 systemd[1]: Reached target time-set.target - System Time Set. Sep 9 05:42:54.015838 systemd-resolved[1379]: Positive Trust Anchors: Sep 9 05:42:54.015880 systemd-resolved[1379]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:42:54.016467 systemd-resolved[1379]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:42:54.020117 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 9 05:42:54.020172 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 9 05:42:54.021874 systemd[1]: Condition check resulted in dev-tpmrm0.device - /dev/tpmrm0 being skipped. Sep 9 05:42:54.028572 systemd[1]: Reached target tpm2.target - Trusted Platform Module. 
Sep 9 05:42:54.051901 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 9 05:42:54.053296 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 9 05:42:54.061154 augenrules[1454]: /sbin/augenrules: No change Sep 9 05:42:54.062134 systemd-resolved[1379]: Defaulting to hostname 'linux'. Sep 9 05:42:54.063727 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 9 05:42:54.065035 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 9 05:42:54.075508 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:42:54.086483 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 9 05:42:54.086829 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 9 05:42:54.092464 augenrules[1480]: No rules Sep 9 05:42:54.096585 systemd[1]: audit-rules.service: Deactivated successfully. Sep 9 05:42:54.096928 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:42:54.106620 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 9 05:42:54.106913 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 9 05:42:54.129837 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 9 05:42:54.135547 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:42:54.147112 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 9 05:42:54.147170 systemd[1]: Reached target sysinit.target - System Initialization. Sep 9 05:42:54.156247 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 9 05:42:54.166137 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. 
Sep 9 05:42:54.176078 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 9 05:42:54.186339 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 9 05:42:54.195310 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 9 05:42:54.205100 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 9 05:42:54.216116 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 9 05:42:54.216180 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:42:54.224098 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:42:54.234622 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 9 05:42:54.250132 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 9 05:42:54.262405 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 9 05:42:54.279055 kernel: mousedev: PS/2 mouse device common for all mice Sep 9 05:42:54.278385 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 9 05:42:54.289106 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 9 05:42:54.298506 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 9 05:42:54.309367 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 9 05:42:54.314452 systemd[1]: Finished setup-oem.service - Setup OEM. Sep 9 05:42:54.319401 systemd-networkd[1468]: lo: Link UP Sep 9 05:42:54.319807 systemd-networkd[1468]: lo: Gained carrier Sep 9 05:42:54.322314 systemd[1]: Listening on docker.socket - Docker Socket for the API. 
Sep 9 05:42:54.324331 systemd-networkd[1468]: Enumeration completed Sep 9 05:42:54.324893 systemd-networkd[1468]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:42:54.324901 systemd-networkd[1468]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:42:54.328082 systemd-networkd[1468]: eth0: Link UP Sep 9 05:42:54.329104 systemd-networkd[1468]: eth0: Gained carrier Sep 9 05:42:54.329881 systemd-networkd[1468]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:42:54.331151 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:42:54.342529 systemd-networkd[1468]: eth0: Overlong DHCP hostname received, shortened from 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023.c.flatcar-212911.internal' to 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:42:54.342685 systemd-networkd[1468]: eth0: DHCPv4 address 10.128.0.19/32, gateway 10.128.0.1 acquired from 169.254.169.254 Sep 9 05:42:54.358972 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 9 05:42:54.369094 systemd[1]: Reached target network.target - Network. Sep 9 05:42:54.378014 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr Sep 9 05:42:54.384968 kernel: ACPI: button: Power Button [PWRF] Sep 9 05:42:54.402286 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input5 Sep 9 05:42:54.402397 kernel: ACPI: button: Sleep Button [SLPF] Sep 9 05:42:54.402599 systemd[1]: Starting oem-gce-enable-oslogin.service - Enable GCE OS Login... Sep 9 05:42:54.413115 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
Sep 9 05:42:54.425698 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 9 05:42:54.483161 systemd[1]: Finished oem-gce-enable-oslogin.service - Enable GCE OS Login. Sep 9 05:42:54.494701 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 9 05:42:54.523634 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 9 05:42:54.532117 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:42:54.541123 systemd[1]: Reached target basic.target - Basic System. Sep 9 05:42:54.549231 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:42:54.549293 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 9 05:42:54.552176 systemd[1]: Starting containerd.service - containerd container runtime... Sep 9 05:42:54.565142 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 9 05:42:54.576716 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 9 05:42:54.587017 kernel: EDAC MC: Ver: 3.0.0 Sep 9 05:42:54.591274 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 9 05:42:54.609699 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 9 05:42:54.633285 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 9 05:42:54.642085 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 9 05:42:54.645255 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 9 05:42:54.652257 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 9 05:42:54.676244 systemd[1]: Started ntpd.service - Network Time Service. 
Sep 9 05:42:54.687522 jq[1536]: false Sep 9 05:42:54.687409 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 9 05:42:54.710206 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 9 05:42:54.721807 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 9 05:42:54.727570 google_oslogin_nss_cache[1539]: oslogin_cache_refresh[1539]: Refreshing passwd entry cache Sep 9 05:42:54.730014 oslogin_cache_refresh[1539]: Refreshing passwd entry cache Sep 9 05:42:54.745505 extend-filesystems[1537]: Found /dev/sda6 Sep 9 05:42:54.746187 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 9 05:42:54.756535 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionSecurity=!tpm2). Sep 9 05:42:54.757402 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 9 05:42:54.760257 systemd[1]: Starting update-engine.service - Update Engine... Sep 9 05:42:54.771809 google_oslogin_nss_cache[1539]: oslogin_cache_refresh[1539]: Failure getting users, quitting Sep 9 05:42:54.771809 google_oslogin_nss_cache[1539]: oslogin_cache_refresh[1539]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 9 05:42:54.771809 google_oslogin_nss_cache[1539]: oslogin_cache_refresh[1539]: Refreshing group entry cache Sep 9 05:42:54.771809 google_oslogin_nss_cache[1539]: oslogin_cache_refresh[1539]: Failure getting groups, quitting Sep 9 05:42:54.771809 google_oslogin_nss_cache[1539]: oslogin_cache_refresh[1539]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 05:42:54.766749 oslogin_cache_refresh[1539]: Failure getting users, quitting Sep 9 05:42:54.766800 oslogin_cache_refresh[1539]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Sep 9 05:42:54.766870 oslogin_cache_refresh[1539]: Refreshing group entry cache Sep 9 05:42:54.774710 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 9 05:42:54.770397 oslogin_cache_refresh[1539]: Failure getting groups, quitting Sep 9 05:42:54.770415 oslogin_cache_refresh[1539]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 9 05:42:54.789627 extend-filesystems[1537]: Found /dev/sda9 Sep 9 05:42:54.791019 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 9 05:42:54.800279 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 9 05:42:54.801374 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 05:42:54.803160 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 9 05:42:54.803491 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 9 05:42:54.806294 extend-filesystems[1537]: Checking size of /dev/sda9 Sep 9 05:42:54.819619 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 05:42:54.819935 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 05:42:54.831375 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 05:42:54.833120 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Sep 9 05:42:54.847128 coreos-metadata[1532]: Sep 09 05:42:54.843 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/hostname: Attempt #1 Sep 9 05:42:54.851143 coreos-metadata[1532]: Sep 09 05:42:54.848 INFO Fetch successful Sep 9 05:42:54.851143 coreos-metadata[1532]: Sep 09 05:42:54.848 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/access-configs/0/external-ip: Attempt #1 Sep 9 05:42:54.855072 coreos-metadata[1532]: Sep 09 05:42:54.853 INFO Fetch successful Sep 9 05:42:54.855072 coreos-metadata[1532]: Sep 09 05:42:54.855 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/ip: Attempt #1 Sep 9 05:42:54.857211 coreos-metadata[1532]: Sep 09 05:42:54.857 INFO Fetch successful Sep 9 05:42:54.857211 coreos-metadata[1532]: Sep 09 05:42:54.857 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/machine-type: Attempt #1 Sep 9 05:42:54.864167 jq[1558]: true Sep 9 05:42:54.870488 coreos-metadata[1532]: Sep 09 05:42:54.868 INFO Fetch successful Sep 9 05:42:54.869544 (ntainerd)[1568]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 05:42:54.891408 extend-filesystems[1537]: Resized partition /dev/sda9 Sep 9 05:42:54.913660 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM. Sep 9 05:42:54.914056 extend-filesystems[1582]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 05:42:54.966533 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 2538491 blocks Sep 9 05:42:54.966671 jq[1569]: true Sep 9 05:42:54.968416 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 9 05:42:54.980911 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:42:54.993247 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. 
Sep 9 05:42:54.997480 update_engine[1557]: I20250909 05:42:54.997253 1557 main.cc:92] Flatcar Update Engine starting Sep 9 05:42:55.005013 kernel: EXT4-fs (sda9): resized filesystem to 2538491 Sep 9 05:42:55.015209 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 05:42:55.025094 extend-filesystems[1582]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 9 05:42:55.025094 extend-filesystems[1582]: old_desc_blocks = 1, new_desc_blocks = 2 Sep 9 05:42:55.025094 extend-filesystems[1582]: The filesystem on /dev/sda9 is now 2538491 (4k) blocks long. Sep 9 05:42:55.069118 extend-filesystems[1537]: Resized filesystem in /dev/sda9 Sep 9 05:42:55.027375 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 05:42:55.028023 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 05:42:55.071195 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 9 05:42:55.106642 tar[1565]: linux-amd64/LICENSE Sep 9 05:42:55.107058 tar[1565]: linux-amd64/helm Sep 9 05:42:55.152762 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 9 05:42:55.292917 bash[1613]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:42:55.296590 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 05:42:55.297597 sshd_keygen[1581]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 05:42:55.306262 systemd[1]: Starting sshkeys.service... Sep 9 05:42:55.339751 dbus-daemon[1533]: [system] SELinux support is enabled Sep 9 05:42:55.340075 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Sep 9 05:42:55.356778 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 05:42:55.357813 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 9 05:42:55.358006 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 05:42:55.358029 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 05:42:55.397099 dbus-daemon[1533]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1468 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 9 05:42:55.411344 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 9 05:42:55.416680 update_engine[1557]: I20250909 05:42:55.415606 1557 update_check_scheduler.cc:74] Next update check in 11m43s Sep 9 05:42:55.412190 systemd[1]: Started update-engine.service - Update Engine. Sep 9 05:42:55.430934 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 05:42:55.488066 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 9 05:42:55.491191 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Sep 9 05:42:55.511073 ntpd[1545]: ntpd 4.2.8p17@1.4004-o Tue Sep 9 03:09:56 UTC 2025 (1): Starting Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: ntpd 4.2.8p17@1.4004-o Tue Sep 9 03:09:56 UTC 2025 (1): Starting Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: ---------------------------------------------------- Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: ntp-4 is maintained by Network Time Foundation, Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: corporation. Support and training for ntp-4 are Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: available at https://www.nwtime.org/support Sep 9 05:42:55.514318 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: ---------------------------------------------------- Sep 9 05:42:55.511116 ntpd[1545]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 9 05:42:55.511130 ntpd[1545]: ---------------------------------------------------- Sep 9 05:42:55.511143 ntpd[1545]: ntp-4 is maintained by Network Time Foundation, Sep 9 05:42:55.511156 ntpd[1545]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 9 05:42:55.511169 ntpd[1545]: corporation. Support and training for ntp-4 are Sep 9 05:42:55.511183 ntpd[1545]: available at https://www.nwtime.org/support Sep 9 05:42:55.511197 ntpd[1545]: ----------------------------------------------------
Sep 9 05:42:55.531677 ntpd[1545]: proto: precision = 0.116 usec (-23) Sep 9 05:42:55.533538 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: proto: precision = 0.116 usec (-23) Sep 9 05:42:55.538854 ntpd[1545]: basedate set to 2025-08-28 Sep 9 05:42:55.563913 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: basedate set to 2025-08-28 Sep 9 05:42:55.563913 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: gps base set to 2025-08-31 (week 2382) Sep 9 05:42:55.538886 ntpd[1545]: gps base set to 2025-08-31 (week 2382) Sep 9 05:42:55.574908 ntpd[1545]: Listen and drop on 0 v6wildcard [::]:123 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Listen and drop on 0 v6wildcard [::]:123 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Listen normally on 2 lo 127.0.0.1:123 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Listen normally on 3 eth0 10.128.0.19:123 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Listen normally on 4 lo [::1]:123 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: bind(21) AF_INET6 fe80::4001:aff:fe80:13%2#123 flags 0x11 failed: Cannot assign requested address Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: unable to create socket on eth0 (5) for fe80::4001:aff:fe80:13%2#123 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: failed to init interface for address fe80::4001:aff:fe80:13%2 Sep 9 05:42:55.576723 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: Listening on routing socket on fd #21 for interface updates Sep 9 05:42:55.575011 ntpd[1545]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 9 05:42:55.575263 ntpd[1545]: Listen normally on 2 lo 127.0.0.1:123 Sep 9 05:42:55.575320 ntpd[1545]: Listen normally on 3 eth0 10.128.0.19:123
Sep 9 05:42:55.575380 ntpd[1545]: Listen normally on 4 lo [::1]:123 Sep 9 05:42:55.575441 ntpd[1545]: bind(21) AF_INET6 fe80::4001:aff:fe80:13%2#123 flags 0x11 failed: Cannot assign requested address Sep 9 05:42:55.575472 ntpd[1545]: unable to create socket on eth0 (5) for fe80::4001:aff:fe80:13%2#123 Sep 9 05:42:55.575493 ntpd[1545]: failed to init interface for address fe80::4001:aff:fe80:13%2 Sep 9 05:42:55.575536 ntpd[1545]: Listening on routing socket on fd #21 for interface updates Sep 9 05:42:55.589694 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:42:55.609028 ntpd[1545]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 9 05:42:55.610769 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 9 05:42:55.610769 ntpd[1545]: 9 Sep 05:42:55 ntpd[1545]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 9 05:42:55.609248 ntpd[1545]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 9 05:42:55.616382 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/sshKeys: Attempt #1 Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetch failed with 404: resource not found Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/ssh-keys: Attempt #1 Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetch successful Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/block-project-ssh-keys: Attempt #1 Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetch failed with 404: resource not found Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/sshKeys: Attempt #1 Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetch failed with 404: resource not found Sep 9 05:42:55.617857 coreos-metadata[1633]: Sep 09 05:42:55.617 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/ssh-keys: Attempt #1 Sep 9 05:42:55.620713 coreos-metadata[1633]: Sep 09 05:42:55.620 INFO Fetch successful Sep 9 05:42:55.631241 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 9 05:42:55.635178 unknown[1633]: wrote ssh authorized keys file for user: core Sep 9 05:42:55.643925 systemd[1]: Started sshd@0-10.128.0.19:22-139.178.89.65:43530.service - OpenSSH per-connection server daemon (139.178.89.65:43530). Sep 9 05:42:55.697969 update-ssh-keys[1648]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:42:55.700616 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 9 05:42:55.718761 systemd[1]: Finished sshkeys.service. Sep 9 05:42:55.739275 systemd[1]: issuegen.service: Deactivated successfully. 
Sep 9 05:42:55.739859 locksmithd[1625]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 05:42:55.740700 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 05:42:55.754803 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 05:42:55.823838 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 05:42:55.838314 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 05:42:55.850079 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 05:42:55.859357 systemd[1]: Reached target getty.target - Login Prompts. Sep 9 05:42:55.906991 systemd-logind[1555]: Watching system buttons on /dev/input/event2 (Power Button) Sep 9 05:42:55.907053 systemd-logind[1555]: Watching system buttons on /dev/input/event3 (Sleep Button) Sep 9 05:42:55.907088 systemd-logind[1555]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 05:42:55.908779 systemd-logind[1555]: New seat seat0. Sep 9 05:42:55.913218 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 05:42:55.922917 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 9 05:42:55.929403 dbus-daemon[1533]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 9 05:42:55.931109 dbus-daemon[1533]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.7' (uid=0 pid=1624 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 9 05:42:55.943398 systemd[1]: Starting polkit.service - Authorization Manager... 
Sep 9 05:42:55.971722 containerd[1568]: time="2025-09-09T05:42:55Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 05:42:55.974502 containerd[1568]: time="2025-09-09T05:42:55.974456011Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 05:42:56.023809 containerd[1568]: time="2025-09-09T05:42:56.023751307Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="15.639µs" Sep 9 05:42:56.023809 containerd[1568]: time="2025-09-09T05:42:56.023803251Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 05:42:56.024008 containerd[1568]: time="2025-09-09T05:42:56.023838277Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 05:42:56.027060 containerd[1568]: time="2025-09-09T05:42:56.027010622Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 05:42:56.027169 containerd[1568]: time="2025-09-09T05:42:56.027079473Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 05:42:56.027169 containerd[1568]: time="2025-09-09T05:42:56.027124257Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:42:56.027269 containerd[1568]: time="2025-09-09T05:42:56.027218215Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:42:56.027269 containerd[1568]: time="2025-09-09T05:42:56.027236770Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 05:42:56.027587 containerd[1568]: time="2025-09-09T05:42:56.027545827Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:42:56.027587 containerd[1568]: time="2025-09-09T05:42:56.027584956Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:42:56.027711 containerd[1568]: time="2025-09-09T05:42:56.027606884Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:42:56.027711 containerd[1568]: time="2025-09-09T05:42:56.027623039Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 05:42:56.027808 containerd[1568]: time="2025-09-09T05:42:56.027760405Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 05:42:56.030780 containerd[1568]: time="2025-09-09T05:42:56.030740635Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:42:56.030867 containerd[1568]: time="2025-09-09T05:42:56.030835757Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:42:56.030867 containerd[1568]: time="2025-09-09T05:42:56.030857572Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 05:42:56.032018 containerd[1568]: time="2025-09-09T05:42:56.030932463Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 05:42:56.032439 containerd[1568]: time="2025-09-09T05:42:56.032403089Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 05:42:56.032678 containerd[1568]: time="2025-09-09T05:42:56.032609599Z" level=info msg="metadata content store policy set" policy=shared Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040033930Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040106517Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040141412Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040162775Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040182392Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040201027Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040222073Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040241379Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040259617Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040275814Z" level=info
msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040291058Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040313205Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040461743Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 05:42:56.040971 containerd[1568]: time="2025-09-09T05:42:56.040502263Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040531031Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040552949Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040579318Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040598562Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040616707Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040633778Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040653690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 
05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040691269Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040708360Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040799091Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040821796Z" level=info msg="Start snapshots syncer" Sep 9 05:42:56.041603 containerd[1568]: time="2025-09-09T05:42:56.040848902Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 05:42:56.051005 containerd[1568]: time="2025-09-09T05:42:56.048339344Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":fals
e,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 05:42:56.051005 containerd[1568]: time="2025-09-09T05:42:56.050017828Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050139156Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050296428Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050330293Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050348683Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050366135Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050387890Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 05:42:56.051294 containerd[1568]: 
time="2025-09-09T05:42:56.050405591Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050431207Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050466964Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050484618Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050501425Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050550429Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050574281Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:42:56.051294 containerd[1568]: time="2025-09-09T05:42:56.050591495Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050607292Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050622419Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050637999Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050664057Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050689866Z" level=info msg="runtime interface created" Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050700285Z" level=info msg="created NRI interface" Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050715865Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050745492Z" level=info msg="Connect containerd service" Sep 9 05:42:56.051890 containerd[1568]: time="2025-09-09T05:42:56.050787117Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 9 05:42:56.058974 containerd[1568]: time="2025-09-09T05:42:56.057510089Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 05:42:56.165639 polkitd[1661]: Started polkitd version 126 Sep 9 05:42:56.174968 polkitd[1661]: Loading rules from directory /etc/polkit-1/rules.d Sep 9 05:42:56.176510 polkitd[1661]: Loading rules from directory /run/polkit-1/rules.d Sep 9 05:42:56.177231 polkitd[1661]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 05:42:56.177855 polkitd[1661]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 9 05:42:56.177902 polkitd[1661]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 9 05:42:56.177986 polkitd[1661]: Loading rules from directory 
/usr/share/polkit-1/rules.d Sep 9 05:42:56.180099 polkitd[1661]: Finished loading, compiling and executing 2 rules Sep 9 05:42:56.180501 systemd[1]: Started polkit.service - Authorization Manager. Sep 9 05:42:56.182124 dbus-daemon[1533]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 9 05:42:56.182838 polkitd[1661]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 9 05:42:56.187424 systemd-networkd[1468]: eth0: Gained IPv6LL Sep 9 05:42:56.199979 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 05:42:56.214372 sshd[1647]: Accepted publickey for core from 139.178.89.65 port 43530 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:42:56.212724 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 05:42:56.225582 sshd-session[1647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:56.228082 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:42:56.241870 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 05:42:56.252191 systemd[1]: Starting oem-gce.service - GCE Linux Agent... Sep 9 05:42:56.279684 init.sh[1681]: + '[' -e /etc/default/instance_configs.cfg.template ']' Sep 9 05:42:56.279684 init.sh[1681]: + echo -e '[InstanceSetup]\nset_host_keys = false' Sep 9 05:42:56.282629 init.sh[1681]: + /usr/bin/google_instance_setup Sep 9 05:42:56.284438 systemd-hostnamed[1624]: Hostname set to (transient) Sep 9 05:42:56.287435 systemd-resolved[1379]: System hostname changed to 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023'. Sep 9 05:42:56.300718 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 05:42:56.312360 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 05:42:56.373088 systemd-logind[1555]: New session 1 of user core. 
Sep 9 05:42:56.393096 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 05:42:56.401006 containerd[1568]: time="2025-09-09T05:42:56.400613747Z" level=info msg="Start subscribing containerd event" Sep 9 05:42:56.401006 containerd[1568]: time="2025-09-09T05:42:56.400692310Z" level=info msg="Start recovering state" Sep 9 05:42:56.401006 containerd[1568]: time="2025-09-09T05:42:56.400826660Z" level=info msg="Start event monitor" Sep 9 05:42:56.401006 containerd[1568]: time="2025-09-09T05:42:56.400844646Z" level=info msg="Start cni network conf syncer for default" Sep 9 05:42:56.401006 containerd[1568]: time="2025-09-09T05:42:56.400859183Z" level=info msg="Start streaming server" Sep 9 05:42:56.401006 containerd[1568]: time="2025-09-09T05:42:56.400874061Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 05:42:56.402054 containerd[1568]: time="2025-09-09T05:42:56.401357964Z" level=info msg="runtime interface starting up..." Sep 9 05:42:56.402054 containerd[1568]: time="2025-09-09T05:42:56.401900897Z" level=info msg="starting plugins..." Sep 9 05:42:56.402054 containerd[1568]: time="2025-09-09T05:42:56.401973131Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 05:42:56.412494 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 05:42:56.416245 containerd[1568]: time="2025-09-09T05:42:56.414017414Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 05:42:56.416245 containerd[1568]: time="2025-09-09T05:42:56.414738867Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 05:42:56.416245 containerd[1568]: time="2025-09-09T05:42:56.414902257Z" level=info msg="containerd successfully booted in 0.443767s" Sep 9 05:42:56.422418 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 05:42:56.433022 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Sep 9 05:42:56.470070 (systemd)[1698]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 05:42:56.479680 systemd-logind[1555]: New session c1 of user core. Sep 9 05:42:56.860033 tar[1565]: linux-amd64/README.md Sep 9 05:42:56.886054 systemd[1698]: Queued start job for default target default.target. Sep 9 05:42:56.891297 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 9 05:42:56.893615 systemd[1698]: Created slice app.slice - User Application Slice. Sep 9 05:42:56.893872 systemd[1698]: Reached target paths.target - Paths. Sep 9 05:42:56.893984 systemd[1698]: Reached target timers.target - Timers. Sep 9 05:42:56.897078 systemd[1698]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 05:42:56.932124 systemd[1698]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 05:42:56.932303 systemd[1698]: Reached target sockets.target - Sockets. Sep 9 05:42:56.932371 systemd[1698]: Reached target basic.target - Basic System. Sep 9 05:42:56.932441 systemd[1698]: Reached target default.target - Main User Target. Sep 9 05:42:56.932493 systemd[1698]: Startup finished in 434ms. Sep 9 05:42:56.932670 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 05:42:56.954257 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 05:42:57.175805 instance-setup[1688]: INFO Running google_set_multiqueue. Sep 9 05:42:57.200524 systemd[1]: Started sshd@1-10.128.0.19:22-139.178.89.65:43534.service - OpenSSH per-connection server daemon (139.178.89.65:43534). Sep 9 05:42:57.215513 instance-setup[1688]: INFO Set channels for eth0 to 2. Sep 9 05:42:57.227264 instance-setup[1688]: INFO Setting /proc/irq/27/smp_affinity_list to 0 for device virtio1. Sep 9 05:42:57.227511 instance-setup[1688]: INFO /proc/irq/27/smp_affinity_list: real affinity 0 Sep 9 05:42:57.228289 instance-setup[1688]: INFO Setting /proc/irq/28/smp_affinity_list to 0 for device virtio1. 
Sep 9 05:42:57.234242 instance-setup[1688]: INFO /proc/irq/28/smp_affinity_list: real affinity 0 Sep 9 05:42:57.234970 instance-setup[1688]: INFO Setting /proc/irq/29/smp_affinity_list to 1 for device virtio1. Sep 9 05:42:57.240708 instance-setup[1688]: INFO /proc/irq/29/smp_affinity_list: real affinity 1 Sep 9 05:42:57.241652 instance-setup[1688]: INFO Setting /proc/irq/30/smp_affinity_list to 1 for device virtio1. Sep 9 05:42:57.246856 instance-setup[1688]: INFO /proc/irq/30/smp_affinity_list: real affinity 1 Sep 9 05:42:57.266798 instance-setup[1688]: INFO /usr/sbin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Sep 9 05:42:57.272030 instance-setup[1688]: INFO /usr/sbin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Sep 9 05:42:57.273856 instance-setup[1688]: INFO Queue 0 XPS=1 for /sys/class/net/eth0/queues/tx-0/xps_cpus Sep 9 05:42:57.273914 instance-setup[1688]: INFO Queue 1 XPS=2 for /sys/class/net/eth0/queues/tx-1/xps_cpus Sep 9 05:42:57.299072 init.sh[1681]: + /usr/bin/google_metadata_script_runner --script-type startup Sep 9 05:42:57.480916 startup-script[1746]: INFO Starting startup scripts. Sep 9 05:42:57.488842 startup-script[1746]: INFO No startup scripts found in metadata. Sep 9 05:42:57.488921 startup-script[1746]: INFO Finished running startup scripts. Sep 9 05:42:57.515901 init.sh[1681]: + trap 'stopping=1 ; kill "${daemon_pids[@]}" || :' SIGTERM Sep 9 05:42:57.515901 init.sh[1681]: + daemon_pids=() Sep 9 05:42:57.515901 init.sh[1681]: + for d in accounts clock_skew network Sep 9 05:42:57.515901 init.sh[1681]: + daemon_pids+=($!) Sep 9 05:42:57.516179 init.sh[1681]: + for d in accounts clock_skew network Sep 9 05:42:57.516223 init.sh[1681]: + daemon_pids+=($!) Sep 9 05:42:57.516279 init.sh[1681]: + for d in accounts clock_skew network Sep 9 05:42:57.517660 init.sh[1681]: + daemon_pids+=($!) 
Sep 9 05:42:57.517660 init.sh[1681]: + NOTIFY_SOCKET=/run/systemd/notify Sep 9 05:42:57.517660 init.sh[1681]: + /usr/bin/systemd-notify --ready Sep 9 05:42:57.517896 init.sh[1749]: + /usr/bin/google_accounts_daemon Sep 9 05:42:57.518385 init.sh[1750]: + /usr/bin/google_clock_skew_daemon Sep 9 05:42:57.518657 init.sh[1751]: + /usr/bin/google_network_daemon Sep 9 05:42:57.535173 systemd[1]: Started oem-gce.service - GCE Linux Agent. Sep 9 05:42:57.547367 init.sh[1681]: + wait -n 1749 1750 1751 Sep 9 05:42:57.550597 sshd[1728]: Accepted publickey for core from 139.178.89.65 port 43534 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:42:57.555960 sshd-session[1728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:57.575579 systemd-logind[1555]: New session 2 of user core. Sep 9 05:42:57.578432 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 05:42:57.789983 sshd[1753]: Connection closed by 139.178.89.65 port 43534 Sep 9 05:42:57.790683 sshd-session[1728]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:57.802866 systemd[1]: sshd@1-10.128.0.19:22-139.178.89.65:43534.service: Deactivated successfully. Sep 9 05:42:57.807761 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 05:42:57.811936 systemd-logind[1555]: Session 2 logged out. Waiting for processes to exit. Sep 9 05:42:57.815262 systemd-logind[1555]: Removed session 2. Sep 9 05:42:57.849924 systemd[1]: Started sshd@2-10.128.0.19:22-139.178.89.65:43546.service - OpenSSH per-connection server daemon (139.178.89.65:43546). Sep 9 05:42:58.006804 google-clock-skew[1750]: INFO Starting Google Clock Skew daemon. Sep 9 05:42:58.011644 google-networking[1751]: INFO Starting Google Networking daemon. Sep 9 05:42:58.015169 google-clock-skew[1750]: INFO Clock drift token has changed: 0. 
Sep 9 05:42:58.073729 groupadd[1771]: group added to /etc/group: name=google-sudoers, GID=1000 Sep 9 05:42:58.081420 groupadd[1771]: group added to /etc/gshadow: name=google-sudoers Sep 9 05:42:58.134606 groupadd[1771]: new group: name=google-sudoers, GID=1000 Sep 9 05:42:58.164981 google-accounts[1749]: INFO Starting Google Accounts daemon. Sep 9 05:42:58.178739 google-accounts[1749]: WARNING OS Login not installed. Sep 9 05:42:58.180782 google-accounts[1749]: INFO Creating a new user account for 0. Sep 9 05:42:58.185212 init.sh[1783]: useradd: invalid user name '0': use --badname to ignore Sep 9 05:42:58.185518 google-accounts[1749]: WARNING Could not create user 0. Command '['useradd', '-m', '-s', '/bin/bash', '-p', '*', '0']' returned non-zero exit status 3.. Sep 9 05:42:58.191233 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:42:58.201874 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 05:42:58.206643 (kubelet)[1786]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:42:58.210597 sshd[1764]: Accepted publickey for core from 139.178.89.65 port 43546 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:42:58.211694 systemd[1]: Startup finished in 3.683s (kernel) + 11.108s (initrd) + 9.325s (userspace) = 24.117s. Sep 9 05:42:58.213863 sshd-session[1764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:42:58.230025 systemd-logind[1555]: New session 3 of user core. Sep 9 05:42:58.235488 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 9 05:42:58.433344 sshd[1790]: Connection closed by 139.178.89.65 port 43546 Sep 9 05:42:58.435858 sshd-session[1764]: pam_unix(sshd:session): session closed for user core Sep 9 05:42:58.444770 systemd[1]: sshd@2-10.128.0.19:22-139.178.89.65:43546.service: Deactivated successfully. 
Sep 9 05:42:58.448307 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 05:42:58.450165 systemd-logind[1555]: Session 3 logged out. Waiting for processes to exit. Sep 9 05:42:58.453049 systemd-logind[1555]: Removed session 3. Sep 9 05:42:58.511667 ntpd[1545]: Listen normally on 6 eth0 [fe80::4001:aff:fe80:13%2]:123 Sep 9 05:42:58.512196 ntpd[1545]: 9 Sep 05:42:58 ntpd[1545]: Listen normally on 6 eth0 [fe80::4001:aff:fe80:13%2]:123 Sep 9 05:42:58.910613 kubelet[1786]: E0909 05:42:58.910536 1786 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:42:58.913643 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:42:58.913910 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:42:58.914515 systemd[1]: kubelet.service: Consumed 1.289s CPU time, 263.4M memory peak. Sep 9 05:42:59.000373 systemd-resolved[1379]: Clock change detected. Flushing caches. Sep 9 05:42:59.000649 google-clock-skew[1750]: INFO Synced system time with hardware clock. Sep 9 05:43:08.528922 systemd[1]: Started sshd@3-10.128.0.19:22-139.178.89.65:42394.service - OpenSSH per-connection server daemon (139.178.89.65:42394). Sep 9 05:43:08.839026 sshd[1804]: Accepted publickey for core from 139.178.89.65 port 42394 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:43:08.840795 sshd-session[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:43:08.846523 systemd-logind[1555]: New session 4 of user core. Sep 9 05:43:08.861408 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 05:43:09.015215 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Sep 9 05:43:09.017382 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:43:09.057209 sshd[1807]: Connection closed by 139.178.89.65 port 42394 Sep 9 05:43:09.057993 sshd-session[1804]: pam_unix(sshd:session): session closed for user core Sep 9 05:43:09.069696 systemd[1]: sshd@3-10.128.0.19:22-139.178.89.65:42394.service: Deactivated successfully. Sep 9 05:43:09.070912 systemd-logind[1555]: Session 4 logged out. Waiting for processes to exit. Sep 9 05:43:09.076597 systemd[1]: session-4.scope: Deactivated successfully. Sep 9 05:43:09.081787 systemd-logind[1555]: Removed session 4. Sep 9 05:43:09.114657 systemd[1]: Started sshd@4-10.128.0.19:22-139.178.89.65:42404.service - OpenSSH per-connection server daemon (139.178.89.65:42404). Sep 9 05:43:09.400898 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:43:09.415713 (kubelet)[1824]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:43:09.432684 sshd[1816]: Accepted publickey for core from 139.178.89.65 port 42404 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:43:09.435225 sshd-session[1816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:43:09.443248 systemd-logind[1555]: New session 5 of user core. Sep 9 05:43:09.449625 systemd[1]: Started session-5.scope - Session 5 of User core. 
Sep 9 05:43:09.485849 kubelet[1824]: E0909 05:43:09.485773 1824 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:43:09.490195 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:43:09.490441 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:43:09.490987 systemd[1]: kubelet.service: Consumed 216ms CPU time, 110.4M memory peak. Sep 9 05:43:09.645382 sshd[1830]: Connection closed by 139.178.89.65 port 42404 Sep 9 05:43:09.646308 sshd-session[1816]: pam_unix(sshd:session): session closed for user core Sep 9 05:43:09.652089 systemd[1]: sshd@4-10.128.0.19:22-139.178.89.65:42404.service: Deactivated successfully. Sep 9 05:43:09.654601 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 05:43:09.656256 systemd-logind[1555]: Session 5 logged out. Waiting for processes to exit. Sep 9 05:43:09.658216 systemd-logind[1555]: Removed session 5. Sep 9 05:43:09.702473 systemd[1]: Started sshd@5-10.128.0.19:22-139.178.89.65:42416.service - OpenSSH per-connection server daemon (139.178.89.65:42416). Sep 9 05:43:10.023929 sshd[1837]: Accepted publickey for core from 139.178.89.65 port 42416 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:43:10.025784 sshd-session[1837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:43:10.033162 systemd-logind[1555]: New session 6 of user core. Sep 9 05:43:10.039399 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 9 05:43:10.240331 sshd[1840]: Connection closed by 139.178.89.65 port 42416 Sep 9 05:43:10.241224 sshd-session[1837]: pam_unix(sshd:session): session closed for user core Sep 9 05:43:10.246990 systemd[1]: sshd@5-10.128.0.19:22-139.178.89.65:42416.service: Deactivated successfully. Sep 9 05:43:10.249380 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 05:43:10.250807 systemd-logind[1555]: Session 6 logged out. Waiting for processes to exit. Sep 9 05:43:10.252635 systemd-logind[1555]: Removed session 6. Sep 9 05:43:10.302533 systemd[1]: Started sshd@6-10.128.0.19:22-139.178.89.65:60874.service - OpenSSH per-connection server daemon (139.178.89.65:60874). Sep 9 05:43:10.626511 sshd[1846]: Accepted publickey for core from 139.178.89.65 port 60874 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:43:10.628306 sshd-session[1846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:43:10.635700 systemd-logind[1555]: New session 7 of user core. Sep 9 05:43:10.642393 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 05:43:10.821163 sudo[1850]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 05:43:10.821674 sudo[1850]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:43:10.837634 sudo[1850]: pam_unix(sudo:session): session closed for user root Sep 9 05:43:10.881356 sshd[1849]: Connection closed by 139.178.89.65 port 60874 Sep 9 05:43:10.882902 sshd-session[1846]: pam_unix(sshd:session): session closed for user core Sep 9 05:43:10.888891 systemd[1]: sshd@6-10.128.0.19:22-139.178.89.65:60874.service: Deactivated successfully. Sep 9 05:43:10.891120 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 05:43:10.892388 systemd-logind[1555]: Session 7 logged out. Waiting for processes to exit. Sep 9 05:43:10.894507 systemd-logind[1555]: Removed session 7. 
Sep 9 05:43:10.934514 systemd[1]: Started sshd@7-10.128.0.19:22-139.178.89.65:60886.service - OpenSSH per-connection server daemon (139.178.89.65:60886).
Sep 9 05:43:11.256043 sshd[1856]: Accepted publickey for core from 139.178.89.65 port 60886 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:43:11.257442 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:43:11.265034 systemd-logind[1555]: New session 8 of user core.
Sep 9 05:43:11.270412 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 9 05:43:11.435565 sudo[1861]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 05:43:11.436076 sudo[1861]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 05:43:11.442751 sudo[1861]: pam_unix(sudo:session): session closed for user root
Sep 9 05:43:11.456227 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 05:43:11.456696 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 05:43:11.469618 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 05:43:11.527475 augenrules[1883]: No rules
Sep 9 05:43:11.529641 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 05:43:11.530009 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 05:43:11.531360 sudo[1860]: pam_unix(sudo:session): session closed for user root
Sep 9 05:43:11.574775 sshd[1859]: Connection closed by 139.178.89.65 port 60886
Sep 9 05:43:11.575598 sshd-session[1856]: pam_unix(sshd:session): session closed for user core
Sep 9 05:43:11.581621 systemd[1]: sshd@7-10.128.0.19:22-139.178.89.65:60886.service: Deactivated successfully.
Sep 9 05:43:11.583932 systemd[1]: session-8.scope: Deactivated successfully.
Sep 9 05:43:11.585145 systemd-logind[1555]: Session 8 logged out. Waiting for processes to exit.
Sep 9 05:43:11.587392 systemd-logind[1555]: Removed session 8.
Sep 9 05:43:11.631405 systemd[1]: Started sshd@8-10.128.0.19:22-139.178.89.65:60888.service - OpenSSH per-connection server daemon (139.178.89.65:60888).
Sep 9 05:43:11.940283 sshd[1892]: Accepted publickey for core from 139.178.89.65 port 60888 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:43:11.942007 sshd-session[1892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:43:11.948080 systemd-logind[1555]: New session 9 of user core.
Sep 9 05:43:11.955374 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 9 05:43:12.121308 sudo[1896]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 05:43:12.121806 sudo[1896]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 05:43:12.588631 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 05:43:12.605804 (dockerd)[1915]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 05:43:12.936240 dockerd[1915]: time="2025-09-09T05:43:12.935914708Z" level=info msg="Starting up"
Sep 9 05:43:12.938017 dockerd[1915]: time="2025-09-09T05:43:12.937973482Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 05:43:12.953347 dockerd[1915]: time="2025-09-09T05:43:12.953272421Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 05:43:13.006702 dockerd[1915]: time="2025-09-09T05:43:13.006414576Z" level=info msg="Loading containers: start."
Sep 9 05:43:13.025231 kernel: Initializing XFRM netlink socket
Sep 9 05:43:13.372715 systemd-networkd[1468]: docker0: Link UP
Sep 9 05:43:13.377811 dockerd[1915]: time="2025-09-09T05:43:13.377755448Z" level=info msg="Loading containers: done."
Sep 9 05:43:13.397665 dockerd[1915]: time="2025-09-09T05:43:13.397550521Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 05:43:13.397665 dockerd[1915]: time="2025-09-09T05:43:13.397658420Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 05:43:13.397931 dockerd[1915]: time="2025-09-09T05:43:13.397765254Z" level=info msg="Initializing buildkit"
Sep 9 05:43:13.427465 dockerd[1915]: time="2025-09-09T05:43:13.427399491Z" level=info msg="Completed buildkit initialization"
Sep 9 05:43:13.438667 dockerd[1915]: time="2025-09-09T05:43:13.438589375Z" level=info msg="Daemon has completed initialization"
Sep 9 05:43:13.438823 dockerd[1915]: time="2025-09-09T05:43:13.438784703Z" level=info msg="API listen on /run/docker.sock"
Sep 9 05:43:13.439320 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 05:43:14.308537 containerd[1568]: time="2025-09-09T05:43:14.308466057Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 9 05:43:14.800901 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1050639881.mount: Deactivated successfully.
Sep 9 05:43:16.418768 containerd[1568]: time="2025-09-09T05:43:16.418697811Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:16.420219 containerd[1568]: time="2025-09-09T05:43:16.420020377Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28807315"
Sep 9 05:43:16.421223 containerd[1568]: time="2025-09-09T05:43:16.421167104Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:16.424402 containerd[1568]: time="2025-09-09T05:43:16.424360093Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:16.426396 containerd[1568]: time="2025-09-09T05:43:16.425591740Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 2.117078117s"
Sep 9 05:43:16.426396 containerd[1568]: time="2025-09-09T05:43:16.425658787Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\""
Sep 9 05:43:16.426825 containerd[1568]: time="2025-09-09T05:43:16.426778370Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 9 05:43:17.861853 containerd[1568]: time="2025-09-09T05:43:17.861784537Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:17.863345 containerd[1568]: time="2025-09-09T05:43:17.863236750Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24786062"
Sep 9 05:43:17.864463 containerd[1568]: time="2025-09-09T05:43:17.864420886Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:17.867715 containerd[1568]: time="2025-09-09T05:43:17.867651080Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:17.869461 containerd[1568]: time="2025-09-09T05:43:17.868872586Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 1.442049196s"
Sep 9 05:43:17.869461 containerd[1568]: time="2025-09-09T05:43:17.868920661Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\""
Sep 9 05:43:17.869892 containerd[1568]: time="2025-09-09T05:43:17.869861836Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 9 05:43:19.035736 containerd[1568]: time="2025-09-09T05:43:19.035666734Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:19.037382 containerd[1568]: time="2025-09-09T05:43:19.037064655Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19176952"
Sep 9 05:43:19.038707 containerd[1568]: time="2025-09-09T05:43:19.038667164Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:19.041911 containerd[1568]: time="2025-09-09T05:43:19.041850973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:19.043276 containerd[1568]: time="2025-09-09T05:43:19.043221241Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 1.173316131s"
Sep 9 05:43:19.043438 containerd[1568]: time="2025-09-09T05:43:19.043412517Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\""
Sep 9 05:43:19.044313 containerd[1568]: time="2025-09-09T05:43:19.044170371Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 9 05:43:19.565970 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 05:43:19.572287 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:43:20.098978 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:43:20.111848 (kubelet)[2200]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 05:43:20.204712 kubelet[2200]: E0909 05:43:20.204605 2200 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 05:43:20.207213 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 05:43:20.207450 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 05:43:20.208265 systemd[1]: kubelet.service: Consumed 252ms CPU time, 108.6M memory peak.
Sep 9 05:43:20.566787 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3850421767.mount: Deactivated successfully.
Sep 9 05:43:21.269210 containerd[1568]: time="2025-09-09T05:43:21.269127221Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:21.270593 containerd[1568]: time="2025-09-09T05:43:21.270351382Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30899065"
Sep 9 05:43:21.271607 containerd[1568]: time="2025-09-09T05:43:21.271564760Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:21.275217 containerd[1568]: time="2025-09-09T05:43:21.274283085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:21.275217 containerd[1568]: time="2025-09-09T05:43:21.274979528Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 2.230749625s"
Sep 9 05:43:21.275217 containerd[1568]: time="2025-09-09T05:43:21.275021421Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\""
Sep 9 05:43:21.275901 containerd[1568]: time="2025-09-09T05:43:21.275849210Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 05:43:21.673525 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3858919560.mount: Deactivated successfully.
Sep 9 05:43:22.822324 containerd[1568]: time="2025-09-09T05:43:22.822251840Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:22.823915 containerd[1568]: time="2025-09-09T05:43:22.823865030Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18571883"
Sep 9 05:43:22.825780 containerd[1568]: time="2025-09-09T05:43:22.825172026Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:22.829802 containerd[1568]: time="2025-09-09T05:43:22.829756061Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:22.831347 containerd[1568]: time="2025-09-09T05:43:22.831300750Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.555274993s"
Sep 9 05:43:22.831474 containerd[1568]: time="2025-09-09T05:43:22.831352385Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 9 05:43:22.832167 containerd[1568]: time="2025-09-09T05:43:22.832016453Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 05:43:23.209944 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1321097557.mount: Deactivated successfully.
Sep 9 05:43:23.217604 containerd[1568]: time="2025-09-09T05:43:23.217529703Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 05:43:23.218712 containerd[1568]: time="2025-09-09T05:43:23.218661832Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=322072"
Sep 9 05:43:23.220213 containerd[1568]: time="2025-09-09T05:43:23.219802024Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 05:43:23.223486 containerd[1568]: time="2025-09-09T05:43:23.222334107Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 05:43:23.223486 containerd[1568]: time="2025-09-09T05:43:23.223335140Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 391.274975ms"
Sep 9 05:43:23.223486 containerd[1568]: time="2025-09-09T05:43:23.223374226Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 9 05:43:23.224433 containerd[1568]: time="2025-09-09T05:43:23.224350584Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 9 05:43:23.680205 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3756240662.mount: Deactivated successfully.
Sep 9 05:43:25.941372 containerd[1568]: time="2025-09-09T05:43:25.941294243Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:25.942941 containerd[1568]: time="2025-09-09T05:43:25.942844833Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57689565"
Sep 9 05:43:25.943992 containerd[1568]: time="2025-09-09T05:43:25.943947505Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:25.947598 containerd[1568]: time="2025-09-09T05:43:25.947534249Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:25.949194 containerd[1568]: time="2025-09-09T05:43:25.948929990Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.724524166s"
Sep 9 05:43:25.949194 containerd[1568]: time="2025-09-09T05:43:25.948977741Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
Sep 9 05:43:26.354920 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 9 05:43:29.422342 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:43:29.422654 systemd[1]: kubelet.service: Consumed 252ms CPU time, 108.6M memory peak.
Sep 9 05:43:29.426151 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:43:29.465743 systemd[1]: Reload requested from client PID 2352 ('systemctl') (unit session-9.scope)...
Sep 9 05:43:29.465773 systemd[1]: Reloading...
Sep 9 05:43:29.654218 zram_generator::config[2396]: No configuration found.
Sep 9 05:43:29.961911 systemd[1]: Reloading finished in 495 ms.
Sep 9 05:43:30.050645 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:43:30.053012 systemd[1]: kubelet.service: Deactivated successfully.
Sep 9 05:43:30.053352 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:43:30.053434 systemd[1]: kubelet.service: Consumed 173ms CPU time, 98.3M memory peak.
Sep 9 05:43:30.055964 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 05:43:30.780222 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 05:43:30.793695 (kubelet)[2449]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 05:43:30.855399 kubelet[2449]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 05:43:30.855399 kubelet[2449]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 05:43:30.855399 kubelet[2449]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 05:43:30.857201 kubelet[2449]: I0909 05:43:30.856104 2449 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 05:43:31.333210 kubelet[2449]: I0909 05:43:31.332978 2449 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 05:43:31.333210 kubelet[2449]: I0909 05:43:31.333019 2449 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 05:43:31.333824 kubelet[2449]: I0909 05:43:31.333793 2449 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 05:43:31.383425 kubelet[2449]: E0909 05:43:31.383367 2449 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.128.0.19:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:43:31.387215 kubelet[2449]: I0909 05:43:31.387053 2449 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 05:43:31.398533 kubelet[2449]: I0909 05:43:31.398484 2449 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 05:43:31.402101 kubelet[2449]: I0909 05:43:31.402068 2449 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 05:43:31.403787 kubelet[2449]: I0909 05:43:31.403713 2449 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 05:43:31.404069 kubelet[2449]: I0909 05:43:31.403767 2449 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 05:43:31.404307 kubelet[2449]: I0909 05:43:31.404072 2449 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 05:43:31.404307 kubelet[2449]: I0909 05:43:31.404093 2449 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 05:43:31.404307 kubelet[2449]: I0909 05:43:31.404273 2449 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 05:43:31.408890 kubelet[2449]: I0909 05:43:31.408862 2449 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 05:43:31.409017 kubelet[2449]: I0909 05:43:31.408909 2449 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 05:43:31.409017 kubelet[2449]: I0909 05:43:31.408944 2449 kubelet.go:352] "Adding apiserver pod source"
Sep 9 05:43:31.409017 kubelet[2449]: I0909 05:43:31.408960 2449 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 05:43:31.420143 kubelet[2449]: W0909 05:43:31.419089 2449 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.128.0.19:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused
Sep 9 05:43:31.420143 kubelet[2449]: E0909 05:43:31.419204 2449 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.128.0.19:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:43:31.420143 kubelet[2449]: I0909 05:43:31.419342 2449 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 05:43:31.420143 kubelet[2449]: I0909 05:43:31.420040 2449 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 05:43:31.422004 kubelet[2449]: W0909 05:43:31.420978 2449 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 05:43:31.423597 kubelet[2449]: W0909 05:43:31.423536 2449 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.128.0.19:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023&limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused
Sep 9 05:43:31.423700 kubelet[2449]: E0909 05:43:31.423611 2449 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.128.0.19:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023&limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError"
Sep 9 05:43:31.427466 kubelet[2449]: I0909 05:43:31.425849 2449 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 05:43:31.427466 kubelet[2449]: I0909 05:43:31.425906 2449 server.go:1287] "Started kubelet"
Sep 9 05:43:31.430321 kubelet[2449]: I0909 05:43:31.430266 2449 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 05:43:31.431964 kubelet[2449]: I0909 05:43:31.431918 2449 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 05:43:31.439107 kubelet[2449]: I0909 05:43:31.437784 2449 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 05:43:31.439107 kubelet[2449]: I0909 05:43:31.438145 2449 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 05:43:31.439107 kubelet[2449]: I0909 05:43:31.438517 2449 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 05:43:31.441451 kubelet[2449]: E0909 05:43:31.438926 2449 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.128.0.19:6443/api/v1/namespaces/default/events\": dial tcp 10.128.0.19:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023.186386e83183ca97 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,UID:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,},FirstTimestamp:2025-09-09 05:43:31.425880727 +0000 UTC m=+0.627017051,LastTimestamp:2025-09-09 05:43:31.425880727 +0000 UTC m=+0.627017051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,}"
Sep 9 05:43:31.445327 kubelet[2449]: I0909 05:43:31.445294 2449 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 05:43:31.448037 kubelet[2449]: I0909 05:43:31.448010 2449 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 05:43:31.448334 kubelet[2449]: E0909 05:43:31.448308 2449 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found"
Sep 9 05:43:31.449389 kubelet[2449]: E0909 05:43:31.448788 2449 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.19:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023?timeout=10s\": dial tcp 10.128.0.19:6443: connect: connection refused" interval="200ms"
Sep 9 05:43:31.450447 kubelet[2449]: I0909 05:43:31.450420 2449 factory.go:221] Registration of the systemd container factory successfully
Sep 9 05:43:31.450722 kubelet[2449]: I0909 05:43:31.450696 2449 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 05:43:31.451069 kubelet[2449]: E0909 05:43:31.451045 2449 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 05:43:31.453114 kubelet[2449]: I0909 05:43:31.453092 2449 factory.go:221] Registration of the containerd container factory successfully
Sep 9 05:43:31.454255 kubelet[2449]: I0909 05:43:31.453758 2449 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 05:43:31.454255 kubelet[2449]: I0909 05:43:31.453830 2449 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 05:43:31.467802 kubelet[2449]: I0909 05:43:31.467751 2449 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 05:43:31.469839 kubelet[2449]: I0909 05:43:31.469803 2449 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 05:43:31.469839 kubelet[2449]: I0909 05:43:31.469840 2449 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 05:43:31.469991 kubelet[2449]: I0909 05:43:31.469863 2449 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 05:43:31.469991 kubelet[2449]: I0909 05:43:31.469881 2449 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 05:43:31.469991 kubelet[2449]: E0909 05:43:31.469945 2449 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:43:31.478385 kubelet[2449]: W0909 05:43:31.478311 2449 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.128.0.19:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 9 05:43:31.478515 kubelet[2449]: E0909 05:43:31.478408 2449 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.128.0.19:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:43:31.479388 kubelet[2449]: W0909 05:43:31.479326 2449 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 9 05:43:31.479489 kubelet[2449]: E0909 05:43:31.479400 2449 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:43:31.492484 kubelet[2449]: I0909 05:43:31.492438 2449 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 05:43:31.492484 kubelet[2449]: I0909 05:43:31.492459 2449 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 05:43:31.492484 
kubelet[2449]: I0909 05:43:31.492483 2449 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:43:31.495100 kubelet[2449]: I0909 05:43:31.495033 2449 policy_none.go:49] "None policy: Start" Sep 9 05:43:31.495100 kubelet[2449]: I0909 05:43:31.495074 2449 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 05:43:31.495100 kubelet[2449]: I0909 05:43:31.495096 2449 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:43:31.502907 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 05:43:31.514907 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 05:43:31.520456 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 05:43:31.529907 kubelet[2449]: I0909 05:43:31.529700 2449 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:43:31.531127 kubelet[2449]: I0909 05:43:31.530418 2449 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:43:31.531127 kubelet[2449]: I0909 05:43:31.530440 2449 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:43:31.532030 kubelet[2449]: I0909 05:43:31.531845 2449 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:43:31.533528 kubelet[2449]: E0909 05:43:31.533497 2449 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 05:43:31.533610 kubelet[2449]: E0909 05:43:31.533556 2449 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" Sep 9 05:43:31.589090 systemd[1]: Created slice kubepods-burstable-pode851d5bdc6cd025dcacd571b717747d1.slice - libcontainer container kubepods-burstable-pode851d5bdc6cd025dcacd571b717747d1.slice. Sep 9 05:43:31.612681 kubelet[2449]: E0909 05:43:31.612363 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.619204 systemd[1]: Created slice kubepods-burstable-pod453e81ce9df947d678a3c1d726018cdb.slice - libcontainer container kubepods-burstable-pod453e81ce9df947d678a3c1d726018cdb.slice. Sep 9 05:43:31.622517 kubelet[2449]: E0909 05:43:31.622483 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.626163 systemd[1]: Created slice kubepods-burstable-pod3a0566bf84ec05729636b44ed0c7671d.slice - libcontainer container kubepods-burstable-pod3a0566bf84ec05729636b44ed0c7671d.slice. 
Sep 9 05:43:31.629012 kubelet[2449]: E0909 05:43:31.628963 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.636035 kubelet[2449]: I0909 05:43:31.635710 2449 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.636282 kubelet[2449]: E0909 05:43:31.636235 2449 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.128.0.19:6443/api/v1/nodes\": dial tcp 10.128.0.19:6443: connect: connection refused" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.651031 kubelet[2449]: E0909 05:43:31.650986 2449 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.19:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023?timeout=10s\": dial tcp 10.128.0.19:6443: connect: connection refused" interval="400ms" Sep 9 05:43:31.655355 kubelet[2449]: I0909 05:43:31.655291 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e851d5bdc6cd025dcacd571b717747d1-kubeconfig\") pod \"kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"e851d5bdc6cd025dcacd571b717747d1\") " pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655355 kubelet[2449]: I0909 05:43:31.655347 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-k8s-certs\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: 
\"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655533 kubelet[2449]: I0909 05:43:31.655376 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-ca-certs\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655533 kubelet[2449]: I0909 05:43:31.655403 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-flexvolume-dir\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655533 kubelet[2449]: I0909 05:43:31.655434 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-kubeconfig\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655533 kubelet[2449]: I0909 05:43:31.655466 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: 
\"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655720 kubelet[2449]: I0909 05:43:31.655496 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/453e81ce9df947d678a3c1d726018cdb-ca-certs\") pod \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"453e81ce9df947d678a3c1d726018cdb\") " pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655720 kubelet[2449]: I0909 05:43:31.655524 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/453e81ce9df947d678a3c1d726018cdb-k8s-certs\") pod \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"453e81ce9df947d678a3c1d726018cdb\") " pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.655720 kubelet[2449]: I0909 05:43:31.655554 2449 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/453e81ce9df947d678a3c1d726018cdb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"453e81ce9df947d678a3c1d726018cdb\") " pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.841146 kubelet[2449]: I0909 05:43:31.840899 2449 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.841546 kubelet[2449]: E0909 05:43:31.841505 2449 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.128.0.19:6443/api/v1/nodes\": dial tcp 10.128.0.19:6443: connect: connection 
refused" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:31.914779 containerd[1568]: time="2025-09-09T05:43:31.914720208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,Uid:e851d5bdc6cd025dcacd571b717747d1,Namespace:kube-system,Attempt:0,}" Sep 9 05:43:31.924569 containerd[1568]: time="2025-09-09T05:43:31.924511459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,Uid:453e81ce9df947d678a3c1d726018cdb,Namespace:kube-system,Attempt:0,}" Sep 9 05:43:31.930687 containerd[1568]: time="2025-09-09T05:43:31.930631317Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,Uid:3a0566bf84ec05729636b44ed0c7671d,Namespace:kube-system,Attempt:0,}" Sep 9 05:43:31.953942 containerd[1568]: time="2025-09-09T05:43:31.953888412Z" level=info msg="connecting to shim ea7488201fbca7cb400bcee17f88c3f8af4048c5774bf2424fe7b69cf7529054" address="unix:///run/containerd/s/69f5fff24a48b29233ec13b0a0b30c107069bbb2e7d74cdbcea3ddd72ce5aa40" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:43:31.981218 containerd[1568]: time="2025-09-09T05:43:31.980357602Z" level=info msg="connecting to shim 0fc95de8ace749cb65f50be2399c7651592fcf439ecb9d47e42c297861ae739a" address="unix:///run/containerd/s/222b6c35e8bf9adb43295d82935328fd776c4825d748347d9170ffaef46b7a1f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:43:32.013866 containerd[1568]: time="2025-09-09T05:43:32.013598204Z" level=info msg="connecting to shim bccc5013c44c327c03a124d9cb74d726d7da980f668b431d28ccecbc7d549cf3" address="unix:///run/containerd/s/d08dab45307e336173ecc3c603817dedbfe55a1298f385d82f616b9f7567091b" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:43:32.043463 systemd[1]: Started 
cri-containerd-0fc95de8ace749cb65f50be2399c7651592fcf439ecb9d47e42c297861ae739a.scope - libcontainer container 0fc95de8ace749cb65f50be2399c7651592fcf439ecb9d47e42c297861ae739a. Sep 9 05:43:32.054662 systemd[1]: Started cri-containerd-ea7488201fbca7cb400bcee17f88c3f8af4048c5774bf2424fe7b69cf7529054.scope - libcontainer container ea7488201fbca7cb400bcee17f88c3f8af4048c5774bf2424fe7b69cf7529054. Sep 9 05:43:32.055470 kubelet[2449]: E0909 05:43:32.054659 2449 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.19:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023?timeout=10s\": dial tcp 10.128.0.19:6443: connect: connection refused" interval="800ms" Sep 9 05:43:32.110772 systemd[1]: Started cri-containerd-bccc5013c44c327c03a124d9cb74d726d7da980f668b431d28ccecbc7d549cf3.scope - libcontainer container bccc5013c44c327c03a124d9cb74d726d7da980f668b431d28ccecbc7d549cf3. Sep 9 05:43:32.162814 containerd[1568]: time="2025-09-09T05:43:32.162761506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,Uid:e851d5bdc6cd025dcacd571b717747d1,Namespace:kube-system,Attempt:0,} returns sandbox id \"ea7488201fbca7cb400bcee17f88c3f8af4048c5774bf2424fe7b69cf7529054\"" Sep 9 05:43:32.170631 kubelet[2449]: E0909 05:43:32.170584 2449 kubelet_pods.go:555] "Hostname for pod was too long, truncated it" podName="kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" hostnameMaxLen=63 truncatedHostname="kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21" Sep 9 05:43:32.175619 containerd[1568]: time="2025-09-09T05:43:32.175064524Z" level=info msg="CreateContainer within sandbox \"ea7488201fbca7cb400bcee17f88c3f8af4048c5774bf2424fe7b69cf7529054\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 05:43:32.188256 containerd[1568]: 
time="2025-09-09T05:43:32.187948574Z" level=info msg="Container ae3bddc38c7890c71321f6540040aa422241c8cf0869c370d400edd17bc081c1: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:43:32.209231 containerd[1568]: time="2025-09-09T05:43:32.209158416Z" level=info msg="CreateContainer within sandbox \"ea7488201fbca7cb400bcee17f88c3f8af4048c5774bf2424fe7b69cf7529054\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ae3bddc38c7890c71321f6540040aa422241c8cf0869c370d400edd17bc081c1\"" Sep 9 05:43:32.211816 containerd[1568]: time="2025-09-09T05:43:32.211765088Z" level=info msg="StartContainer for \"ae3bddc38c7890c71321f6540040aa422241c8cf0869c370d400edd17bc081c1\"" Sep 9 05:43:32.217691 containerd[1568]: time="2025-09-09T05:43:32.217644144Z" level=info msg="connecting to shim ae3bddc38c7890c71321f6540040aa422241c8cf0869c370d400edd17bc081c1" address="unix:///run/containerd/s/69f5fff24a48b29233ec13b0a0b30c107069bbb2e7d74cdbcea3ddd72ce5aa40" protocol=ttrpc version=3 Sep 9 05:43:32.234768 containerd[1568]: time="2025-09-09T05:43:32.234707842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,Uid:3a0566bf84ec05729636b44ed0c7671d,Namespace:kube-system,Attempt:0,} returns sandbox id \"0fc95de8ace749cb65f50be2399c7651592fcf439ecb9d47e42c297861ae739a\"" Sep 9 05:43:32.237796 kubelet[2449]: E0909 05:43:32.237702 2449 kubelet_pods.go:555] "Hostname for pod was too long, truncated it" podName="kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" hostnameMaxLen=63 truncatedHostname="kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7" Sep 9 05:43:32.242267 containerd[1568]: time="2025-09-09T05:43:32.242087611Z" level=info msg="CreateContainer within sandbox \"0fc95de8ace749cb65f50be2399c7651592fcf439ecb9d47e42c297861ae739a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 05:43:32.243710 
containerd[1568]: time="2025-09-09T05:43:32.243649275Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023,Uid:453e81ce9df947d678a3c1d726018cdb,Namespace:kube-system,Attempt:0,} returns sandbox id \"bccc5013c44c327c03a124d9cb74d726d7da980f668b431d28ccecbc7d549cf3\"" Sep 9 05:43:32.250388 kubelet[2449]: E0909 05:43:32.250349 2449 kubelet_pods.go:555] "Hostname for pod was too long, truncated it" podName="kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" hostnameMaxLen=63 truncatedHostname="kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21" Sep 9 05:43:32.250388 kubelet[2449]: I0909 05:43:32.250457 2449 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:32.252562 kubelet[2449]: E0909 05:43:32.252486 2449 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.128.0.19:6443/api/v1/nodes\": dial tcp 10.128.0.19:6443: connect: connection refused" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:32.253363 containerd[1568]: time="2025-09-09T05:43:32.253324379Z" level=info msg="CreateContainer within sandbox \"bccc5013c44c327c03a124d9cb74d726d7da980f668b431d28ccecbc7d549cf3\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 05:43:32.270426 systemd[1]: Started cri-containerd-ae3bddc38c7890c71321f6540040aa422241c8cf0869c370d400edd17bc081c1.scope - libcontainer container ae3bddc38c7890c71321f6540040aa422241c8cf0869c370d400edd17bc081c1. 
Sep 9 05:43:32.272944 containerd[1568]: time="2025-09-09T05:43:32.272752224Z" level=info msg="Container 1f4ee058d4299fe991e6c7902fc426ed95162c805c585b74ff1cd0e00a925276: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:43:32.275983 containerd[1568]: time="2025-09-09T05:43:32.275913471Z" level=info msg="Container d1a6607a3c532271186634e5d6ad62dcfd2c2ad7ce7da161945bfd32819f3ef3: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:43:32.283138 kubelet[2449]: W0909 05:43:32.283017 2449 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 9 05:43:32.283514 kubelet[2449]: E0909 05:43:32.283257 2449 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:43:32.293894 containerd[1568]: time="2025-09-09T05:43:32.293739114Z" level=info msg="CreateContainer within sandbox \"bccc5013c44c327c03a124d9cb74d726d7da980f668b431d28ccecbc7d549cf3\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d1a6607a3c532271186634e5d6ad62dcfd2c2ad7ce7da161945bfd32819f3ef3\"" Sep 9 05:43:32.294880 containerd[1568]: time="2025-09-09T05:43:32.294746004Z" level=info msg="CreateContainer within sandbox \"0fc95de8ace749cb65f50be2399c7651592fcf439ecb9d47e42c297861ae739a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1f4ee058d4299fe991e6c7902fc426ed95162c805c585b74ff1cd0e00a925276\"" Sep 9 05:43:32.296206 containerd[1568]: time="2025-09-09T05:43:32.296091957Z" level=info msg="StartContainer for \"1f4ee058d4299fe991e6c7902fc426ed95162c805c585b74ff1cd0e00a925276\"" 
Sep 9 05:43:32.297848 containerd[1568]: time="2025-09-09T05:43:32.297809998Z" level=info msg="connecting to shim 1f4ee058d4299fe991e6c7902fc426ed95162c805c585b74ff1cd0e00a925276" address="unix:///run/containerd/s/222b6c35e8bf9adb43295d82935328fd776c4825d748347d9170ffaef46b7a1f" protocol=ttrpc version=3 Sep 9 05:43:32.298788 containerd[1568]: time="2025-09-09T05:43:32.298245552Z" level=info msg="StartContainer for \"d1a6607a3c532271186634e5d6ad62dcfd2c2ad7ce7da161945bfd32819f3ef3\"" Sep 9 05:43:32.302457 containerd[1568]: time="2025-09-09T05:43:32.302406879Z" level=info msg="connecting to shim d1a6607a3c532271186634e5d6ad62dcfd2c2ad7ce7da161945bfd32819f3ef3" address="unix:///run/containerd/s/d08dab45307e336173ecc3c603817dedbfe55a1298f385d82f616b9f7567091b" protocol=ttrpc version=3 Sep 9 05:43:32.336393 systemd[1]: Started cri-containerd-d1a6607a3c532271186634e5d6ad62dcfd2c2ad7ce7da161945bfd32819f3ef3.scope - libcontainer container d1a6607a3c532271186634e5d6ad62dcfd2c2ad7ce7da161945bfd32819f3ef3. Sep 9 05:43:32.346675 systemd[1]: Started cri-containerd-1f4ee058d4299fe991e6c7902fc426ed95162c805c585b74ff1cd0e00a925276.scope - libcontainer container 1f4ee058d4299fe991e6c7902fc426ed95162c805c585b74ff1cd0e00a925276. 
Sep 9 05:43:32.413648 containerd[1568]: time="2025-09-09T05:43:32.412728256Z" level=info msg="StartContainer for \"ae3bddc38c7890c71321f6540040aa422241c8cf0869c370d400edd17bc081c1\" returns successfully" Sep 9 05:43:32.501349 containerd[1568]: time="2025-09-09T05:43:32.500932853Z" level=info msg="StartContainer for \"1f4ee058d4299fe991e6c7902fc426ed95162c805c585b74ff1cd0e00a925276\" returns successfully" Sep 9 05:43:32.520279 kubelet[2449]: W0909 05:43:32.518935 2449 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.128.0.19:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 9 05:43:32.520444 kubelet[2449]: E0909 05:43:32.519041 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:32.520735 kubelet[2449]: E0909 05:43:32.520567 2449 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.128.0.19:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:43:32.522086 kubelet[2449]: W0909 05:43:32.521761 2449 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.128.0.19:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 9 05:43:32.522225 kubelet[2449]: E0909 05:43:32.522101 2449 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://10.128.0.19:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:43:32.529209 kubelet[2449]: E0909 05:43:32.529143 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:32.533099 containerd[1568]: time="2025-09-09T05:43:32.532858992Z" level=info msg="StartContainer for \"d1a6607a3c532271186634e5d6ad62dcfd2c2ad7ce7da161945bfd32819f3ef3\" returns successfully" Sep 9 05:43:33.057829 kubelet[2449]: I0909 05:43:33.057412 2449 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:33.536040 kubelet[2449]: E0909 05:43:33.535594 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:33.536040 kubelet[2449]: E0909 05:43:33.535792 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:33.537409 kubelet[2449]: E0909 05:43:33.537393 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:34.543518 kubelet[2449]: E0909 05:43:34.543439 2449 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.462039 kubelet[2449]: I0909 05:43:35.461985 2449 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.462039 kubelet[2449]: E0909 05:43:35.462040 2449 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\": node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found" Sep 9 05:43:35.540924 kubelet[2449]: I0909 05:43:35.540882 2449 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.547510 kubelet[2449]: E0909 05:43:35.547457 2449 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.549541 kubelet[2449]: I0909 05:43:35.549514 2449 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.552787 kubelet[2449]: E0909 05:43:35.552571 2449 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.552787 kubelet[2449]: I0909 05:43:35.552597 2449 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.554675 kubelet[2449]: E0909 
05:43:35.554603 2449 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.554675 kubelet[2449]: I0909 05:43:35.554632 2449 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:35.557324 kubelet[2449]: E0909 05:43:35.557271 2449 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:36.414691 kubelet[2449]: I0909 05:43:36.414581 2449 apiserver.go:52] "Watching apiserver" Sep 9 05:43:36.454878 kubelet[2449]: I0909 05:43:36.454823 2449 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 05:43:36.542538 kubelet[2449]: I0909 05:43:36.542496 2449 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:43:36.552819 kubelet[2449]: W0909 05:43:36.552732 2449 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters] Sep 9 05:43:37.764669 systemd[1]: Reload requested from client PID 2719 ('systemctl') (unit session-9.scope)... Sep 9 05:43:37.764690 systemd[1]: Reloading... Sep 9 05:43:37.926208 zram_generator::config[2763]: No configuration found. Sep 9 05:43:38.250131 systemd[1]: Reloading finished in 484 ms. 
Sep 9 05:43:38.288142 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:43:38.302975 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:43:38.303394 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:43:38.303511 systemd[1]: kubelet.service: Consumed 1.176s CPU time, 131.5M memory peak. Sep 9 05:43:38.306557 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:43:38.695045 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:43:38.707863 (kubelet)[2811]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:43:38.789209 kubelet[2811]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:43:38.789737 kubelet[2811]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 05:43:38.789737 kubelet[2811]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 05:43:38.789917 kubelet[2811]: I0909 05:43:38.789867 2811 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 05:43:38.802961 kubelet[2811]: I0909 05:43:38.802624 2811 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 05:43:38.802961 kubelet[2811]: I0909 05:43:38.802659 2811 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 05:43:38.803819 kubelet[2811]: I0909 05:43:38.803792 2811 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 05:43:38.808170 kubelet[2811]: I0909 05:43:38.808135 2811 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Sep 9 05:43:38.815685 kubelet[2811]: I0909 05:43:38.815648 2811 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 05:43:38.823013 kubelet[2811]: I0909 05:43:38.822980 2811 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 05:43:38.827861 kubelet[2811]: I0909 05:43:38.827384 2811 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 05:43:38.827861 kubelet[2811]: I0909 05:43:38.827699 2811 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 05:43:38.828102 kubelet[2811]: I0909 05:43:38.827734 2811 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 05:43:38.828288 kubelet[2811]: I0909 05:43:38.828117 2811 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 05:43:38.828288 kubelet[2811]: I0909 05:43:38.828138 2811 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 05:43:38.828288 kubelet[2811]: I0909 05:43:38.828235 2811 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 05:43:38.830164 kubelet[2811]: I0909 05:43:38.828468 2811 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 05:43:38.830164 kubelet[2811]: I0909 05:43:38.829165 2811 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 05:43:38.830164 kubelet[2811]: I0909 05:43:38.829240 2811 kubelet.go:352] "Adding apiserver pod source"
Sep 9 05:43:38.830164 kubelet[2811]: I0909 05:43:38.829257 2811 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 05:43:38.836204 kubelet[2811]: I0909 05:43:38.835572 2811 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 05:43:38.836308 kubelet[2811]: I0909 05:43:38.836271 2811 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 05:43:38.836953 kubelet[2811]: I0909 05:43:38.836906 2811 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Sep 9 05:43:38.836953 kubelet[2811]: I0909 05:43:38.836953 2811 server.go:1287] "Started kubelet"
Sep 9 05:43:38.840977 kubelet[2811]: I0909 05:43:38.840376 2811 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 9 05:43:38.855959 kubelet[2811]: I0909 05:43:38.855922 2811 server.go:169] "Starting to listen" address="0.0.0.0" port=10250
Sep 9 05:43:38.857735 kubelet[2811]: I0909 05:43:38.857711 2811 server.go:479] "Adding debug handlers to kubelet server"
Sep 9 05:43:38.866053 kubelet[2811]: I0909 05:43:38.865978 2811 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 9 05:43:38.866504 kubelet[2811]: I0909 05:43:38.866461 2811 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 9 05:43:38.885441 kubelet[2811]: I0909 05:43:38.885404 2811 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 9 05:43:38.891054 kubelet[2811]: I0909 05:43:38.889553 2811 volume_manager.go:297] "Starting Kubelet Volume Manager"
Sep 9 05:43:38.891054 kubelet[2811]: E0909 05:43:38.889851 2811 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" not found"
Sep 9 05:43:38.891493 kubelet[2811]: I0909 05:43:38.891458 2811 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Sep 9 05:43:38.892354 kubelet[2811]: I0909 05:43:38.891657 2811 reconciler.go:26] "Reconciler: start to sync state"
Sep 9 05:43:38.905123 kubelet[2811]: I0909 05:43:38.903089 2811 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 9 05:43:38.906735 kubelet[2811]: I0909 05:43:38.905807 2811 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 9 05:43:38.906735 kubelet[2811]: I0909 05:43:38.905849 2811 status_manager.go:227] "Starting to sync pod status with apiserver"
Sep 9 05:43:38.906735 kubelet[2811]: I0909 05:43:38.905885 2811 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Sep 9 05:43:38.906735 kubelet[2811]: I0909 05:43:38.905896 2811 kubelet.go:2382] "Starting kubelet main sync loop"
Sep 9 05:43:38.906735 kubelet[2811]: E0909 05:43:38.905961 2811 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 9 05:43:38.916668 kubelet[2811]: I0909 05:43:38.915777 2811 factory.go:221] Registration of the systemd container factory successfully
Sep 9 05:43:38.916668 kubelet[2811]: I0909 05:43:38.915910 2811 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 9 05:43:38.931281 kubelet[2811]: E0909 05:43:38.931245 2811 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 9 05:43:38.931802 kubelet[2811]: I0909 05:43:38.931751 2811 factory.go:221] Registration of the containerd container factory successfully
Sep 9 05:43:39.006247 kubelet[2811]: I0909 05:43:39.006085 2811 cpu_manager.go:221] "Starting CPU manager" policy="none"
Sep 9 05:43:39.006247 kubelet[2811]: I0909 05:43:39.006114 2811 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Sep 9 05:43:39.006247 kubelet[2811]: I0909 05:43:39.006158 2811 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 05:43:39.007958 kubelet[2811]: I0909 05:43:39.006401 2811 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 9 05:43:39.007958 kubelet[2811]: I0909 05:43:39.006418 2811 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 9 05:43:39.007958 kubelet[2811]: I0909 05:43:39.006444 2811 policy_none.go:49] "None policy: Start"
Sep 9 05:43:39.007958 kubelet[2811]: I0909 05:43:39.006460 2811 memory_manager.go:186] "Starting memorymanager" policy="None"
Sep 9 05:43:39.007958 kubelet[2811]: I0909 05:43:39.006476 2811 state_mem.go:35] "Initializing new in-memory state store"
Sep 9 05:43:39.007958 kubelet[2811]: I0909 05:43:39.006637 2811 state_mem.go:75] "Updated machine memory state"
Sep 9 05:43:39.008286 kubelet[2811]: E0909 05:43:39.008258 2811 kubelet.go:2406] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Sep 9 05:43:39.018207 kubelet[2811]: I0909 05:43:39.017983 2811 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 9 05:43:39.019104 kubelet[2811]: I0909 05:43:39.019080 2811 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 9 05:43:39.019492 kubelet[2811]: I0909 05:43:39.019423 2811 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 9 05:43:39.023198 kubelet[2811]: I0909 05:43:39.022454 2811 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 9 05:43:39.028721 kubelet[2811]: E0909 05:43:39.028691 2811 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Sep 9 05:43:39.142483 kubelet[2811]: I0909 05:43:39.142444 2811 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.154687 kubelet[2811]: I0909 05:43:39.154654 2811 kubelet_node_status.go:124] "Node was previously registered" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.155060 kubelet[2811]: I0909 05:43:39.154998 2811 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.209647 kubelet[2811]: I0909 05:43:39.209600 2811 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.210001 kubelet[2811]: I0909 05:43:39.209974 2811 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.212203 kubelet[2811]: I0909 05:43:39.210959 2811 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.220269 kubelet[2811]: W0909 05:43:39.220010 2811 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]
Sep 9 05:43:39.223066 kubelet[2811]: W0909 05:43:39.222260 2811 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]
Sep 9 05:43:39.224160 kubelet[2811]: W0909 05:43:39.224133 2811 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]
Sep 9 05:43:39.224382 kubelet[2811]: E0909 05:43:39.224279 2811 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" already exists" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.292997 kubelet[2811]: I0909 05:43:39.292639 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/453e81ce9df947d678a3c1d726018cdb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"453e81ce9df947d678a3c1d726018cdb\") " pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.293669 kubelet[2811]: I0909 05:43:39.293344 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-ca-certs\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.293669 kubelet[2811]: I0909 05:43:39.293603 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-flexvolume-dir\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.294261 kubelet[2811]: I0909 05:43:39.294102 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e851d5bdc6cd025dcacd571b717747d1-kubeconfig\") pod \"kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"e851d5bdc6cd025dcacd571b717747d1\") " pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.294589 kubelet[2811]: I0909 05:43:39.294478 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.294856 kubelet[2811]: I0909 05:43:39.294777 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/453e81ce9df947d678a3c1d726018cdb-ca-certs\") pod \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"453e81ce9df947d678a3c1d726018cdb\") " pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.295192 kubelet[2811]: I0909 05:43:39.295109 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/453e81ce9df947d678a3c1d726018cdb-k8s-certs\") pod \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"453e81ce9df947d678a3c1d726018cdb\") " pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.295575 kubelet[2811]: I0909 05:43:39.295505 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-k8s-certs\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.295935 kubelet[2811]: I0909 05:43:39.295864 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3a0566bf84ec05729636b44ed0c7671d-kubeconfig\") pod \"kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" (UID: \"3a0566bf84ec05729636b44ed0c7671d\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.833490 kubelet[2811]: I0909 05:43:39.833154 2811 apiserver.go:52] "Watching apiserver"
Sep 9 05:43:39.892400 kubelet[2811]: I0909 05:43:39.892344 2811 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Sep 9 05:43:39.970297 kubelet[2811]: I0909 05:43:39.967848 2811 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.970705 kubelet[2811]: I0909 05:43:39.969659 2811 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.984385 kubelet[2811]: W0909 05:43:39.980462 2811 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]
Sep 9 05:43:39.984385 kubelet[2811]: E0909 05:43:39.981523 2811 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" already exists" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:39.984385 kubelet[2811]: W0909 05:43:39.984344 2811 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]
Sep 9 05:43:39.984656 kubelet[2811]: E0909 05:43:39.984408 2811 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" already exists" pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023"
Sep 9 05:43:40.037789 kubelet[2811]: I0909 05:43:40.037527 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" podStartSLOduration=1.037463553 podStartE2EDuration="1.037463553s" podCreationTimestamp="2025-09-09 05:43:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:43:40.018772097 +0000 UTC m=+1.303320122" watchObservedRunningTime="2025-09-09 05:43:40.037463553 +0000 UTC m=+1.322011582"
Sep 9 05:43:40.038493 kubelet[2811]: I0909 05:43:40.038409 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" podStartSLOduration=1.038092215 podStartE2EDuration="1.038092215s" podCreationTimestamp="2025-09-09 05:43:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:43:40.036272944 +0000 UTC m=+1.320820972" watchObservedRunningTime="2025-09-09 05:43:40.038092215 +0000 UTC m=+1.322640241"
Sep 9 05:43:41.131513 update_engine[1557]: I20250909 05:43:41.131387 1557 update_attempter.cc:509] Updating boot flags...
Sep 9 05:43:43.496361 kubelet[2811]: I0909 05:43:43.496283 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" podStartSLOduration=7.496231964 podStartE2EDuration="7.496231964s" podCreationTimestamp="2025-09-09 05:43:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:43:40.053925081 +0000 UTC m=+1.338473108" watchObservedRunningTime="2025-09-09 05:43:43.496231964 +0000 UTC m=+4.780779988"
Sep 9 05:43:43.921581 kubelet[2811]: I0909 05:43:43.921509 2811 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 9 05:43:43.923464 containerd[1568]: time="2025-09-09T05:43:43.923401670Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 9 05:43:43.925012 kubelet[2811]: I0909 05:43:43.924614 2811 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 9 05:43:44.812872 systemd[1]: Created slice kubepods-besteffort-pod0dcdb062_c2e5_4716_b853_7c8074b1191e.slice - libcontainer container kubepods-besteffort-pod0dcdb062_c2e5_4716_b853_7c8074b1191e.slice.
Sep 9 05:43:44.840440 kubelet[2811]: I0909 05:43:44.840392 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0dcdb062-c2e5-4716-b853-7c8074b1191e-lib-modules\") pod \"kube-proxy-5446p\" (UID: \"0dcdb062-c2e5-4716-b853-7c8074b1191e\") " pod="kube-system/kube-proxy-5446p"
Sep 9 05:43:44.840970 kubelet[2811]: I0909 05:43:44.840454 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0dcdb062-c2e5-4716-b853-7c8074b1191e-xtables-lock\") pod \"kube-proxy-5446p\" (UID: \"0dcdb062-c2e5-4716-b853-7c8074b1191e\") " pod="kube-system/kube-proxy-5446p"
Sep 9 05:43:44.840970 kubelet[2811]: I0909 05:43:44.840495 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0dcdb062-c2e5-4716-b853-7c8074b1191e-kube-proxy\") pod \"kube-proxy-5446p\" (UID: \"0dcdb062-c2e5-4716-b853-7c8074b1191e\") " pod="kube-system/kube-proxy-5446p"
Sep 9 05:43:44.840970 kubelet[2811]: I0909 05:43:44.840521 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwbbb\" (UniqueName: \"kubernetes.io/projected/0dcdb062-c2e5-4716-b853-7c8074b1191e-kube-api-access-nwbbb\") pod \"kube-proxy-5446p\" (UID: \"0dcdb062-c2e5-4716-b853-7c8074b1191e\") " pod="kube-system/kube-proxy-5446p"
Sep 9 05:43:45.124738 containerd[1568]: time="2025-09-09T05:43:45.124629459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5446p,Uid:0dcdb062-c2e5-4716-b853-7c8074b1191e,Namespace:kube-system,Attempt:0,}"
Sep 9 05:43:45.179274 containerd[1568]: time="2025-09-09T05:43:45.177237187Z" level=info msg="connecting to shim 37e81933be1f9515766c5c1ae1d77d1bccff03f062a423837372cb6ce3e43112" address="unix:///run/containerd/s/8029382f4922b37dcc5e2d1b940ed53f6d81dfb6d66779092ae510e70d3e9d8e" namespace=k8s.io protocol=ttrpc version=3
Sep 9 05:43:45.188061 systemd[1]: Created slice kubepods-besteffort-pod6a154c72_8ed6_4767_94ec_e59515be31ac.slice - libcontainer container kubepods-besteffort-pod6a154c72_8ed6_4767_94ec_e59515be31ac.slice.
Sep 9 05:43:45.232443 systemd[1]: Started cri-containerd-37e81933be1f9515766c5c1ae1d77d1bccff03f062a423837372cb6ce3e43112.scope - libcontainer container 37e81933be1f9515766c5c1ae1d77d1bccff03f062a423837372cb6ce3e43112.
Sep 9 05:43:45.268997 containerd[1568]: time="2025-09-09T05:43:45.268943764Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5446p,Uid:0dcdb062-c2e5-4716-b853-7c8074b1191e,Namespace:kube-system,Attempt:0,} returns sandbox id \"37e81933be1f9515766c5c1ae1d77d1bccff03f062a423837372cb6ce3e43112\""
Sep 9 05:43:45.274459 containerd[1568]: time="2025-09-09T05:43:45.274414084Z" level=info msg="CreateContainer within sandbox \"37e81933be1f9515766c5c1ae1d77d1bccff03f062a423837372cb6ce3e43112\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 9 05:43:45.290213 containerd[1568]: time="2025-09-09T05:43:45.287696091Z" level=info msg="Container 727bfbc3f6512b7d16724a9919cdae5cc11fa2035e5aa31d609b745fee9b2a50: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:43:45.300044 containerd[1568]: time="2025-09-09T05:43:45.299974855Z" level=info msg="CreateContainer within sandbox \"37e81933be1f9515766c5c1ae1d77d1bccff03f062a423837372cb6ce3e43112\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"727bfbc3f6512b7d16724a9919cdae5cc11fa2035e5aa31d609b745fee9b2a50\""
Sep 9 05:43:45.301153 containerd[1568]: time="2025-09-09T05:43:45.301114352Z" level=info msg="StartContainer for \"727bfbc3f6512b7d16724a9919cdae5cc11fa2035e5aa31d609b745fee9b2a50\""
Sep 9 05:43:45.305139 containerd[1568]: time="2025-09-09T05:43:45.305042120Z" level=info msg="connecting to shim 727bfbc3f6512b7d16724a9919cdae5cc11fa2035e5aa31d609b745fee9b2a50" address="unix:///run/containerd/s/8029382f4922b37dcc5e2d1b940ed53f6d81dfb6d66779092ae510e70d3e9d8e" protocol=ttrpc version=3
Sep 9 05:43:45.337433 systemd[1]: Started cri-containerd-727bfbc3f6512b7d16724a9919cdae5cc11fa2035e5aa31d609b745fee9b2a50.scope - libcontainer container 727bfbc3f6512b7d16724a9919cdae5cc11fa2035e5aa31d609b745fee9b2a50.
Sep 9 05:43:45.347263 kubelet[2811]: I0909 05:43:45.347212 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6a154c72-8ed6-4767-94ec-e59515be31ac-var-lib-calico\") pod \"tigera-operator-755d956888-qr6hl\" (UID: \"6a154c72-8ed6-4767-94ec-e59515be31ac\") " pod="tigera-operator/tigera-operator-755d956888-qr6hl"
Sep 9 05:43:45.347263 kubelet[2811]: I0909 05:43:45.347274 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm8rv\" (UniqueName: \"kubernetes.io/projected/6a154c72-8ed6-4767-94ec-e59515be31ac-kube-api-access-jm8rv\") pod \"tigera-operator-755d956888-qr6hl\" (UID: \"6a154c72-8ed6-4767-94ec-e59515be31ac\") " pod="tigera-operator/tigera-operator-755d956888-qr6hl"
Sep 9 05:43:45.403791 containerd[1568]: time="2025-09-09T05:43:45.402884088Z" level=info msg="StartContainer for \"727bfbc3f6512b7d16724a9919cdae5cc11fa2035e5aa31d609b745fee9b2a50\" returns successfully"
Sep 9 05:43:45.495064 containerd[1568]: time="2025-09-09T05:43:45.495010428Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qr6hl,Uid:6a154c72-8ed6-4767-94ec-e59515be31ac,Namespace:tigera-operator,Attempt:0,}"
Sep 9 05:43:45.526400 containerd[1568]: time="2025-09-09T05:43:45.526318148Z" level=info msg="connecting to shim b45993b74282181ea6e4d0d5971725766b137633f5a662957341b50a5671de27" address="unix:///run/containerd/s/e1981fec0552fa9aa3657df2616907432f09d23c61217b0fe3ebd876f5911538" namespace=k8s.io protocol=ttrpc version=3
Sep 9 05:43:45.576842 systemd[1]: Started cri-containerd-b45993b74282181ea6e4d0d5971725766b137633f5a662957341b50a5671de27.scope - libcontainer container b45993b74282181ea6e4d0d5971725766b137633f5a662957341b50a5671de27.
Sep 9 05:43:45.675504 containerd[1568]: time="2025-09-09T05:43:45.674967560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-qr6hl,Uid:6a154c72-8ed6-4767-94ec-e59515be31ac,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"b45993b74282181ea6e4d0d5971725766b137633f5a662957341b50a5671de27\""
Sep 9 05:43:45.681603 containerd[1568]: time="2025-09-09T05:43:45.681568504Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 9 05:43:45.998993 kubelet[2811]: I0909 05:43:45.998663 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5446p" podStartSLOduration=1.998639121 podStartE2EDuration="1.998639121s" podCreationTimestamp="2025-09-09 05:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:43:45.998338674 +0000 UTC m=+7.282886700" watchObservedRunningTime="2025-09-09 05:43:45.998639121 +0000 UTC m=+7.283187149"
Sep 9 05:43:46.752470 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount64566558.mount: Deactivated successfully.
Sep 9 05:43:47.689678 containerd[1568]: time="2025-09-09T05:43:47.689599885Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:47.691236 containerd[1568]: time="2025-09-09T05:43:47.690995475Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 9 05:43:47.692484 containerd[1568]: time="2025-09-09T05:43:47.692418320Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:47.695340 containerd[1568]: time="2025-09-09T05:43:47.695228731Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:43:47.696754 containerd[1568]: time="2025-09-09T05:43:47.696131285Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.014519728s"
Sep 9 05:43:47.696754 containerd[1568]: time="2025-09-09T05:43:47.696191864Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 9 05:43:47.700055 containerd[1568]: time="2025-09-09T05:43:47.699975434Z" level=info msg="CreateContainer within sandbox \"b45993b74282181ea6e4d0d5971725766b137633f5a662957341b50a5671de27\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 9 05:43:47.712122 containerd[1568]: time="2025-09-09T05:43:47.711366793Z" level=info msg="Container 8f7b66180329a86baec998b7df715255070fd5164b48a9c378b0e4de64899ebf: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:43:47.723198 containerd[1568]: time="2025-09-09T05:43:47.723085476Z" level=info msg="CreateContainer within sandbox \"b45993b74282181ea6e4d0d5971725766b137633f5a662957341b50a5671de27\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8f7b66180329a86baec998b7df715255070fd5164b48a9c378b0e4de64899ebf\""
Sep 9 05:43:47.724144 containerd[1568]: time="2025-09-09T05:43:47.723859453Z" level=info msg="StartContainer for \"8f7b66180329a86baec998b7df715255070fd5164b48a9c378b0e4de64899ebf\""
Sep 9 05:43:47.725803 containerd[1568]: time="2025-09-09T05:43:47.725763263Z" level=info msg="connecting to shim 8f7b66180329a86baec998b7df715255070fd5164b48a9c378b0e4de64899ebf" address="unix:///run/containerd/s/e1981fec0552fa9aa3657df2616907432f09d23c61217b0fe3ebd876f5911538" protocol=ttrpc version=3
Sep 9 05:43:47.765462 systemd[1]: Started cri-containerd-8f7b66180329a86baec998b7df715255070fd5164b48a9c378b0e4de64899ebf.scope - libcontainer container 8f7b66180329a86baec998b7df715255070fd5164b48a9c378b0e4de64899ebf.
Sep 9 05:43:47.814559 containerd[1568]: time="2025-09-09T05:43:47.814428494Z" level=info msg="StartContainer for \"8f7b66180329a86baec998b7df715255070fd5164b48a9c378b0e4de64899ebf\" returns successfully"
Sep 9 05:43:48.005639 kubelet[2811]: I0909 05:43:48.005433 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-qr6hl" podStartSLOduration=0.988201526 podStartE2EDuration="3.005406396s" podCreationTimestamp="2025-09-09 05:43:45 +0000 UTC" firstStartedPulling="2025-09-09 05:43:45.679929079 +0000 UTC m=+6.964477094" lastFinishedPulling="2025-09-09 05:43:47.697133961 +0000 UTC m=+8.981681964" observedRunningTime="2025-09-09 05:43:48.005324984 +0000 UTC m=+9.289873009" watchObservedRunningTime="2025-09-09 05:43:48.005406396 +0000 UTC m=+9.289954421"
Sep 9 05:43:55.032157 sudo[1896]: pam_unix(sudo:session): session closed for user root
Sep 9 05:43:55.078206 sshd[1895]: Connection closed by 139.178.89.65 port 60888
Sep 9 05:43:55.079501 sshd-session[1892]: pam_unix(sshd:session): session closed for user core
Sep 9 05:43:55.094727 systemd[1]: sshd@8-10.128.0.19:22-139.178.89.65:60888.service: Deactivated successfully.
Sep 9 05:43:55.104522 systemd[1]: session-9.scope: Deactivated successfully.
Sep 9 05:43:55.107285 systemd[1]: session-9.scope: Consumed 6.420s CPU time, 231.4M memory peak.
Sep 9 05:43:55.113092 systemd-logind[1555]: Session 9 logged out. Waiting for processes to exit.
Sep 9 05:43:55.117703 systemd-logind[1555]: Removed session 9.
Sep 9 05:44:00.972141 systemd[1]: Created slice kubepods-besteffort-pod1f2ac9e8_c668_4c8d_8b3b_f1440ee5262a.slice - libcontainer container kubepods-besteffort-pod1f2ac9e8_c668_4c8d_8b3b_f1440ee5262a.slice.
Sep 9 05:44:01.052429 kubelet[2811]: I0909 05:44:01.052377 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a-typha-certs\") pod \"calico-typha-5d7bb64d4-bspvd\" (UID: \"1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a\") " pod="calico-system/calico-typha-5d7bb64d4-bspvd"
Sep 9 05:44:01.053353 kubelet[2811]: I0909 05:44:01.053217 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a-tigera-ca-bundle\") pod \"calico-typha-5d7bb64d4-bspvd\" (UID: \"1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a\") " pod="calico-system/calico-typha-5d7bb64d4-bspvd"
Sep 9 05:44:01.053353 kubelet[2811]: I0909 05:44:01.053279 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4d5\" (UniqueName: \"kubernetes.io/projected/1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a-kube-api-access-kt4d5\") pod \"calico-typha-5d7bb64d4-bspvd\" (UID: \"1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a\") " pod="calico-system/calico-typha-5d7bb64d4-bspvd"
Sep 9 05:44:01.285264 containerd[1568]: time="2025-09-09T05:44:01.283905357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d7bb64d4-bspvd,Uid:1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a,Namespace:calico-system,Attempt:0,}"
Sep 9 05:44:01.296987 systemd[1]: Created slice kubepods-besteffort-pod89c19d25_6c0f_4674_aea8_a9f020cb3c03.slice - libcontainer container kubepods-besteffort-pod89c19d25_6c0f_4674_aea8_a9f020cb3c03.slice.
Sep 9 05:44:01.345210 containerd[1568]: time="2025-09-09T05:44:01.345102717Z" level=info msg="connecting to shim ce24423ec5f9e45cdb894d70938d515732faf5da71c1f880046e3d8d99ebe812" address="unix:///run/containerd/s/e87d0e6c8edf663412661b0a6525e8076cef5c0663893a0ba22ffca8679f5fd8" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:01.355909 kubelet[2811]: I0909 05:44:01.355796 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-flexvol-driver-host\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.355909 kubelet[2811]: I0909 05:44:01.355857 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6zcl\" (UniqueName: \"kubernetes.io/projected/89c19d25-6c0f-4674-aea8-a9f020cb3c03-kube-api-access-b6zcl\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.355909 kubelet[2811]: I0909 05:44:01.355890 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-lib-modules\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.356669 kubelet[2811]: I0909 05:44:01.355922 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-xtables-lock\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.356669 kubelet[2811]: I0909 05:44:01.355950 2811 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-cni-log-dir\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.356669 kubelet[2811]: I0909 05:44:01.355992 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89c19d25-6c0f-4674-aea8-a9f020cb3c03-tigera-ca-bundle\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.356669 kubelet[2811]: I0909 05:44:01.356022 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-var-lib-calico\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.356669 kubelet[2811]: I0909 05:44:01.356052 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/89c19d25-6c0f-4674-aea8-a9f020cb3c03-node-certs\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.357266 kubelet[2811]: I0909 05:44:01.356099 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-cni-net-dir\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.357266 kubelet[2811]: I0909 05:44:01.356135 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" 
(UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-cni-bin-dir\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.357266 kubelet[2811]: I0909 05:44:01.356162 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-policysync\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.358031 kubelet[2811]: I0909 05:44:01.357743 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/89c19d25-6c0f-4674-aea8-a9f020cb3c03-var-run-calico\") pod \"calico-node-x7xf4\" (UID: \"89c19d25-6c0f-4674-aea8-a9f020cb3c03\") " pod="calico-system/calico-node-x7xf4" Sep 9 05:44:01.422792 systemd[1]: Started cri-containerd-ce24423ec5f9e45cdb894d70938d515732faf5da71c1f880046e3d8d99ebe812.scope - libcontainer container ce24423ec5f9e45cdb894d70938d515732faf5da71c1f880046e3d8d99ebe812. Sep 9 05:44:01.468779 kubelet[2811]: E0909 05:44:01.468548 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.468779 kubelet[2811]: W0909 05:44:01.468580 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.468779 kubelet[2811]: E0909 05:44:01.468623 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.469075 kubelet[2811]: E0909 05:44:01.468980 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.469075 kubelet[2811]: W0909 05:44:01.468995 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.469075 kubelet[2811]: E0909 05:44:01.469071 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.470286 kubelet[2811]: E0909 05:44:01.469707 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.470286 kubelet[2811]: W0909 05:44:01.469727 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.470286 kubelet[2811]: E0909 05:44:01.469746 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.471215 kubelet[2811]: E0909 05:44:01.470688 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.471215 kubelet[2811]: W0909 05:44:01.470708 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.471215 kubelet[2811]: E0909 05:44:01.470736 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.472885 kubelet[2811]: E0909 05:44:01.471310 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.472885 kubelet[2811]: W0909 05:44:01.471325 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.472885 kubelet[2811]: E0909 05:44:01.471414 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.472885 kubelet[2811]: E0909 05:44:01.471818 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.472885 kubelet[2811]: W0909 05:44:01.471832 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.472885 kubelet[2811]: E0909 05:44:01.471888 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.472885 kubelet[2811]: E0909 05:44:01.472376 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.472885 kubelet[2811]: W0909 05:44:01.472392 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.472885 kubelet[2811]: E0909 05:44:01.472410 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.473802 kubelet[2811]: E0909 05:44:01.473258 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.473802 kubelet[2811]: W0909 05:44:01.473280 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.473802 kubelet[2811]: E0909 05:44:01.473308 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.473802 kubelet[2811]: E0909 05:44:01.473634 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.473802 kubelet[2811]: W0909 05:44:01.473649 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.473802 kubelet[2811]: E0909 05:44:01.473666 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.474138 kubelet[2811]: E0909 05:44:01.473979 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.474138 kubelet[2811]: W0909 05:44:01.474000 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.474138 kubelet[2811]: E0909 05:44:01.474016 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.474373 kubelet[2811]: E0909 05:44:01.474351 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.474373 kubelet[2811]: W0909 05:44:01.474365 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.474473 kubelet[2811]: E0909 05:44:01.474383 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.476460 kubelet[2811]: E0909 05:44:01.476434 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.476460 kubelet[2811]: W0909 05:44:01.476459 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.476606 kubelet[2811]: E0909 05:44:01.476478 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.477061 kubelet[2811]: E0909 05:44:01.477032 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.477148 kubelet[2811]: W0909 05:44:01.477064 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.477148 kubelet[2811]: E0909 05:44:01.477082 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.478431 kubelet[2811]: E0909 05:44:01.478405 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.478522 kubelet[2811]: W0909 05:44:01.478437 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.478522 kubelet[2811]: E0909 05:44:01.478456 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.479302 kubelet[2811]: E0909 05:44:01.479277 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.479302 kubelet[2811]: W0909 05:44:01.479301 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.479438 kubelet[2811]: E0909 05:44:01.479319 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.485215 kubelet[2811]: E0909 05:44:01.483806 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.485215 kubelet[2811]: W0909 05:44:01.483829 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.485215 kubelet[2811]: E0909 05:44:01.483859 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.485562 kubelet[2811]: E0909 05:44:01.485535 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.485562 kubelet[2811]: W0909 05:44:01.485552 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.485668 kubelet[2811]: E0909 05:44:01.485580 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.485884 kubelet[2811]: E0909 05:44:01.485861 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.485978 kubelet[2811]: W0909 05:44:01.485884 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.485978 kubelet[2811]: E0909 05:44:01.485913 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.487390 kubelet[2811]: E0909 05:44:01.487335 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.487390 kubelet[2811]: W0909 05:44:01.487355 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.487390 kubelet[2811]: E0909 05:44:01.487374 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.491008 kubelet[2811]: E0909 05:44:01.490973 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.491154 kubelet[2811]: W0909 05:44:01.491133 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.491337 kubelet[2811]: E0909 05:44:01.491216 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.492353 kubelet[2811]: E0909 05:44:01.492304 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.492629 kubelet[2811]: W0909 05:44:01.492416 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.492629 kubelet[2811]: E0909 05:44:01.492436 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.495543 kubelet[2811]: E0909 05:44:01.495483 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.495543 kubelet[2811]: W0909 05:44:01.495508 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.502975 kubelet[2811]: E0909 05:44:01.502929 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.502975 kubelet[2811]: W0909 05:44:01.502960 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.503392 kubelet[2811]: E0909 05:44:01.503364 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.503392 kubelet[2811]: W0909 05:44:01.503390 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.504358 kubelet[2811]: E0909 05:44:01.503893 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.504358 kubelet[2811]: E0909 05:44:01.503927 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.504358 kubelet[2811]: E0909 05:44:01.503954 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.504680 kubelet[2811]: E0909 05:44:01.504663 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.505123 kubelet[2811]: W0909 05:44:01.504776 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.506630 kubelet[2811]: E0909 05:44:01.506141 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.507731 kubelet[2811]: E0909 05:44:01.507706 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.507731 kubelet[2811]: W0909 05:44:01.507732 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.509511 kubelet[2811]: E0909 05:44:01.509478 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.509965 kubelet[2811]: E0909 05:44:01.509672 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.509965 kubelet[2811]: W0909 05:44:01.509689 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.510638 kubelet[2811]: E0909 05:44:01.510351 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.511353 kubelet[2811]: E0909 05:44:01.511332 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.511715 kubelet[2811]: W0909 05:44:01.511474 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.511715 kubelet[2811]: E0909 05:44:01.511525 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.512790 kubelet[2811]: E0909 05:44:01.512551 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.512790 kubelet[2811]: W0909 05:44:01.512574 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.512790 kubelet[2811]: E0909 05:44:01.512618 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.514645 kubelet[2811]: E0909 05:44:01.514625 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.514865 kubelet[2811]: W0909 05:44:01.514765 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.514991 kubelet[2811]: E0909 05:44:01.514971 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.515734 kubelet[2811]: E0909 05:44:01.515408 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.515920 kubelet[2811]: W0909 05:44:01.515834 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.515920 kubelet[2811]: E0909 05:44:01.515878 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.517483 kubelet[2811]: E0909 05:44:01.517388 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.517483 kubelet[2811]: W0909 05:44:01.517408 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.517483 kubelet[2811]: E0909 05:44:01.517443 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.518007 kubelet[2811]: E0909 05:44:01.517914 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.518007 kubelet[2811]: W0909 05:44:01.517930 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.518155 kubelet[2811]: E0909 05:44:01.518091 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.518716 kubelet[2811]: E0909 05:44:01.518697 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.519014 kubelet[2811]: W0909 05:44:01.518988 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.521280 kubelet[2811]: E0909 05:44:01.521248 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.521856 kubelet[2811]: E0909 05:44:01.521838 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.522051 kubelet[2811]: W0909 05:44:01.521969 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.522170 kubelet[2811]: E0909 05:44:01.522150 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.524338 kubelet[2811]: E0909 05:44:01.524240 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.524338 kubelet[2811]: W0909 05:44:01.524265 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.524578 kubelet[2811]: E0909 05:44:01.524513 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 9 05:44:01.524997 kubelet[2811]: E0909 05:44:01.524945 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 9 05:44:01.524997 kubelet[2811]: W0909 05:44:01.524974 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 9 05:44:01.525358 kubelet[2811]: E0909 05:44:01.525317 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[identical kubelet FlexVolume nodeagent~uds init error sequence repeated through Sep 9 05:44:01.558]
Sep 9 05:44:01.607781 containerd[1568]: time="2025-09-09T05:44:01.607721623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x7xf4,Uid:89c19d25-6c0f-4674-aea8-a9f020cb3c03,Namespace:calico-system,Attempt:0,}"
Sep 9 05:44:01.630162 kubelet[2811]: E0909 05:44:01.629998 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nt4z8" podUID="fae9fae7-86cf-409f-b61d-e750df25a87f"
Sep 9 05:44:01.645028 containerd[1568]: time="2025-09-09T05:44:01.644960490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d7bb64d4-bspvd,Uid:1f2ac9e8-c668-4c8d-8b3b-f1440ee5262a,Namespace:calico-system,Attempt:0,} returns sandbox id \"ce24423ec5f9e45cdb894d70938d515732faf5da71c1f880046e3d8d99ebe812\""
Sep 9 05:44:01.650786 containerd[1568]: time="2025-09-09T05:44:01.650742129Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
[kubelet FlexVolume nodeagent~uds init error sequence repeated through Sep 9 05:44:01.684]
Sep 9 05:44:01.684865 containerd[1568]: time="2025-09-09T05:44:01.684692647Z" level=info msg="connecting to shim c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667" address="unix:///run/containerd/s/f7449390268968ee76d81c8390bf18aab0213f1f55ff2e8b02aa06baf6f01d12" namespace=k8s.io protocol=ttrpc version=3
[kubelet FlexVolume nodeagent~uds init error sequence repeated through Sep 9 05:44:01.690]
Sep 9 05:44:01.690996 kubelet[2811]: I0909 05:44:01.690950 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktfn2\" (UniqueName: \"kubernetes.io/projected/fae9fae7-86cf-409f-b61d-e750df25a87f-kube-api-access-ktfn2\") pod \"csi-node-driver-nt4z8\" (UID: \"fae9fae7-86cf-409f-b61d-e750df25a87f\") " pod="calico-system/csi-node-driver-nt4z8"
Sep 9 05:44:01.692269 kubelet[2811]: I0909 05:44:01.692101 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fae9fae7-86cf-409f-b61d-e750df25a87f-socket-dir\") pod \"csi-node-driver-nt4z8\" (UID: \"fae9fae7-86cf-409f-b61d-e750df25a87f\") " pod="calico-system/csi-node-driver-nt4z8"
Sep 9 05:44:01.696065 kubelet[2811]: I0909 05:44:01.695864 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/fae9fae7-86cf-409f-b61d-e750df25a87f-varrun\") pod \"csi-node-driver-nt4z8\" (UID: \"fae9fae7-86cf-409f-b61d-e750df25a87f\") " pod="calico-system/csi-node-driver-nt4z8"
Sep 9 05:44:01.705838 kubelet[2811]: I0909 05:44:01.705712 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fae9fae7-86cf-409f-b61d-e750df25a87f-registration-dir\") pod \"csi-node-driver-nt4z8\" (UID: \"fae9fae7-86cf-409f-b61d-e750df25a87f\") " pod="calico-system/csi-node-driver-nt4z8"
Sep 9 05:44:01.706823 kubelet[2811]: I0909 05:44:01.706702 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fae9fae7-86cf-409f-b61d-e750df25a87f-kubelet-dir\") pod \"csi-node-driver-nt4z8\" (UID: \"fae9fae7-86cf-409f-b61d-e750df25a87f\") " pod="calico-system/csi-node-driver-nt4z8"
[interleaved kubelet FlexVolume nodeagent~uds init error sequence, Sep 9 05:44:01.691 through 05:44:01.713, omitted]
Sep 9 05:44:01.760779 systemd[1]: Started cri-containerd-c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667.scope - libcontainer container c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667.
[kubelet FlexVolume nodeagent~uds init error sequence repeated through Sep 9 05:44:01.826]
Error: unexpected end of JSON input" Sep 9 05:44:01.827290 kubelet[2811]: E0909 05:44:01.827258 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.827409 kubelet[2811]: W0909 05:44:01.827390 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.827548 kubelet[2811]: E0909 05:44:01.827524 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.828035 kubelet[2811]: E0909 05:44:01.827991 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.828035 kubelet[2811]: W0909 05:44:01.828011 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.828484 kubelet[2811]: E0909 05:44:01.828424 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.828648 kubelet[2811]: E0909 05:44:01.828635 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.828788 kubelet[2811]: W0909 05:44:01.828719 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.829168 kubelet[2811]: E0909 05:44:01.829058 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.829168 kubelet[2811]: W0909 05:44:01.829075 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.829565 kubelet[2811]: E0909 05:44:01.829529 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.829815 kubelet[2811]: W0909 05:44:01.829669 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.830084 kubelet[2811]: E0909 05:44:01.830066 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.830261 kubelet[2811]: W0909 05:44:01.830168 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.830527 kubelet[2811]: E0909 05:44:01.830360 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.830596 kubelet[2811]: E0909 05:44:01.830566 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.830596 kubelet[2811]: E0909 05:44:01.830593 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.830726 kubelet[2811]: E0909 05:44:01.830674 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.831086 kubelet[2811]: E0909 05:44:01.831014 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.831086 kubelet[2811]: W0909 05:44:01.831043 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.831347 kubelet[2811]: E0909 05:44:01.831260 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.832002 kubelet[2811]: E0909 05:44:01.831964 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.832324 kubelet[2811]: W0909 05:44:01.832135 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.832324 kubelet[2811]: E0909 05:44:01.832168 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.832769 kubelet[2811]: E0909 05:44:01.832728 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.832769 kubelet[2811]: W0909 05:44:01.832748 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.833082 kubelet[2811]: E0909 05:44:01.833041 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.833536 kubelet[2811]: E0909 05:44:01.833494 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.833536 kubelet[2811]: W0909 05:44:01.833513 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.833874 kubelet[2811]: E0909 05:44:01.833794 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.834336 kubelet[2811]: E0909 05:44:01.834295 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.834336 kubelet[2811]: W0909 05:44:01.834313 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.834626 kubelet[2811]: E0909 05:44:01.834578 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.835062 kubelet[2811]: E0909 05:44:01.835022 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.835062 kubelet[2811]: W0909 05:44:01.835040 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.835361 kubelet[2811]: E0909 05:44:01.835293 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.835829 kubelet[2811]: E0909 05:44:01.835734 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.835829 kubelet[2811]: W0909 05:44:01.835752 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.835829 kubelet[2811]: E0909 05:44:01.835768 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.836542 kubelet[2811]: E0909 05:44:01.836376 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.836542 kubelet[2811]: W0909 05:44:01.836394 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.836542 kubelet[2811]: E0909 05:44:01.836411 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.837074 kubelet[2811]: E0909 05:44:01.837035 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.837295 kubelet[2811]: W0909 05:44:01.837206 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.837295 kubelet[2811]: E0909 05:44:01.837255 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:01.864622 kubelet[2811]: E0909 05:44:01.864505 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:01.864622 kubelet[2811]: W0909 05:44:01.864534 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:01.864622 kubelet[2811]: E0909 05:44:01.864564 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:01.883794 containerd[1568]: time="2025-09-09T05:44:01.883732593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-x7xf4,Uid:89c19d25-6c0f-4674-aea8-a9f020cb3c03,Namespace:calico-system,Attempt:0,} returns sandbox id \"c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667\"" Sep 9 05:44:02.738763 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2595202205.mount: Deactivated successfully. 
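The FlexVolume failures above come from kubelet probing the plugin directory /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/ and invoking the driver binary `uds` with the argument `init`; the binary is absent, so the empty stdout fails JSON unmarshaling in driver-call.go. A minimal sketch of what kubelet expects from a FlexVolume driver's `init` call (an illustrative stub, not the actual node-agent driver):

```shell
#!/bin/sh
# Illustrative FlexVolume driver stub. kubelet executes the driver binary
# (here it looked for .../volume/exec/nodeagent~uds/uds) with a subcommand
# and expects a JSON status object on stdout.
flexvolume_driver() {
  case "$1" in
    init)
      # "attach": false tells kubelet to skip attach/detach calls.
      echo '{"status":"Success","capabilities":{"attach":false}}'
      ;;
    *)
      # Unimplemented calls must still emit valid JSON.
      echo '{"status":"Not supported"}'
      ;;
  esac
}

# kubelet's probe is equivalent to:
flexvolume_driver init
```

An empty stdout, as in the log above, is exactly what produces kubelet's "unexpected end of JSON input" error.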
Sep 9 05:44:03.906962 kubelet[2811]: E0909 05:44:03.906890 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nt4z8" podUID="fae9fae7-86cf-409f-b61d-e750df25a87f"
Sep 9 05:44:04.224881 containerd[1568]: time="2025-09-09T05:44:04.224323507Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:44:04.226082 containerd[1568]: time="2025-09-09T05:44:04.225897445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 9 05:44:04.227215 containerd[1568]: time="2025-09-09T05:44:04.227138483Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:44:04.230869 containerd[1568]: time="2025-09-09T05:44:04.230812095Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 05:44:04.232278 containerd[1568]: time="2025-09-09T05:44:04.231811745Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.581019754s"
Sep 9 05:44:04.232278 containerd[1568]: time="2025-09-09T05:44:04.231856241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 9 05:44:04.234873 containerd[1568]: time="2025-09-09T05:44:04.234602046Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 9 05:44:04.257572 containerd[1568]: time="2025-09-09T05:44:04.257505368Z" level=info msg="CreateContainer within sandbox \"ce24423ec5f9e45cdb894d70938d515732faf5da71c1f880046e3d8d99ebe812\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 9 05:44:04.270500 containerd[1568]: time="2025-09-09T05:44:04.270456422Z" level=info msg="Container 557e3fc33ef16b4faa04f9ea674166f0c36c4958a2e8fc8d7a16c2433a96a8ea: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:44:04.282833 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2998346481.mount: Deactivated successfully.
Sep 9 05:44:04.296904 containerd[1568]: time="2025-09-09T05:44:04.296816378Z" level=info msg="CreateContainer within sandbox \"ce24423ec5f9e45cdb894d70938d515732faf5da71c1f880046e3d8d99ebe812\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"557e3fc33ef16b4faa04f9ea674166f0c36c4958a2e8fc8d7a16c2433a96a8ea\""
Sep 9 05:44:04.297840 containerd[1568]: time="2025-09-09T05:44:04.297801217Z" level=info msg="StartContainer for \"557e3fc33ef16b4faa04f9ea674166f0c36c4958a2e8fc8d7a16c2433a96a8ea\""
Sep 9 05:44:04.302069 containerd[1568]: time="2025-09-09T05:44:04.302005822Z" level=info msg="connecting to shim 557e3fc33ef16b4faa04f9ea674166f0c36c4958a2e8fc8d7a16c2433a96a8ea" address="unix:///run/containerd/s/e87d0e6c8edf663412661b0a6525e8076cef5c0663893a0ba22ffca8679f5fd8" protocol=ttrpc version=3
Sep 9 05:44:04.340752 systemd[1]: Started cri-containerd-557e3fc33ef16b4faa04f9ea674166f0c36c4958a2e8fc8d7a16c2433a96a8ea.scope - libcontainer container 557e3fc33ef16b4faa04f9ea674166f0c36c4958a2e8fc8d7a16c2433a96a8ea.
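kubelet's pod_startup_latency_tracker line for calico-typha-5d7bb64d4-bspvd reports podStartE2EDuration="5.084791105s". Assuming that E2E duration is watchObservedRunningTime minus podCreationTimestamp (both stamps fall on 2025-09-09, so seconds-of-day arithmetic suffices), the figure can be reproduced with plain awk:

```shell
# Recompute podStartE2EDuration = watchObservedRunningTime - podCreationTimestamp
# (assumption: that is how the tracker defines E2E duration).
to_secs() {  # HH:MM:SS[.fraction] -> seconds since midnight
  echo "$1" | awk -F: '{ printf "%.9f", $1*3600 + $2*60 + $3 }'
}
created=$(to_secs 05:44:00)             # podCreationTimestamp
observed=$(to_secs 05:44:05.084791105)  # watchObservedRunningTime
awk -v a="$observed" -v b="$created" 'BEGIN { printf "%.9fs\n", a - b }'
# prints 5.084791105s, matching the logged podStartE2EDuration
```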
Sep 9 05:44:04.419791 containerd[1568]: time="2025-09-09T05:44:04.419692297Z" level=info msg="StartContainer for \"557e3fc33ef16b4faa04f9ea674166f0c36c4958a2e8fc8d7a16c2433a96a8ea\" returns successfully" Sep 9 05:44:05.085802 kubelet[2811]: I0909 05:44:05.084846 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5d7bb64d4-bspvd" podStartSLOduration=2.501660624 podStartE2EDuration="5.084791105s" podCreationTimestamp="2025-09-09 05:44:00 +0000 UTC" firstStartedPulling="2025-09-09 05:44:01.650404747 +0000 UTC m=+22.934952764" lastFinishedPulling="2025-09-09 05:44:04.23353523 +0000 UTC m=+25.518083245" observedRunningTime="2025-09-09 05:44:05.080621379 +0000 UTC m=+26.365169405" watchObservedRunningTime="2025-09-09 05:44:05.084791105 +0000 UTC m=+26.369339129" Sep 9 05:44:05.116957 kubelet[2811]: E0909 05:44:05.116913 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.116957 kubelet[2811]: W0909 05:44:05.116950 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.116957 kubelet[2811]: E0909 05:44:05.116984 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.119049 kubelet[2811]: E0909 05:44:05.117854 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.119049 kubelet[2811]: W0909 05:44:05.117876 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.119049 kubelet[2811]: E0909 05:44:05.117902 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.119049 kubelet[2811]: E0909 05:44:05.118840 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.119049 kubelet[2811]: W0909 05:44:05.118858 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.119049 kubelet[2811]: E0909 05:44:05.118981 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.119533 kubelet[2811]: E0909 05:44:05.119511 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.119649 kubelet[2811]: W0909 05:44:05.119534 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.119880 kubelet[2811]: E0909 05:44:05.119643 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.120376 kubelet[2811]: E0909 05:44:05.120263 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.120376 kubelet[2811]: W0909 05:44:05.120328 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.120376 kubelet[2811]: E0909 05:44:05.120349 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.120905 kubelet[2811]: E0909 05:44:05.120883 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.120905 kubelet[2811]: W0909 05:44:05.120902 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.121062 kubelet[2811]: E0909 05:44:05.120921 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.121366 kubelet[2811]: E0909 05:44:05.121344 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.121447 kubelet[2811]: W0909 05:44:05.121376 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.121447 kubelet[2811]: E0909 05:44:05.121395 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.122319 kubelet[2811]: E0909 05:44:05.122298 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.122658 kubelet[2811]: W0909 05:44:05.122438 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.122658 kubelet[2811]: E0909 05:44:05.122465 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.123373 kubelet[2811]: E0909 05:44:05.123207 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.123373 kubelet[2811]: W0909 05:44:05.123227 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.123373 kubelet[2811]: E0909 05:44:05.123247 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.123796 kubelet[2811]: E0909 05:44:05.123542 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.123796 kubelet[2811]: W0909 05:44:05.123556 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.123796 kubelet[2811]: E0909 05:44:05.123618 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.124239 kubelet[2811]: E0909 05:44:05.124217 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.124239 kubelet[2811]: W0909 05:44:05.124241 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.124385 kubelet[2811]: E0909 05:44:05.124259 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.124871 kubelet[2811]: E0909 05:44:05.124766 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.124871 kubelet[2811]: W0909 05:44:05.124787 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.125034 kubelet[2811]: E0909 05:44:05.124955 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.125472 kubelet[2811]: E0909 05:44:05.125428 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.125472 kubelet[2811]: W0909 05:44:05.125450 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.125847 kubelet[2811]: E0909 05:44:05.125468 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.126254 kubelet[2811]: E0909 05:44:05.126232 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.126254 kubelet[2811]: W0909 05:44:05.126254 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.126696 kubelet[2811]: E0909 05:44:05.126272 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.126959 kubelet[2811]: E0909 05:44:05.126949 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.126959 kubelet[2811]: W0909 05:44:05.126964 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.127447 kubelet[2811]: E0909 05:44:05.126981 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.159709 kubelet[2811]: E0909 05:44:05.159585 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.159899 kubelet[2811]: W0909 05:44:05.159613 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.159899 kubelet[2811]: E0909 05:44:05.159756 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:44:05.161200 kubelet[2811]: E0909 05:44:05.160894 2811 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:44:05.161200 kubelet[2811]: W0909 05:44:05.160918 2811 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:44:05.161200 kubelet[2811]: E0909 05:44:05.160976 2811 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:44:05.238086 containerd[1568]: time="2025-09-09T05:44:05.238014699Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:05.239640 containerd[1568]: time="2025-09-09T05:44:05.239328043Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 05:44:05.240725 containerd[1568]: time="2025-09-09T05:44:05.240677953Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:05.244559 containerd[1568]: time="2025-09-09T05:44:05.243550045Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:05.244559 containerd[1568]: time="2025-09-09T05:44:05.244397043Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.009756202s" Sep 9 05:44:05.244559 containerd[1568]: time="2025-09-09T05:44:05.244440201Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 05:44:05.248473 containerd[1568]: time="2025-09-09T05:44:05.248432285Z" level=info msg="CreateContainer within sandbox \"c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 05:44:05.262207 containerd[1568]: time="2025-09-09T05:44:05.261297936Z" level=info msg="Container 57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:05.279311 containerd[1568]: time="2025-09-09T05:44:05.279169708Z" level=info msg="CreateContainer within sandbox \"c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32\"" Sep 9 05:44:05.280414 containerd[1568]: time="2025-09-09T05:44:05.280249006Z" level=info msg="StartContainer for \"57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32\"" Sep 9 05:44:05.285886 containerd[1568]: time="2025-09-09T05:44:05.285770577Z" level=info msg="connecting to shim 57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32" address="unix:///run/containerd/s/f7449390268968ee76d81c8390bf18aab0213f1f55ff2e8b02aa06baf6f01d12" protocol=ttrpc version=3 Sep 9 05:44:05.322417 systemd[1]: Started cri-containerd-57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32.scope - libcontainer container 57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32. Sep 9 05:44:05.406353 containerd[1568]: time="2025-09-09T05:44:05.406092151Z" level=info msg="StartContainer for \"57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32\" returns successfully" Sep 9 05:44:05.422571 systemd[1]: cri-containerd-57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32.scope: Deactivated successfully. 
Sep 9 05:44:05.431123 containerd[1568]: time="2025-09-09T05:44:05.431035010Z" level=info msg="received exit event container_id:\"57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32\" id:\"57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32\" pid:3544 exited_at:{seconds:1757396645 nanos:429932985}" Sep 9 05:44:05.431422 containerd[1568]: time="2025-09-09T05:44:05.431375760Z" level=info msg="TaskExit event in podsandbox handler container_id:\"57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32\" id:\"57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32\" pid:3544 exited_at:{seconds:1757396645 nanos:429932985}" Sep 9 05:44:05.471082 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-57455a0faf80cfa44e6bf62f56a89c1187e0317d937d1792586201a695419e32-rootfs.mount: Deactivated successfully. Sep 9 05:44:05.906475 kubelet[2811]: E0909 05:44:05.906395 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nt4z8" podUID="fae9fae7-86cf-409f-b61d-e750df25a87f" Sep 9 05:44:06.063760 kubelet[2811]: I0909 05:44:06.063318 2811 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:44:07.072585 containerd[1568]: time="2025-09-09T05:44:07.072533845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 05:44:07.906985 kubelet[2811]: E0909 05:44:07.906868 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nt4z8" podUID="fae9fae7-86cf-409f-b61d-e750df25a87f" Sep 9 05:44:09.907018 kubelet[2811]: E0909 05:44:09.906500 2811 pod_workers.go:1301] "Error 
syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-nt4z8" podUID="fae9fae7-86cf-409f-b61d-e750df25a87f" Sep 9 05:44:10.512958 containerd[1568]: time="2025-09-09T05:44:10.512816642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:10.514070 containerd[1568]: time="2025-09-09T05:44:10.514028963Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 05:44:10.515497 containerd[1568]: time="2025-09-09T05:44:10.515453793Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:10.518518 containerd[1568]: time="2025-09-09T05:44:10.518479883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:10.519506 containerd[1568]: time="2025-09-09T05:44:10.519311105Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.446720619s" Sep 9 05:44:10.519506 containerd[1568]: time="2025-09-09T05:44:10.519352719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 9 05:44:10.523585 containerd[1568]: time="2025-09-09T05:44:10.523544680Z" level=info 
msg="CreateContainer within sandbox \"c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 05:44:10.539105 containerd[1568]: time="2025-09-09T05:44:10.536343234Z" level=info msg="Container 690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:10.550467 containerd[1568]: time="2025-09-09T05:44:10.550414477Z" level=info msg="CreateContainer within sandbox \"c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d\"" Sep 9 05:44:10.554026 containerd[1568]: time="2025-09-09T05:44:10.553985873Z" level=info msg="StartContainer for \"690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d\"" Sep 9 05:44:10.558786 containerd[1568]: time="2025-09-09T05:44:10.558738200Z" level=info msg="connecting to shim 690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d" address="unix:///run/containerd/s/f7449390268968ee76d81c8390bf18aab0213f1f55ff2e8b02aa06baf6f01d12" protocol=ttrpc version=3 Sep 9 05:44:10.598425 systemd[1]: Started cri-containerd-690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d.scope - libcontainer container 690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d. Sep 9 05:44:10.663825 containerd[1568]: time="2025-09-09T05:44:10.663694439Z" level=info msg="StartContainer for \"690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d\" returns successfully" Sep 9 05:44:11.666718 systemd[1]: cri-containerd-690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d.scope: Deactivated successfully. Sep 9 05:44:11.667886 systemd[1]: cri-containerd-690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d.scope: Consumed 642ms CPU time, 192.9M memory peak, 171.3M written to disk. 
Sep 9 05:44:11.670162 containerd[1568]: time="2025-09-09T05:44:11.670119661Z" level=info msg="TaskExit event in podsandbox handler container_id:\"690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d\" id:\"690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d\" pid:3604 exited_at:{seconds:1757396651 nanos:668082293}" Sep 9 05:44:11.671521 containerd[1568]: time="2025-09-09T05:44:11.671398959Z" level=info msg="received exit event container_id:\"690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d\" id:\"690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d\" pid:3604 exited_at:{seconds:1757396651 nanos:668082293}" Sep 9 05:44:11.707762 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-690f7de3504c644cfa6afb5e5252b36d09b57a1ee950ab5f92b272ae2869a36d-rootfs.mount: Deactivated successfully. Sep 9 05:44:11.784439 kubelet[2811]: I0909 05:44:11.784392 2811 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 05:44:11.842021 systemd[1]: Created slice kubepods-burstable-pod77d44113_c086_484e_89d4_b7d4844d3cef.slice - libcontainer container kubepods-burstable-pod77d44113_c086_484e_89d4_b7d4844d3cef.slice. 
Sep 9 05:44:11.848995 kubelet[2811]: W0909 05:44:11.845980 2811 reflector.go:569] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' and this object Sep 9 05:44:11.858171 kubelet[2811]: E0909 05:44:11.857905 2811 reflector.go:166] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' and this object" logger="UnhandledError" Sep 9 05:44:11.862390 kubelet[2811]: W0909 05:44:11.861292 2811 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' and this object Sep 9 05:44:11.862390 kubelet[2811]: E0909 05:44:11.861344 2811 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' 
and this object" logger="UnhandledError" Sep 9 05:44:11.862390 kubelet[2811]: W0909 05:44:11.861465 2811 reflector.go:569] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' and this object Sep 9 05:44:11.862390 kubelet[2811]: E0909 05:44:11.861487 2811 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' and this object" logger="UnhandledError" Sep 9 05:44:11.872021 systemd[1]: Created slice kubepods-besteffort-pod8c449465_a3d4_4a22_abbd_fae36370667f.slice - libcontainer container kubepods-besteffort-pod8c449465_a3d4_4a22_abbd_fae36370667f.slice. Sep 9 05:44:11.897975 systemd[1]: Created slice kubepods-besteffort-pod8767d1d9_ea87_40f6_8f9c_15997478a3da.slice - libcontainer container kubepods-besteffort-pod8767d1d9_ea87_40f6_8f9c_15997478a3da.slice. Sep 9 05:44:11.911834 systemd[1]: Created slice kubepods-besteffort-pod86de38b5_26f0_4f51_9314_86aac9313780.slice - libcontainer container kubepods-besteffort-pod86de38b5_26f0_4f51_9314_86aac9313780.slice. 
Sep 9 05:44:11.920477 kubelet[2811]: I0909 05:44:11.920309 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/77d44113-c086-484e-89d4-b7d4844d3cef-config-volume\") pod \"coredns-668d6bf9bc-gz8d9\" (UID: \"77d44113-c086-484e-89d4-b7d4844d3cef\") " pod="kube-system/coredns-668d6bf9bc-gz8d9" Sep 9 05:44:11.922601 systemd[1]: Created slice kubepods-besteffort-pod005a5e53_ed87_45db_9af7_c3f891ffa294.slice - libcontainer container kubepods-besteffort-pod005a5e53_ed87_45db_9af7_c3f891ffa294.slice. Sep 9 05:44:11.923992 kubelet[2811]: I0909 05:44:11.921570 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-backend-key-pair\") pod \"whisker-54794687d4-zw5x6\" (UID: \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\") " pod="calico-system/whisker-54794687d4-zw5x6" Sep 9 05:44:11.923992 kubelet[2811]: I0909 05:44:11.923310 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-ca-bundle\") pod \"whisker-54794687d4-zw5x6\" (UID: \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\") " pod="calico-system/whisker-54794687d4-zw5x6" Sep 9 05:44:11.923992 kubelet[2811]: I0909 05:44:11.923363 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf7hc\" (UniqueName: \"kubernetes.io/projected/a2a152ca-9105-4e9c-9026-f43cca5fbe8f-kube-api-access-rf7hc\") pod \"coredns-668d6bf9bc-2tqqm\" (UID: \"a2a152ca-9105-4e9c-9026-f43cca5fbe8f\") " pod="kube-system/coredns-668d6bf9bc-2tqqm" Sep 9 05:44:11.923992 kubelet[2811]: I0909 05:44:11.923419 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c449465-a3d4-4a22-abbd-fae36370667f-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-mvxm4\" (UID: \"8c449465-a3d4-4a22-abbd-fae36370667f\") " pod="calico-system/goldmane-54d579b49d-mvxm4" Sep 9 05:44:11.923992 kubelet[2811]: I0909 05:44:11.923478 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/86de38b5-26f0-4f51-9314-86aac9313780-calico-apiserver-certs\") pod \"calico-apiserver-7d7857fcf5-gs95d\" (UID: \"86de38b5-26f0-4f51-9314-86aac9313780\") " pod="calico-apiserver/calico-apiserver-7d7857fcf5-gs95d" Sep 9 05:44:11.924323 kubelet[2811]: I0909 05:44:11.923508 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8c449465-a3d4-4a22-abbd-fae36370667f-goldmane-key-pair\") pod \"goldmane-54d579b49d-mvxm4\" (UID: \"8c449465-a3d4-4a22-abbd-fae36370667f\") " pod="calico-system/goldmane-54d579b49d-mvxm4" Sep 9 05:44:11.924323 kubelet[2811]: I0909 05:44:11.923557 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2ggt\" (UniqueName: \"kubernetes.io/projected/8c449465-a3d4-4a22-abbd-fae36370667f-kube-api-access-r2ggt\") pod \"goldmane-54d579b49d-mvxm4\" (UID: \"8c449465-a3d4-4a22-abbd-fae36370667f\") " pod="calico-system/goldmane-54d579b49d-mvxm4" Sep 9 05:44:11.924323 kubelet[2811]: I0909 05:44:11.923594 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a2a152ca-9105-4e9c-9026-f43cca5fbe8f-config-volume\") pod \"coredns-668d6bf9bc-2tqqm\" (UID: \"a2a152ca-9105-4e9c-9026-f43cca5fbe8f\") " pod="kube-system/coredns-668d6bf9bc-2tqqm" Sep 9 05:44:11.924323 kubelet[2811]: I0909 05:44:11.923664 2811 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqkkb\" (UniqueName: \"kubernetes.io/projected/86de38b5-26f0-4f51-9314-86aac9313780-kube-api-access-rqkkb\") pod \"calico-apiserver-7d7857fcf5-gs95d\" (UID: \"86de38b5-26f0-4f51-9314-86aac9313780\") " pod="calico-apiserver/calico-apiserver-7d7857fcf5-gs95d" Sep 9 05:44:11.924323 kubelet[2811]: I0909 05:44:11.923699 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psvdx\" (UniqueName: \"kubernetes.io/projected/005a5e53-ed87-45db-9af7-c3f891ffa294-kube-api-access-psvdx\") pod \"calico-kube-controllers-8c89c8979-s4l22\" (UID: \"005a5e53-ed87-45db-9af7-c3f891ffa294\") " pod="calico-system/calico-kube-controllers-8c89c8979-s4l22" Sep 9 05:44:11.924577 kubelet[2811]: I0909 05:44:11.923753 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c449465-a3d4-4a22-abbd-fae36370667f-config\") pod \"goldmane-54d579b49d-mvxm4\" (UID: \"8c449465-a3d4-4a22-abbd-fae36370667f\") " pod="calico-system/goldmane-54d579b49d-mvxm4" Sep 9 05:44:11.924577 kubelet[2811]: I0909 05:44:11.923806 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8767d1d9-ea87-40f6-8f9c-15997478a3da-calico-apiserver-certs\") pod \"calico-apiserver-7d7857fcf5-mzmsc\" (UID: \"8767d1d9-ea87-40f6-8f9c-15997478a3da\") " pod="calico-apiserver/calico-apiserver-7d7857fcf5-mzmsc" Sep 9 05:44:11.924577 kubelet[2811]: I0909 05:44:11.923838 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqbs7\" (UniqueName: \"kubernetes.io/projected/cf692b7e-548d-45bc-bdd0-baa91f595ecc-kube-api-access-mqbs7\") pod \"whisker-54794687d4-zw5x6\" (UID: \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\") " 
pod="calico-system/whisker-54794687d4-zw5x6" Sep 9 05:44:11.924577 kubelet[2811]: I0909 05:44:11.923867 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bk87\" (UniqueName: \"kubernetes.io/projected/77d44113-c086-484e-89d4-b7d4844d3cef-kube-api-access-6bk87\") pod \"coredns-668d6bf9bc-gz8d9\" (UID: \"77d44113-c086-484e-89d4-b7d4844d3cef\") " pod="kube-system/coredns-668d6bf9bc-gz8d9" Sep 9 05:44:11.924577 kubelet[2811]: I0909 05:44:11.923916 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/005a5e53-ed87-45db-9af7-c3f891ffa294-tigera-ca-bundle\") pod \"calico-kube-controllers-8c89c8979-s4l22\" (UID: \"005a5e53-ed87-45db-9af7-c3f891ffa294\") " pod="calico-system/calico-kube-controllers-8c89c8979-s4l22" Sep 9 05:44:11.924863 kubelet[2811]: I0909 05:44:11.923943 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mlvk\" (UniqueName: \"kubernetes.io/projected/8767d1d9-ea87-40f6-8f9c-15997478a3da-kube-api-access-4mlvk\") pod \"calico-apiserver-7d7857fcf5-mzmsc\" (UID: \"8767d1d9-ea87-40f6-8f9c-15997478a3da\") " pod="calico-apiserver/calico-apiserver-7d7857fcf5-mzmsc" Sep 9 05:44:11.939123 systemd[1]: Created slice kubepods-burstable-poda2a152ca_9105_4e9c_9026_f43cca5fbe8f.slice - libcontainer container kubepods-burstable-poda2a152ca_9105_4e9c_9026_f43cca5fbe8f.slice. Sep 9 05:44:11.953918 systemd[1]: Created slice kubepods-besteffort-podcf692b7e_548d_45bc_bdd0_baa91f595ecc.slice - libcontainer container kubepods-besteffort-podcf692b7e_548d_45bc_bdd0_baa91f595ecc.slice. Sep 9 05:44:11.964272 systemd[1]: Created slice kubepods-besteffort-podfae9fae7_86cf_409f_b61d_e750df25a87f.slice - libcontainer container kubepods-besteffort-podfae9fae7_86cf_409f_b61d_e750df25a87f.slice. 
Sep 9 05:44:11.979671 containerd[1568]: time="2025-09-09T05:44:11.977922937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nt4z8,Uid:fae9fae7-86cf-409f-b61d-e750df25a87f,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:12.159001 containerd[1568]: time="2025-09-09T05:44:12.158940007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gz8d9,Uid:77d44113-c086-484e-89d4-b7d4844d3cef,Namespace:kube-system,Attempt:0,}" Sep 9 05:44:12.234363 containerd[1568]: time="2025-09-09T05:44:12.234208827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8c89c8979-s4l22,Uid:005a5e53-ed87-45db-9af7-c3f891ffa294,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:12.281317 containerd[1568]: time="2025-09-09T05:44:12.281220233Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2tqqm,Uid:a2a152ca-9105-4e9c-9026-f43cca5fbe8f,Namespace:kube-system,Attempt:0,}" Sep 9 05:44:12.290072 containerd[1568]: time="2025-09-09T05:44:12.290013570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54794687d4-zw5x6,Uid:cf692b7e-548d-45bc-bdd0-baa91f595ecc,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:12.507933 containerd[1568]: time="2025-09-09T05:44:12.507498248Z" level=error msg="Failed to destroy network for sandbox \"6f962566ae1f5493c06b57a167d2ec0191c0968afdbdd6c9006083fc52f2eca4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.510960 containerd[1568]: time="2025-09-09T05:44:12.510849970Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54794687d4-zw5x6,Uid:cf692b7e-548d-45bc-bdd0-baa91f595ecc,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f962566ae1f5493c06b57a167d2ec0191c0968afdbdd6c9006083fc52f2eca4\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.512026 kubelet[2811]: E0909 05:44:12.511940 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f962566ae1f5493c06b57a167d2ec0191c0968afdbdd6c9006083fc52f2eca4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.512985 kubelet[2811]: E0909 05:44:12.512065 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f962566ae1f5493c06b57a167d2ec0191c0968afdbdd6c9006083fc52f2eca4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54794687d4-zw5x6" Sep 9 05:44:12.512985 kubelet[2811]: E0909 05:44:12.512100 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f962566ae1f5493c06b57a167d2ec0191c0968afdbdd6c9006083fc52f2eca4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54794687d4-zw5x6" Sep 9 05:44:12.512985 kubelet[2811]: E0909 05:44:12.512162 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-54794687d4-zw5x6_calico-system(cf692b7e-548d-45bc-bdd0-baa91f595ecc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-54794687d4-zw5x6_calico-system(cf692b7e-548d-45bc-bdd0-baa91f595ecc)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"6f962566ae1f5493c06b57a167d2ec0191c0968afdbdd6c9006083fc52f2eca4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-54794687d4-zw5x6" podUID="cf692b7e-548d-45bc-bdd0-baa91f595ecc" Sep 9 05:44:12.532636 containerd[1568]: time="2025-09-09T05:44:12.532516491Z" level=error msg="Failed to destroy network for sandbox \"3967db391ee749d5cb9407a538e5ddda95606fd807e90f2583e0bcf103ca4cb5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.535963 containerd[1568]: time="2025-09-09T05:44:12.535879504Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nt4z8,Uid:fae9fae7-86cf-409f-b61d-e750df25a87f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3967db391ee749d5cb9407a538e5ddda95606fd807e90f2583e0bcf103ca4cb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.537116 kubelet[2811]: E0909 05:44:12.536618 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3967db391ee749d5cb9407a538e5ddda95606fd807e90f2583e0bcf103ca4cb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.537116 kubelet[2811]: E0909 05:44:12.536711 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"3967db391ee749d5cb9407a538e5ddda95606fd807e90f2583e0bcf103ca4cb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nt4z8" Sep 9 05:44:12.537116 kubelet[2811]: E0909 05:44:12.536750 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3967db391ee749d5cb9407a538e5ddda95606fd807e90f2583e0bcf103ca4cb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-nt4z8" Sep 9 05:44:12.538064 kubelet[2811]: E0909 05:44:12.536829 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-nt4z8_calico-system(fae9fae7-86cf-409f-b61d-e750df25a87f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-nt4z8_calico-system(fae9fae7-86cf-409f-b61d-e750df25a87f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3967db391ee749d5cb9407a538e5ddda95606fd807e90f2583e0bcf103ca4cb5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-nt4z8" podUID="fae9fae7-86cf-409f-b61d-e750df25a87f" Sep 9 05:44:12.542458 containerd[1568]: time="2025-09-09T05:44:12.542407120Z" level=error msg="Failed to destroy network for sandbox \"192d084f8bb1f60ff3e43e83147db4325c30f0014068ecdee62290b28ac08382\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.546686 
containerd[1568]: time="2025-09-09T05:44:12.546249083Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gz8d9,Uid:77d44113-c086-484e-89d4-b7d4844d3cef,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"192d084f8bb1f60ff3e43e83147db4325c30f0014068ecdee62290b28ac08382\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.547535 kubelet[2811]: E0909 05:44:12.547469 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"192d084f8bb1f60ff3e43e83147db4325c30f0014068ecdee62290b28ac08382\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.547659 kubelet[2811]: E0909 05:44:12.547554 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"192d084f8bb1f60ff3e43e83147db4325c30f0014068ecdee62290b28ac08382\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gz8d9" Sep 9 05:44:12.547659 kubelet[2811]: E0909 05:44:12.547586 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"192d084f8bb1f60ff3e43e83147db4325c30f0014068ecdee62290b28ac08382\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gz8d9" Sep 9 05:44:12.549199 
kubelet[2811]: E0909 05:44:12.548247 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-gz8d9_kube-system(77d44113-c086-484e-89d4-b7d4844d3cef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gz8d9_kube-system(77d44113-c086-484e-89d4-b7d4844d3cef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"192d084f8bb1f60ff3e43e83147db4325c30f0014068ecdee62290b28ac08382\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gz8d9" podUID="77d44113-c086-484e-89d4-b7d4844d3cef" Sep 9 05:44:12.554035 containerd[1568]: time="2025-09-09T05:44:12.553977984Z" level=error msg="Failed to destroy network for sandbox \"11fedaa71d855be24d172da74dd83472d35ed0dea56522cc70b5e1e88e02fc91\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.556310 containerd[1568]: time="2025-09-09T05:44:12.556246504Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2tqqm,Uid:a2a152ca-9105-4e9c-9026-f43cca5fbe8f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"11fedaa71d855be24d172da74dd83472d35ed0dea56522cc70b5e1e88e02fc91\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.556660 kubelet[2811]: E0909 05:44:12.556618 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"11fedaa71d855be24d172da74dd83472d35ed0dea56522cc70b5e1e88e02fc91\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.556760 kubelet[2811]: E0909 05:44:12.556686 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11fedaa71d855be24d172da74dd83472d35ed0dea56522cc70b5e1e88e02fc91\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2tqqm" Sep 9 05:44:12.556760 kubelet[2811]: E0909 05:44:12.556716 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"11fedaa71d855be24d172da74dd83472d35ed0dea56522cc70b5e1e88e02fc91\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-2tqqm" Sep 9 05:44:12.556978 kubelet[2811]: E0909 05:44:12.556931 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-2tqqm_kube-system(a2a152ca-9105-4e9c-9026-f43cca5fbe8f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-2tqqm_kube-system(a2a152ca-9105-4e9c-9026-f43cca5fbe8f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"11fedaa71d855be24d172da74dd83472d35ed0dea56522cc70b5e1e88e02fc91\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-2tqqm" 
podUID="a2a152ca-9105-4e9c-9026-f43cca5fbe8f" Sep 9 05:44:12.566345 containerd[1568]: time="2025-09-09T05:44:12.566274353Z" level=error msg="Failed to destroy network for sandbox \"07e8e5dac915180058e1bae5976f5db0ee241c92b38ce3bd9b7016ac6bfbf021\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.568085 containerd[1568]: time="2025-09-09T05:44:12.567937581Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8c89c8979-s4l22,Uid:005a5e53-ed87-45db-9af7-c3f891ffa294,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"07e8e5dac915180058e1bae5976f5db0ee241c92b38ce3bd9b7016ac6bfbf021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.568311 kubelet[2811]: E0909 05:44:12.568261 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"07e8e5dac915180058e1bae5976f5db0ee241c92b38ce3bd9b7016ac6bfbf021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:12.568392 kubelet[2811]: E0909 05:44:12.568333 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"07e8e5dac915180058e1bae5976f5db0ee241c92b38ce3bd9b7016ac6bfbf021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8c89c8979-s4l22" Sep 9 05:44:12.568392 
kubelet[2811]: E0909 05:44:12.568373 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"07e8e5dac915180058e1bae5976f5db0ee241c92b38ce3bd9b7016ac6bfbf021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8c89c8979-s4l22" Sep 9 05:44:12.568511 kubelet[2811]: E0909 05:44:12.568429 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8c89c8979-s4l22_calico-system(005a5e53-ed87-45db-9af7-c3f891ffa294)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8c89c8979-s4l22_calico-system(005a5e53-ed87-45db-9af7-c3f891ffa294)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"07e8e5dac915180058e1bae5976f5db0ee241c92b38ce3bd9b7016ac6bfbf021\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8c89c8979-s4l22" podUID="005a5e53-ed87-45db-9af7-c3f891ffa294" Sep 9 05:44:12.712976 systemd[1]: run-netns-cni\x2d562503c5\x2ddf3a\x2d1e42\x2de0e1\x2db20fcee244b2.mount: Deactivated successfully. Sep 9 05:44:13.051383 kubelet[2811]: E0909 05:44:13.051319 2811 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Sep 9 05:44:13.052386 kubelet[2811]: E0909 05:44:13.051457 2811 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86de38b5-26f0-4f51-9314-86aac9313780-calico-apiserver-certs podName:86de38b5-26f0-4f51-9314-86aac9313780 nodeName:}" failed. 
No retries permitted until 2025-09-09 05:44:13.551423491 +0000 UTC m=+34.835971513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/86de38b5-26f0-4f51-9314-86aac9313780-calico-apiserver-certs") pod "calico-apiserver-7d7857fcf5-gs95d" (UID: "86de38b5-26f0-4f51-9314-86aac9313780") : failed to sync secret cache: timed out waiting for the condition Sep 9 05:44:13.052386 kubelet[2811]: E0909 05:44:13.051968 2811 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Sep 9 05:44:13.052386 kubelet[2811]: E0909 05:44:13.052079 2811 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8767d1d9-ea87-40f6-8f9c-15997478a3da-calico-apiserver-certs podName:8767d1d9-ea87-40f6-8f9c-15997478a3da nodeName:}" failed. No retries permitted until 2025-09-09 05:44:13.552055776 +0000 UTC m=+34.836603789 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/8767d1d9-ea87-40f6-8f9c-15997478a3da-calico-apiserver-certs") pod "calico-apiserver-7d7857fcf5-mzmsc" (UID: "8767d1d9-ea87-40f6-8f9c-15997478a3da") : failed to sync secret cache: timed out waiting for the condition Sep 9 05:44:13.053571 kubelet[2811]: E0909 05:44:13.053494 2811 secret.go:189] Couldn't get secret calico-system/goldmane-key-pair: failed to sync secret cache: timed out waiting for the condition Sep 9 05:44:13.053752 kubelet[2811]: E0909 05:44:13.053657 2811 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8c449465-a3d4-4a22-abbd-fae36370667f-goldmane-key-pair podName:8c449465-a3d4-4a22-abbd-fae36370667f nodeName:}" failed. No retries permitted until 2025-09-09 05:44:13.553632815 +0000 UTC m=+34.838180835 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "goldmane-key-pair" (UniqueName: "kubernetes.io/secret/8c449465-a3d4-4a22-abbd-fae36370667f-goldmane-key-pair") pod "goldmane-54d579b49d-mvxm4" (UID: "8c449465-a3d4-4a22-abbd-fae36370667f") : failed to sync secret cache: timed out waiting for the condition Sep 9 05:44:13.054633 kubelet[2811]: E0909 05:44:13.054594 2811 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 9 05:44:13.054633 kubelet[2811]: E0909 05:44:13.054628 2811 projected.go:194] Error preparing data for projected volume kube-api-access-4mlvk for pod calico-apiserver/calico-apiserver-7d7857fcf5-mzmsc: failed to sync configmap cache: timed out waiting for the condition Sep 9 05:44:13.054784 kubelet[2811]: E0909 05:44:13.054696 2811 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8767d1d9-ea87-40f6-8f9c-15997478a3da-kube-api-access-4mlvk podName:8767d1d9-ea87-40f6-8f9c-15997478a3da nodeName:}" failed. No retries permitted until 2025-09-09 05:44:13.554678292 +0000 UTC m=+34.839226315 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4mlvk" (UniqueName: "kubernetes.io/projected/8767d1d9-ea87-40f6-8f9c-15997478a3da-kube-api-access-4mlvk") pod "calico-apiserver-7d7857fcf5-mzmsc" (UID: "8767d1d9-ea87-40f6-8f9c-15997478a3da") : failed to sync configmap cache: timed out waiting for the condition Sep 9 05:44:13.101828 kubelet[2811]: E0909 05:44:13.101705 2811 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 9 05:44:13.101828 kubelet[2811]: E0909 05:44:13.101833 2811 projected.go:194] Error preparing data for projected volume kube-api-access-rqkkb for pod calico-apiserver/calico-apiserver-7d7857fcf5-gs95d: failed to sync configmap cache: timed out waiting for the condition Sep 9 05:44:13.102102 kubelet[2811]: E0909 05:44:13.101942 2811 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/86de38b5-26f0-4f51-9314-86aac9313780-kube-api-access-rqkkb podName:86de38b5-26f0-4f51-9314-86aac9313780 nodeName:}" failed. No retries permitted until 2025-09-09 05:44:13.601916375 +0000 UTC m=+34.886464391 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rqkkb" (UniqueName: "kubernetes.io/projected/86de38b5-26f0-4f51-9314-86aac9313780-kube-api-access-rqkkb") pod "calico-apiserver-7d7857fcf5-gs95d" (UID: "86de38b5-26f0-4f51-9314-86aac9313780") : failed to sync configmap cache: timed out waiting for the condition Sep 9 05:44:13.111672 containerd[1568]: time="2025-09-09T05:44:13.111625053Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 05:44:13.688947 containerd[1568]: time="2025-09-09T05:44:13.688896543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mvxm4,Uid:8c449465-a3d4-4a22-abbd-fae36370667f,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:13.712022 containerd[1568]: time="2025-09-09T05:44:13.711641344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-mzmsc,Uid:8767d1d9-ea87-40f6-8f9c-15997478a3da,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:44:13.732956 containerd[1568]: time="2025-09-09T05:44:13.732902212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-gs95d,Uid:86de38b5-26f0-4f51-9314-86aac9313780,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:44:13.838910 containerd[1568]: time="2025-09-09T05:44:13.838835662Z" level=error msg="Failed to destroy network for sandbox \"ba8589d5461c49f87694eab485d07a55a3e2acb9333ed77ba41c1fd6c0e57a9d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.841127 containerd[1568]: time="2025-09-09T05:44:13.840990874Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mvxm4,Uid:8c449465-a3d4-4a22-abbd-fae36370667f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ba8589d5461c49f87694eab485d07a55a3e2acb9333ed77ba41c1fd6c0e57a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.848744 kubelet[2811]: E0909 05:44:13.848206 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba8589d5461c49f87694eab485d07a55a3e2acb9333ed77ba41c1fd6c0e57a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.848744 kubelet[2811]: E0909 05:44:13.848298 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba8589d5461c49f87694eab485d07a55a3e2acb9333ed77ba41c1fd6c0e57a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-mvxm4" Sep 9 05:44:13.848744 kubelet[2811]: E0909 05:44:13.848333 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba8589d5461c49f87694eab485d07a55a3e2acb9333ed77ba41c1fd6c0e57a9d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-mvxm4" Sep 9 05:44:13.848995 kubelet[2811]: E0909 05:44:13.848403 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-mvxm4_calico-system(8c449465-a3d4-4a22-abbd-fae36370667f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"goldmane-54d579b49d-mvxm4_calico-system(8c449465-a3d4-4a22-abbd-fae36370667f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ba8589d5461c49f87694eab485d07a55a3e2acb9333ed77ba41c1fd6c0e57a9d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-mvxm4" podUID="8c449465-a3d4-4a22-abbd-fae36370667f" Sep 9 05:44:13.850294 systemd[1]: run-netns-cni\x2d2acc8231\x2de5a8\x2d0416\x2d44d8\x2dcf0f41e3d2f6.mount: Deactivated successfully. Sep 9 05:44:13.895765 containerd[1568]: time="2025-09-09T05:44:13.895685809Z" level=error msg="Failed to destroy network for sandbox \"cd01b96f02608a7c4429578dd5c92c5c13c71515d7c66fb28f4bfebbf2630f0c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.901012 containerd[1568]: time="2025-09-09T05:44:13.900901735Z" level=error msg="Failed to destroy network for sandbox \"d56a3be418ac44c7df41ddd00a030a3984e7818876fe955e42610ec8b4b34b2c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.901468 containerd[1568]: time="2025-09-09T05:44:13.901329385Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-gs95d,Uid:86de38b5-26f0-4f51-9314-86aac9313780,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd01b96f02608a7c4429578dd5c92c5c13c71515d7c66fb28f4bfebbf2630f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 9 05:44:13.902863 kubelet[2811]: E0909 05:44:13.902797 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd01b96f02608a7c4429578dd5c92c5c13c71515d7c66fb28f4bfebbf2630f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.902988 kubelet[2811]: E0909 05:44:13.902889 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd01b96f02608a7c4429578dd5c92c5c13c71515d7c66fb28f4bfebbf2630f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d7857fcf5-gs95d" Sep 9 05:44:13.902988 kubelet[2811]: E0909 05:44:13.902932 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd01b96f02608a7c4429578dd5c92c5c13c71515d7c66fb28f4bfebbf2630f0c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d7857fcf5-gs95d" Sep 9 05:44:13.903112 kubelet[2811]: E0909 05:44:13.902991 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d7857fcf5-gs95d_calico-apiserver(86de38b5-26f0-4f51-9314-86aac9313780)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d7857fcf5-gs95d_calico-apiserver(86de38b5-26f0-4f51-9314-86aac9313780)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"cd01b96f02608a7c4429578dd5c92c5c13c71515d7c66fb28f4bfebbf2630f0c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d7857fcf5-gs95d" podUID="86de38b5-26f0-4f51-9314-86aac9313780" Sep 9 05:44:13.904801 systemd[1]: run-netns-cni\x2d08fc2df9\x2ddf84\x2d0a33\x2d8cdd\x2d458c38f247f2.mount: Deactivated successfully. Sep 9 05:44:13.905856 containerd[1568]: time="2025-09-09T05:44:13.905623047Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-mzmsc,Uid:8767d1d9-ea87-40f6-8f9c-15997478a3da,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d56a3be418ac44c7df41ddd00a030a3984e7818876fe955e42610ec8b4b34b2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.906563 kubelet[2811]: E0909 05:44:13.906053 2811 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d56a3be418ac44c7df41ddd00a030a3984e7818876fe955e42610ec8b4b34b2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:44:13.906563 kubelet[2811]: E0909 05:44:13.906136 2811 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d56a3be418ac44c7df41ddd00a030a3984e7818876fe955e42610ec8b4b34b2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-7d7857fcf5-mzmsc" Sep 9 05:44:13.907635 kubelet[2811]: E0909 05:44:13.906168 2811 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d56a3be418ac44c7df41ddd00a030a3984e7818876fe955e42610ec8b4b34b2c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d7857fcf5-mzmsc" Sep 9 05:44:13.912082 kubelet[2811]: E0909 05:44:13.911800 2811 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d7857fcf5-mzmsc_calico-apiserver(8767d1d9-ea87-40f6-8f9c-15997478a3da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d7857fcf5-mzmsc_calico-apiserver(8767d1d9-ea87-40f6-8f9c-15997478a3da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d56a3be418ac44c7df41ddd00a030a3984e7818876fe955e42610ec8b4b34b2c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d7857fcf5-mzmsc" podUID="8767d1d9-ea87-40f6-8f9c-15997478a3da" Sep 9 05:44:13.915573 systemd[1]: run-netns-cni\x2d54d7d450\x2de24e\x2d0a67\x2d8a26\x2df5ea496e0638.mount: Deactivated successfully. Sep 9 05:44:18.973222 kubelet[2811]: I0909 05:44:18.973160 2811 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:44:20.079531 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount969220903.mount: Deactivated successfully. 
Sep 9 05:44:20.109064 containerd[1568]: time="2025-09-09T05:44:20.108987205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:20.111050 containerd[1568]: time="2025-09-09T05:44:20.110991983Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 05:44:20.112654 containerd[1568]: time="2025-09-09T05:44:20.112603438Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:20.115681 containerd[1568]: time="2025-09-09T05:44:20.115611001Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:20.116675 containerd[1568]: time="2025-09-09T05:44:20.116548970Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.004866214s" Sep 9 05:44:20.116793 containerd[1568]: time="2025-09-09T05:44:20.116685081Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 05:44:20.130766 containerd[1568]: time="2025-09-09T05:44:20.130724825Z" level=info msg="CreateContainer within sandbox \"c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 05:44:20.144990 containerd[1568]: time="2025-09-09T05:44:20.144530733Z" level=info msg="Container 
847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:20.165667 containerd[1568]: time="2025-09-09T05:44:20.165599310Z" level=info msg="CreateContainer within sandbox \"c0aec4611eb67baa667b505e68308c7e4d1f3a86ca7d2a84562fbc2c54dd0667\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658\"" Sep 9 05:44:20.166581 containerd[1568]: time="2025-09-09T05:44:20.166541517Z" level=info msg="StartContainer for \"847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658\"" Sep 9 05:44:20.168869 containerd[1568]: time="2025-09-09T05:44:20.168825497Z" level=info msg="connecting to shim 847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658" address="unix:///run/containerd/s/f7449390268968ee76d81c8390bf18aab0213f1f55ff2e8b02aa06baf6f01d12" protocol=ttrpc version=3 Sep 9 05:44:20.204413 systemd[1]: Started cri-containerd-847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658.scope - libcontainer container 847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658. Sep 9 05:44:20.268149 containerd[1568]: time="2025-09-09T05:44:20.268100834Z" level=info msg="StartContainer for \"847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658\" returns successfully" Sep 9 05:44:20.390500 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 05:44:20.390670 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 05:44:20.609051 kubelet[2811]: I0909 05:44:20.608998 2811 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-ca-bundle\") pod \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\" (UID: \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\") " Sep 9 05:44:20.609752 kubelet[2811]: I0909 05:44:20.609090 2811 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-backend-key-pair\") pod \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\" (UID: \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\") " Sep 9 05:44:20.609752 kubelet[2811]: I0909 05:44:20.609125 2811 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqbs7\" (UniqueName: \"kubernetes.io/projected/cf692b7e-548d-45bc-bdd0-baa91f595ecc-kube-api-access-mqbs7\") pod \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\" (UID: \"cf692b7e-548d-45bc-bdd0-baa91f595ecc\") " Sep 9 05:44:20.612214 kubelet[2811]: I0909 05:44:20.610743 2811 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "cf692b7e-548d-45bc-bdd0-baa91f595ecc" (UID: "cf692b7e-548d-45bc-bdd0-baa91f595ecc"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 05:44:20.618622 kubelet[2811]: I0909 05:44:20.618576 2811 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf692b7e-548d-45bc-bdd0-baa91f595ecc-kube-api-access-mqbs7" (OuterVolumeSpecName: "kube-api-access-mqbs7") pod "cf692b7e-548d-45bc-bdd0-baa91f595ecc" (UID: "cf692b7e-548d-45bc-bdd0-baa91f595ecc"). InnerVolumeSpecName "kube-api-access-mqbs7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 05:44:20.620254 kubelet[2811]: I0909 05:44:20.620111 2811 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "cf692b7e-548d-45bc-bdd0-baa91f595ecc" (UID: "cf692b7e-548d-45bc-bdd0-baa91f595ecc"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 05:44:20.710265 kubelet[2811]: I0909 05:44:20.710207 2811 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-backend-key-pair\") on node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" DevicePath \"\"" Sep 9 05:44:20.710265 kubelet[2811]: I0909 05:44:20.710250 2811 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqbs7\" (UniqueName: \"kubernetes.io/projected/cf692b7e-548d-45bc-bdd0-baa91f595ecc-kube-api-access-mqbs7\") on node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" DevicePath \"\"" Sep 9 05:44:20.710265 kubelet[2811]: I0909 05:44:20.710271 2811 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf692b7e-548d-45bc-bdd0-baa91f595ecc-whisker-ca-bundle\") on node \"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023\" DevicePath \"\"" Sep 9 05:44:20.916522 systemd[1]: Removed slice kubepods-besteffort-podcf692b7e_548d_45bc_bdd0_baa91f595ecc.slice - libcontainer container kubepods-besteffort-podcf692b7e_548d_45bc_bdd0_baa91f595ecc.slice. Sep 9 05:44:21.077428 systemd[1]: var-lib-kubelet-pods-cf692b7e\x2d548d\x2d45bc\x2dbdd0\x2dbaa91f595ecc-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmqbs7.mount: Deactivated successfully. 
Sep 9 05:44:21.077595 systemd[1]: var-lib-kubelet-pods-cf692b7e\x2d548d\x2d45bc\x2dbdd0\x2dbaa91f595ecc-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 05:44:21.164775 kubelet[2811]: I0909 05:44:21.164511 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-x7xf4" podStartSLOduration=1.933193321 podStartE2EDuration="20.164487228s" podCreationTimestamp="2025-09-09 05:44:01 +0000 UTC" firstStartedPulling="2025-09-09 05:44:01.886714921 +0000 UTC m=+23.171262932" lastFinishedPulling="2025-09-09 05:44:20.118008817 +0000 UTC m=+41.402556839" observedRunningTime="2025-09-09 05:44:21.162664263 +0000 UTC m=+42.447212290" watchObservedRunningTime="2025-09-09 05:44:21.164487228 +0000 UTC m=+42.449035248" Sep 9 05:44:21.239717 systemd[1]: Created slice kubepods-besteffort-pod3943ccb9_9943_4579_8802_70240cb69bf1.slice - libcontainer container kubepods-besteffort-pod3943ccb9_9943_4579_8802_70240cb69bf1.slice. 
Sep 9 05:44:21.316568 kubelet[2811]: I0909 05:44:21.316510 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3943ccb9-9943-4579-8802-70240cb69bf1-whisker-backend-key-pair\") pod \"whisker-697798bcff-44slw\" (UID: \"3943ccb9-9943-4579-8802-70240cb69bf1\") " pod="calico-system/whisker-697798bcff-44slw" Sep 9 05:44:21.316568 kubelet[2811]: I0909 05:44:21.316587 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrt46\" (UniqueName: \"kubernetes.io/projected/3943ccb9-9943-4579-8802-70240cb69bf1-kube-api-access-vrt46\") pod \"whisker-697798bcff-44slw\" (UID: \"3943ccb9-9943-4579-8802-70240cb69bf1\") " pod="calico-system/whisker-697798bcff-44slw" Sep 9 05:44:21.316835 kubelet[2811]: I0909 05:44:21.316628 2811 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3943ccb9-9943-4579-8802-70240cb69bf1-whisker-ca-bundle\") pod \"whisker-697798bcff-44slw\" (UID: \"3943ccb9-9943-4579-8802-70240cb69bf1\") " pod="calico-system/whisker-697798bcff-44slw" Sep 9 05:44:21.551243 containerd[1568]: time="2025-09-09T05:44:21.551028843Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-697798bcff-44slw,Uid:3943ccb9-9943-4579-8802-70240cb69bf1,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:21.706158 systemd-networkd[1468]: califaa58d7f49d: Link UP Sep 9 05:44:21.706496 systemd-networkd[1468]: califaa58d7f49d: Gained carrier Sep 9 05:44:21.730402 containerd[1568]: 2025-09-09 05:44:21.589 [INFO][3935] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:44:21.730402 containerd[1568]: 2025-09-09 05:44:21.602 [INFO][3935] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0 whisker-697798bcff- calico-system 3943ccb9-9943-4579-8802-70240cb69bf1 912 0 2025-09-09 05:44:21 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:697798bcff projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 whisker-697798bcff-44slw eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] califaa58d7f49d [] [] }} ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-" Sep 9 05:44:21.730402 containerd[1568]: 2025-09-09 05:44:21.602 [INFO][3935] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" Sep 9 05:44:21.730402 containerd[1568]: 2025-09-09 05:44:21.642 [INFO][3946] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" HandleID="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.642 [INFO][3946] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" HandleID="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" 
Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"whisker-697798bcff-44slw", "timestamp":"2025-09-09 05:44:21.64276498 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.643 [INFO][3946] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.643 [INFO][3946] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.643 [INFO][3946] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.652 [INFO][3946] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.658 [INFO][3946] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.668 [INFO][3946] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.730774 containerd[1568]: 2025-09-09 05:44:21.671 [INFO][3946] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 
host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.674 [INFO][3946] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.674 [INFO][3946] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.107.128/26 handle="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.676 [INFO][3946] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5 Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.682 [INFO][3946] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 handle="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.688 [INFO][3946] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.129/26] block=192.168.107.128/26 handle="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.689 [INFO][3946] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.129/26] handle="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.690 [INFO][3946] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:44:21.731282 containerd[1568]: 2025-09-09 05:44:21.690 [INFO][3946] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.129/26] IPv6=[] ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" HandleID="k8s-pod-network.51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" Sep 9 05:44:21.732107 containerd[1568]: 2025-09-09 05:44:21.694 [INFO][3935] cni-plugin/k8s.go 418: Populated endpoint ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0", GenerateName:"whisker-697798bcff-", Namespace:"calico-system", SelfLink:"", UID:"3943ccb9-9943-4579-8802-70240cb69bf1", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"697798bcff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"", Pod:"whisker-697798bcff-44slw", Endpoint:"eth0", ServiceAccountName:"whisker", 
IPNetworks:[]string{"192.168.107.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"califaa58d7f49d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:21.732309 containerd[1568]: 2025-09-09 05:44:21.694 [INFO][3935] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.129/32] ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" Sep 9 05:44:21.732309 containerd[1568]: 2025-09-09 05:44:21.694 [INFO][3935] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califaa58d7f49d ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" Sep 9 05:44:21.732309 containerd[1568]: 2025-09-09 05:44:21.705 [INFO][3935] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" Sep 9 05:44:21.732679 containerd[1568]: 2025-09-09 05:44:21.708 [INFO][3935] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0", GenerateName:"whisker-697798bcff-", Namespace:"calico-system", SelfLink:"", UID:"3943ccb9-9943-4579-8802-70240cb69bf1", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"697798bcff", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5", Pod:"whisker-697798bcff-44slw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.107.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"califaa58d7f49d", MAC:"02:e9:81:be:6c:ee", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:21.732910 containerd[1568]: 2025-09-09 05:44:21.725 [INFO][3935] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" Namespace="calico-system" Pod="whisker-697798bcff-44slw" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-whisker--697798bcff--44slw-eth0" Sep 9 05:44:21.770409 containerd[1568]: 
time="2025-09-09T05:44:21.770226761Z" level=info msg="connecting to shim 51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5" address="unix:///run/containerd/s/2dcd8d0b79100ff5a29d778786117e642868ee3093d07fb3fbacbf9329e844e5" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:21.808428 systemd[1]: Started cri-containerd-51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5.scope - libcontainer container 51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5. Sep 9 05:44:21.956412 containerd[1568]: time="2025-09-09T05:44:21.956295828Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-697798bcff-44slw,Uid:3943ccb9-9943-4579-8802-70240cb69bf1,Namespace:calico-system,Attempt:0,} returns sandbox id \"51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5\"" Sep 9 05:44:21.960601 containerd[1568]: time="2025-09-09T05:44:21.960542718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 05:44:22.775453 systemd-networkd[1468]: vxlan.calico: Link UP Sep 9 05:44:22.775494 systemd-networkd[1468]: vxlan.calico: Gained carrier Sep 9 05:44:22.915522 kubelet[2811]: I0909 05:44:22.915237 2811 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf692b7e-548d-45bc-bdd0-baa91f595ecc" path="/var/lib/kubelet/pods/cf692b7e-548d-45bc-bdd0-baa91f595ecc/volumes" Sep 9 05:44:23.005203 systemd-networkd[1468]: califaa58d7f49d: Gained IPv6LL Sep 9 05:44:23.101461 containerd[1568]: time="2025-09-09T05:44:23.101394959Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:23.104212 containerd[1568]: time="2025-09-09T05:44:23.103929051Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 05:44:23.107210 containerd[1568]: time="2025-09-09T05:44:23.107101832Z" level=info msg="ImageCreate event 
name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:23.111782 containerd[1568]: time="2025-09-09T05:44:23.111716622Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:23.114263 containerd[1568]: time="2025-09-09T05:44:23.114089370Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.152832301s" Sep 9 05:44:23.114263 containerd[1568]: time="2025-09-09T05:44:23.114136181Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 05:44:23.118266 containerd[1568]: time="2025-09-09T05:44:23.117824050Z" level=info msg="CreateContainer within sandbox \"51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 05:44:23.133624 containerd[1568]: time="2025-09-09T05:44:23.133567907Z" level=info msg="Container 63f4d66bfa5df2f5d94ceda6257c6cf2d5c04ba8995cd4e20c9cc0c77732505c: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:23.149741 containerd[1568]: time="2025-09-09T05:44:23.149676836Z" level=info msg="CreateContainer within sandbox \"51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"63f4d66bfa5df2f5d94ceda6257c6cf2d5c04ba8995cd4e20c9cc0c77732505c\"" Sep 9 05:44:23.153126 containerd[1568]: time="2025-09-09T05:44:23.153077281Z" 
level=info msg="StartContainer for \"63f4d66bfa5df2f5d94ceda6257c6cf2d5c04ba8995cd4e20c9cc0c77732505c\"" Sep 9 05:44:23.161133 containerd[1568]: time="2025-09-09T05:44:23.161070939Z" level=info msg="connecting to shim 63f4d66bfa5df2f5d94ceda6257c6cf2d5c04ba8995cd4e20c9cc0c77732505c" address="unix:///run/containerd/s/2dcd8d0b79100ff5a29d778786117e642868ee3093d07fb3fbacbf9329e844e5" protocol=ttrpc version=3 Sep 9 05:44:23.213431 systemd[1]: Started cri-containerd-63f4d66bfa5df2f5d94ceda6257c6cf2d5c04ba8995cd4e20c9cc0c77732505c.scope - libcontainer container 63f4d66bfa5df2f5d94ceda6257c6cf2d5c04ba8995cd4e20c9cc0c77732505c. Sep 9 05:44:23.308474 containerd[1568]: time="2025-09-09T05:44:23.308421112Z" level=info msg="StartContainer for \"63f4d66bfa5df2f5d94ceda6257c6cf2d5c04ba8995cd4e20c9cc0c77732505c\" returns successfully" Sep 9 05:44:23.312840 containerd[1568]: time="2025-09-09T05:44:23.312789556Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 05:44:24.796980 systemd-networkd[1468]: vxlan.calico: Gained IPv6LL Sep 9 05:44:24.908999 containerd[1568]: time="2025-09-09T05:44:24.908703582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gz8d9,Uid:77d44113-c086-484e-89d4-b7d4844d3cef,Namespace:kube-system,Attempt:0,}" Sep 9 05:44:24.912294 containerd[1568]: time="2025-09-09T05:44:24.912240186Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mvxm4,Uid:8c449465-a3d4-4a22-abbd-fae36370667f,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:25.228211 systemd-networkd[1468]: calie4fc33b6f0b: Link UP Sep 9 05:44:25.233840 systemd-networkd[1468]: calie4fc33b6f0b: Gained carrier Sep 9 05:44:25.281948 containerd[1568]: 2025-09-09 05:44:25.045 [INFO][4245] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0 coredns-668d6bf9bc- 
kube-system 77d44113-c086-484e-89d4-b7d4844d3cef 823 0 2025-09-09 05:43:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 coredns-668d6bf9bc-gz8d9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie4fc33b6f0b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-" Sep 9 05:44:25.281948 containerd[1568]: 2025-09-09 05:44:25.046 [INFO][4245] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" Sep 9 05:44:25.281948 containerd[1568]: 2025-09-09 05:44:25.131 [INFO][4269] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" HandleID="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.132 [INFO][4269] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" HandleID="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, 
Num6:0, HandleID:(*string)(0xc000315b10), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"coredns-668d6bf9bc-gz8d9", "timestamp":"2025-09-09 05:44:25.131769627 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.132 [INFO][4269] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.132 [INFO][4269] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.133 [INFO][4269] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.149 [INFO][4269] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.159 [INFO][4269] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.169 [INFO][4269] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.282933 containerd[1568]: 2025-09-09 05:44:25.171 [INFO][4269] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.175 [INFO][4269] ipam/ipam.go 235: Affinity is confirmed and block has 
been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.175 [INFO][4269] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.107.128/26 handle="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.178 [INFO][4269] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.187 [INFO][4269] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 handle="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.203 [INFO][4269] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.130/26] block=192.168.107.128/26 handle="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.204 [INFO][4269] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.130/26] handle="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.204 [INFO][4269] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:44:25.284826 containerd[1568]: 2025-09-09 05:44:25.204 [INFO][4269] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.130/26] IPv6=[] ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" HandleID="k8s-pod-network.425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" Sep 9 05:44:25.286265 containerd[1568]: 2025-09-09 05:44:25.216 [INFO][4245] cni-plugin/k8s.go 418: Populated endpoint ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"77d44113-c086-484e-89d4-b7d4844d3cef", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"", Pod:"coredns-668d6bf9bc-gz8d9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.130/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie4fc33b6f0b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:25.286265 containerd[1568]: 2025-09-09 05:44:25.217 [INFO][4245] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.130/32] ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" Sep 9 05:44:25.286265 containerd[1568]: 2025-09-09 05:44:25.218 [INFO][4245] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie4fc33b6f0b ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" Sep 9 05:44:25.286265 containerd[1568]: 2025-09-09 05:44:25.237 [INFO][4245] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" Sep 9 05:44:25.286265 containerd[1568]: 2025-09-09 05:44:25.241 [INFO][4245] cni-plugin/k8s.go 446: Added 
Mac, interface name, and active container ID to endpoint ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"77d44113-c086-484e-89d4-b7d4844d3cef", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee", Pod:"coredns-668d6bf9bc-gz8d9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie4fc33b6f0b", MAC:"82:7e:66:1e:70:5f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, 
Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:25.286265 containerd[1568]: 2025-09-09 05:44:25.272 [INFO][4245] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" Namespace="kube-system" Pod="coredns-668d6bf9bc-gz8d9" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--gz8d9-eth0" Sep 9 05:44:25.366379 systemd-networkd[1468]: califeeaba070c7: Link UP Sep 9 05:44:25.368359 systemd-networkd[1468]: califeeaba070c7: Gained carrier Sep 9 05:44:25.384322 containerd[1568]: time="2025-09-09T05:44:25.384251435Z" level=info msg="connecting to shim 425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee" address="unix:///run/containerd/s/07373091ecd4e77426488a1d62c48ab36d3a2f0791209915bcc1a9444e7344cc" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.052 [INFO][4243] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0 goldmane-54d579b49d- calico-system 8c449465-a3d4-4a22-abbd-fae36370667f 833 0 2025-09-09 05:44:00 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 goldmane-54d579b49d-mvxm4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califeeaba070c7 [] [] }} ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" 
Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.053 [INFO][4243] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.159 [INFO][4271] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" HandleID="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.160 [INFO][4271] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" HandleID="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cb870), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"goldmane-54d579b49d-mvxm4", "timestamp":"2025-09-09 05:44:25.159740163 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 
05:44:25.160 [INFO][4271] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.204 [INFO][4271] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.204 [INFO][4271] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.250 [INFO][4271] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.264 [INFO][4271] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.279 [INFO][4271] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.286 [INFO][4271] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.297 [INFO][4271] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.297 [INFO][4271] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.107.128/26 handle="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.305 [INFO][4271] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.317 [INFO][4271] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 handle="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.339 [INFO][4271] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.131/26] block=192.168.107.128/26 handle="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.339 [INFO][4271] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.131/26] handle="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.339 [INFO][4271] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:44:25.411341 containerd[1568]: 2025-09-09 05:44:25.339 [INFO][4271] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.131/26] IPv6=[] ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" HandleID="k8s-pod-network.0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" Sep 9 05:44:25.413314 containerd[1568]: 2025-09-09 05:44:25.346 [INFO][4243] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8c449465-a3d4-4a22-abbd-fae36370667f", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"", Pod:"goldmane-54d579b49d-mvxm4", Endpoint:"eth0", ServiceAccountName:"goldmane", 
IPNetworks:[]string{"192.168.107.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califeeaba070c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:25.413314 containerd[1568]: 2025-09-09 05:44:25.347 [INFO][4243] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.131/32] ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" Sep 9 05:44:25.413314 containerd[1568]: 2025-09-09 05:44:25.347 [INFO][4243] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califeeaba070c7 ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" Sep 9 05:44:25.413314 containerd[1568]: 2025-09-09 05:44:25.368 [INFO][4243] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" Sep 9 05:44:25.413314 containerd[1568]: 2025-09-09 05:44:25.375 [INFO][4243] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"8c449465-a3d4-4a22-abbd-fae36370667f", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e", Pod:"goldmane-54d579b49d-mvxm4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.107.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califeeaba070c7", MAC:"c6:17:34:b1:20:be", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:25.413314 containerd[1568]: 2025-09-09 05:44:25.408 [INFO][4243] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" Namespace="calico-system" Pod="goldmane-54d579b49d-mvxm4" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-goldmane--54d579b49d--mvxm4-eth0" Sep 9 05:44:25.492589 systemd[1]: 
Started cri-containerd-425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee.scope - libcontainer container 425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee. Sep 9 05:44:25.526271 containerd[1568]: time="2025-09-09T05:44:25.525742641Z" level=info msg="connecting to shim 0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e" address="unix:///run/containerd/s/ff386e4ab11f0b771e60bd6836183009743b18e027b6513642be270859d2fcb6" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:25.607534 systemd[1]: Started cri-containerd-0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e.scope - libcontainer container 0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e. Sep 9 05:44:25.640013 containerd[1568]: time="2025-09-09T05:44:25.639949260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gz8d9,Uid:77d44113-c086-484e-89d4-b7d4844d3cef,Namespace:kube-system,Attempt:0,} returns sandbox id \"425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee\"" Sep 9 05:44:25.648132 containerd[1568]: time="2025-09-09T05:44:25.648003810Z" level=info msg="CreateContainer within sandbox \"425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:44:25.669107 containerd[1568]: time="2025-09-09T05:44:25.669061932Z" level=info msg="Container e144401c835fd2a89c128ff162afca9677732c4d19fe3b90c6365b334ba73d04: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:25.696800 containerd[1568]: time="2025-09-09T05:44:25.696744621Z" level=info msg="CreateContainer within sandbox \"425bf0e567bdd9e00377f94f18e393d9c467c22dc43e555b3915f0b56567b1ee\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e144401c835fd2a89c128ff162afca9677732c4d19fe3b90c6365b334ba73d04\"" Sep 9 05:44:25.699209 containerd[1568]: time="2025-09-09T05:44:25.698396973Z" level=info msg="StartContainer for 
\"e144401c835fd2a89c128ff162afca9677732c4d19fe3b90c6365b334ba73d04\"" Sep 9 05:44:25.700820 containerd[1568]: time="2025-09-09T05:44:25.700782951Z" level=info msg="connecting to shim e144401c835fd2a89c128ff162afca9677732c4d19fe3b90c6365b334ba73d04" address="unix:///run/containerd/s/07373091ecd4e77426488a1d62c48ab36d3a2f0791209915bcc1a9444e7344cc" protocol=ttrpc version=3 Sep 9 05:44:25.758458 systemd[1]: Started cri-containerd-e144401c835fd2a89c128ff162afca9677732c4d19fe3b90c6365b334ba73d04.scope - libcontainer container e144401c835fd2a89c128ff162afca9677732c4d19fe3b90c6365b334ba73d04. Sep 9 05:44:25.828008 containerd[1568]: time="2025-09-09T05:44:25.827961502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-mvxm4,Uid:8c449465-a3d4-4a22-abbd-fae36370667f,Namespace:calico-system,Attempt:0,} returns sandbox id \"0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e\"" Sep 9 05:44:25.852645 containerd[1568]: time="2025-09-09T05:44:25.852597570Z" level=info msg="StartContainer for \"e144401c835fd2a89c128ff162afca9677732c4d19fe3b90c6365b334ba73d04\" returns successfully" Sep 9 05:44:25.891215 containerd[1568]: time="2025-09-09T05:44:25.890099170Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:25.892372 containerd[1568]: time="2025-09-09T05:44:25.892330695Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 05:44:25.894079 containerd[1568]: time="2025-09-09T05:44:25.894031196Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:25.897803 containerd[1568]: time="2025-09-09T05:44:25.897762514Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:25.900689 containerd[1568]: time="2025-09-09T05:44:25.900623263Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.587781859s" Sep 9 05:44:25.901293 containerd[1568]: time="2025-09-09T05:44:25.901261122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 05:44:25.904899 containerd[1568]: time="2025-09-09T05:44:25.904871385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 05:44:25.906900 containerd[1568]: time="2025-09-09T05:44:25.906751852Z" level=info msg="CreateContainer within sandbox \"51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 05:44:25.911014 containerd[1568]: time="2025-09-09T05:44:25.910985361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nt4z8,Uid:fae9fae7-86cf-409f-b61d-e750df25a87f,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:25.912147 containerd[1568]: time="2025-09-09T05:44:25.911974244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-gs95d,Uid:86de38b5-26f0-4f51-9314-86aac9313780,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:44:25.920442 containerd[1568]: time="2025-09-09T05:44:25.919160430Z" level=info msg="Container e5a5e71ec427b0e09eac5816758a8765759b792aba71b753db74d7e41717b60a: CDI devices from CRI 
Config.CDIDevices: []" Sep 9 05:44:25.940094 containerd[1568]: time="2025-09-09T05:44:25.940038462Z" level=info msg="CreateContainer within sandbox \"51e4f475ba3503f7aec2345d009c7d125f73ffc4b517f87e4ee65772e25e2be5\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"e5a5e71ec427b0e09eac5816758a8765759b792aba71b753db74d7e41717b60a\"" Sep 9 05:44:25.955565 containerd[1568]: time="2025-09-09T05:44:25.955497760Z" level=info msg="StartContainer for \"e5a5e71ec427b0e09eac5816758a8765759b792aba71b753db74d7e41717b60a\"" Sep 9 05:44:25.973619 containerd[1568]: time="2025-09-09T05:44:25.973521503Z" level=info msg="connecting to shim e5a5e71ec427b0e09eac5816758a8765759b792aba71b753db74d7e41717b60a" address="unix:///run/containerd/s/2dcd8d0b79100ff5a29d778786117e642868ee3093d07fb3fbacbf9329e844e5" protocol=ttrpc version=3 Sep 9 05:44:26.029847 systemd[1]: Started cri-containerd-e5a5e71ec427b0e09eac5816758a8765759b792aba71b753db74d7e41717b60a.scope - libcontainer container e5a5e71ec427b0e09eac5816758a8765759b792aba71b753db74d7e41717b60a. 
Sep 9 05:44:26.229756 containerd[1568]: time="2025-09-09T05:44:26.229704948Z" level=info msg="StartContainer for \"e5a5e71ec427b0e09eac5816758a8765759b792aba71b753db74d7e41717b60a\" returns successfully" Sep 9 05:44:26.257549 systemd-networkd[1468]: cali91deef55180: Link UP Sep 9 05:44:26.259514 systemd-networkd[1468]: cali91deef55180: Gained carrier Sep 9 05:44:26.315358 kubelet[2811]: I0909 05:44:26.315272 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-gz8d9" podStartSLOduration=41.315247395 podStartE2EDuration="41.315247395s" podCreationTimestamp="2025-09-09 05:43:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:44:26.313591472 +0000 UTC m=+47.598139500" watchObservedRunningTime="2025-09-09 05:44:26.315247395 +0000 UTC m=+47.599795420" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.043 [INFO][4440] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0 csi-node-driver- calico-system fae9fae7-86cf-409f-b61d-e750df25a87f 724 0 2025-09-09 05:44:01 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 csi-node-driver-nt4z8 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali91deef55180 [] [] }} ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Namespace="calico-system" Pod="csi-node-driver-nt4z8" 
WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.044 [INFO][4440] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Namespace="calico-system" Pod="csi-node-driver-nt4z8" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.114 [INFO][4476] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" HandleID="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.114 [INFO][4476] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" HandleID="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd110), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"csi-node-driver-nt4z8", "timestamp":"2025-09-09 05:44:26.114357114 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.114 [INFO][4476] ipam/ipam_plugin.go 353: About to acquire host-wide 
IPAM lock. Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.114 [INFO][4476] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.114 [INFO][4476] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.131 [INFO][4476] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.142 [INFO][4476] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.149 [INFO][4476] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.152 [INFO][4476] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.156 [INFO][4476] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.156 [INFO][4476] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.107.128/26 handle="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.167 [INFO][4476] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73 Sep 9 05:44:26.342370 
containerd[1568]: 2025-09-09 05:44:26.195 [INFO][4476] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 handle="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.221 [INFO][4476] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.132/26] block=192.168.107.128/26 handle="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.222 [INFO][4476] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.132/26] handle="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.223 [INFO][4476] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:44:26.342370 containerd[1568]: 2025-09-09 05:44:26.224 [INFO][4476] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.132/26] IPv6=[] ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" HandleID="k8s-pod-network.7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" Sep 9 05:44:26.347353 containerd[1568]: 2025-09-09 05:44:26.239 [INFO][4440] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Namespace="calico-system" Pod="csi-node-driver-nt4z8" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fae9fae7-86cf-409f-b61d-e750df25a87f", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"", 
Pod:"csi-node-driver-nt4z8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.107.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali91deef55180", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:26.347353 containerd[1568]: 2025-09-09 05:44:26.239 [INFO][4440] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.132/32] ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Namespace="calico-system" Pod="csi-node-driver-nt4z8" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" Sep 9 05:44:26.347353 containerd[1568]: 2025-09-09 05:44:26.239 [INFO][4440] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91deef55180 ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Namespace="calico-system" Pod="csi-node-driver-nt4z8" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" Sep 9 05:44:26.347353 containerd[1568]: 2025-09-09 05:44:26.264 [INFO][4440] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Namespace="calico-system" Pod="csi-node-driver-nt4z8" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" Sep 9 05:44:26.347353 containerd[1568]: 2025-09-09 05:44:26.274 [INFO][4440] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" Namespace="calico-system" Pod="csi-node-driver-nt4z8" 
WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fae9fae7-86cf-409f-b61d-e750df25a87f", ResourceVersion:"724", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73", Pod:"csi-node-driver-nt4z8", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.107.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali91deef55180", MAC:"c2:33:b0:50:72:5f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:26.347353 containerd[1568]: 2025-09-09 05:44:26.336 [INFO][4440] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" 
Namespace="calico-system" Pod="csi-node-driver-nt4z8" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-csi--node--driver--nt4z8-eth0" Sep 9 05:44:26.392937 containerd[1568]: time="2025-09-09T05:44:26.392851652Z" level=info msg="connecting to shim 7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73" address="unix:///run/containerd/s/26b07d6c09439393330f4690523f7183173bc6be4f6049c7a145411138762ab0" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:26.440803 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3055012923.mount: Deactivated successfully. Sep 9 05:44:26.480984 systemd[1]: Started cri-containerd-7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73.scope - libcontainer container 7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73. Sep 9 05:44:26.498445 systemd-networkd[1468]: calie09adfbe9bd: Link UP Sep 9 05:44:26.501237 systemd-networkd[1468]: calie09adfbe9bd: Gained carrier Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.069 [INFO][4431] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0 calico-apiserver-7d7857fcf5- calico-apiserver 86de38b5-26f0-4f51-9314-86aac9313780 836 0 2025-09-09 05:43:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d7857fcf5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 calico-apiserver-7d7857fcf5-gs95d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie09adfbe9bd [] [] }} ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" 
Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.069 [INFO][4431] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.144 [INFO][4482] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" HandleID="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.144 [INFO][4482] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" HandleID="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000370d30), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"calico-apiserver-7d7857fcf5-gs95d", "timestamp":"2025-09-09 05:44:26.144140263 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 
05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.144 [INFO][4482] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.223 [INFO][4482] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.224 [INFO][4482] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.277 [INFO][4482] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.332 [INFO][4482] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.360 [INFO][4482] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.377 [INFO][4482] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.407 [INFO][4482] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.408 [INFO][4482] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.107.128/26 handle="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.416 [INFO][4482] ipam/ipam.go 1764: Creating 
new handle: k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65 Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.444 [INFO][4482] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 handle="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.473 [INFO][4482] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.133/26] block=192.168.107.128/26 handle="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.473 [INFO][4482] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.133/26] handle="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.474 [INFO][4482] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:44:26.558798 containerd[1568]: 2025-09-09 05:44:26.474 [INFO][4482] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.133/26] IPv6=[] ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" HandleID="k8s-pod-network.65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" Sep 9 05:44:26.560001 containerd[1568]: 2025-09-09 05:44:26.484 [INFO][4431] cni-plugin/k8s.go 418: Populated endpoint ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0", GenerateName:"calico-apiserver-7d7857fcf5-", Namespace:"calico-apiserver", SelfLink:"", UID:"86de38b5-26f0-4f51-9314-86aac9313780", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d7857fcf5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", 
ContainerID:"", Pod:"calico-apiserver-7d7857fcf5-gs95d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.107.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie09adfbe9bd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:26.560001 containerd[1568]: 2025-09-09 05:44:26.486 [INFO][4431] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.133/32] ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" Sep 9 05:44:26.560001 containerd[1568]: 2025-09-09 05:44:26.486 [INFO][4431] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie09adfbe9bd ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" Sep 9 05:44:26.560001 containerd[1568]: 2025-09-09 05:44:26.504 [INFO][4431] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" Sep 9 05:44:26.560001 containerd[1568]: 2025-09-09 05:44:26.509 [INFO][4431] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" 
Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0", GenerateName:"calico-apiserver-7d7857fcf5-", Namespace:"calico-apiserver", SelfLink:"", UID:"86de38b5-26f0-4f51-9314-86aac9313780", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d7857fcf5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65", Pod:"calico-apiserver-7d7857fcf5-gs95d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.107.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie09adfbe9bd", MAC:"6e:7f:dc:fd:76:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:26.560001 containerd[1568]: 2025-09-09 05:44:26.554 [INFO][4431] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-gs95d" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--gs95d-eth0" Sep 9 05:44:26.613233 containerd[1568]: time="2025-09-09T05:44:26.612078858Z" level=info msg="connecting to shim 65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65" address="unix:///run/containerd/s/130f7444ef34b9d45bea3846c253d728d9e8cfb44c8a17e6050a90809f3642c9" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:26.694425 systemd[1]: Started cri-containerd-65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65.scope - libcontainer container 65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65. Sep 9 05:44:26.741377 containerd[1568]: time="2025-09-09T05:44:26.741218536Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-nt4z8,Uid:fae9fae7-86cf-409f-b61d-e750df25a87f,Namespace:calico-system,Attempt:0,} returns sandbox id \"7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73\"" Sep 9 05:44:26.906657 containerd[1568]: time="2025-09-09T05:44:26.905553229Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-gs95d,Uid:86de38b5-26f0-4f51-9314-86aac9313780,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65\"" Sep 9 05:44:26.972550 containerd[1568]: time="2025-09-09T05:44:26.972487057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8c89c8979-s4l22,Uid:005a5e53-ed87-45db-9af7-c3f891ffa294,Namespace:calico-system,Attempt:0,}" Sep 9 05:44:26.988445 containerd[1568]: time="2025-09-09T05:44:26.988203170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2tqqm,Uid:a2a152ca-9105-4e9c-9026-f43cca5fbe8f,Namespace:kube-system,Attempt:0,}" Sep 9 05:44:27.100382 
systemd-networkd[1468]: calie4fc33b6f0b: Gained IPv6LL Sep 9 05:44:27.292710 systemd-networkd[1468]: califeeaba070c7: Gained IPv6LL Sep 9 05:44:27.329256 systemd-networkd[1468]: cali781694f10b4: Link UP Sep 9 05:44:27.335358 systemd-networkd[1468]: cali781694f10b4: Gained carrier Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.078 [INFO][4616] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0 calico-kube-controllers-8c89c8979- calico-system 005a5e53-ed87-45db-9af7-c3f891ffa294 830 0 2025-09-09 05:44:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8c89c8979 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 calico-kube-controllers-8c89c8979-s4l22 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali781694f10b4 [] [] }} ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.078 [INFO][4616] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.191 [INFO][4639] ipam/ipam_plugin.go 225: Calico CNI 
IPAM request count IPv4=1 IPv6=0 ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" HandleID="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.193 [INFO][4639] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" HandleID="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000322250), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"calico-kube-controllers-8c89c8979-s4l22", "timestamp":"2025-09-09 05:44:27.191314367 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.193 [INFO][4639] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.193 [INFO][4639] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.193 [INFO][4639] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.221 [INFO][4639] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.235 [INFO][4639] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.243 [INFO][4639] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.250 [INFO][4639] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.257 [INFO][4639] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.258 [INFO][4639] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.107.128/26 handle="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.261 [INFO][4639] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.280 [INFO][4639] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 
handle="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.305 [INFO][4639] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.134/26] block=192.168.107.128/26 handle="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.305 [INFO][4639] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.134/26] handle="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.305 [INFO][4639] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:44:27.407993 containerd[1568]: 2025-09-09 05:44:27.307 [INFO][4639] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.134/26] IPv6=[] ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" HandleID="k8s-pod-network.5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" Sep 9 05:44:27.411137 containerd[1568]: 2025-09-09 05:44:27.321 [INFO][4616] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0", GenerateName:"calico-kube-controllers-8c89c8979-", Namespace:"calico-system", SelfLink:"", UID:"005a5e53-ed87-45db-9af7-c3f891ffa294", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8c89c8979", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"", Pod:"calico-kube-controllers-8c89c8979-s4l22", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.107.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali781694f10b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:27.411137 containerd[1568]: 2025-09-09 05:44:27.321 [INFO][4616] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.134/32] ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" Sep 9 05:44:27.411137 containerd[1568]: 2025-09-09 05:44:27.322 
[INFO][4616] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali781694f10b4 ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" Sep 9 05:44:27.411137 containerd[1568]: 2025-09-09 05:44:27.346 [INFO][4616] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" Sep 9 05:44:27.411137 containerd[1568]: 2025-09-09 05:44:27.354 [INFO][4616] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0", GenerateName:"calico-kube-controllers-8c89c8979-", Namespace:"calico-system", SelfLink:"", UID:"005a5e53-ed87-45db-9af7-c3f891ffa294", ResourceVersion:"830", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 44, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8c89c8979", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a", Pod:"calico-kube-controllers-8c89c8979-s4l22", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.107.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali781694f10b4", MAC:"ce:88:74:0d:29:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:27.411137 containerd[1568]: 2025-09-09 05:44:27.395 [INFO][4616] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" Namespace="calico-system" Pod="calico-kube-controllers-8c89c8979-s4l22" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--kube--controllers--8c89c8979--s4l22-eth0" Sep 9 05:44:27.465653 kubelet[2811]: I0909 05:44:27.465573 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-697798bcff-44slw" podStartSLOduration=2.521558393 podStartE2EDuration="6.465547228s" podCreationTimestamp="2025-09-09 05:44:21 +0000 UTC" firstStartedPulling="2025-09-09 05:44:21.959985382 +0000 UTC m=+43.244533385" lastFinishedPulling="2025-09-09 05:44:25.903974205 +0000 UTC m=+47.188522220" observedRunningTime="2025-09-09 05:44:27.464832058 +0000 UTC m=+48.749380084" watchObservedRunningTime="2025-09-09 05:44:27.465547228 +0000 UTC 
m=+48.750095253" Sep 9 05:44:27.480738 systemd-networkd[1468]: califcd7fc76667: Link UP Sep 9 05:44:27.487226 systemd-networkd[1468]: califcd7fc76667: Gained carrier Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.154 [INFO][4625] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0 coredns-668d6bf9bc- kube-system a2a152ca-9105-4e9c-9026-f43cca5fbe8f 834 0 2025-09-09 05:43:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 coredns-668d6bf9bc-2tqqm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] califcd7fc76667 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.154 [INFO][4625] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.287 [INFO][4646] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" HandleID="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" 
Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.287 [INFO][4646] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" HandleID="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000122320), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"coredns-668d6bf9bc-2tqqm", "timestamp":"2025-09-09 05:44:27.287606887 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.288 [INFO][4646] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.306 [INFO][4646] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.308 [INFO][4646] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.364 [INFO][4646] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.390 [INFO][4646] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.411 [INFO][4646] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.417 [INFO][4646] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.424 [INFO][4646] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.424 [INFO][4646] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.107.128/26 handle="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.430 [INFO][4646] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3 Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.445 [INFO][4646] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 
handle="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.460 [INFO][4646] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.135/26] block=192.168.107.128/26 handle="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.463 [INFO][4646] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.135/26] handle="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.463 [INFO][4646] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:44:27.573106 containerd[1568]: 2025-09-09 05:44:27.463 [INFO][4646] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.135/26] IPv6=[] ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" HandleID="k8s-pod-network.a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" Sep 9 05:44:27.576007 containerd[1568]: 2025-09-09 05:44:27.473 [INFO][4625] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0", 
GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a2a152ca-9105-4e9c-9026-f43cca5fbe8f", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"", Pod:"coredns-668d6bf9bc-2tqqm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califcd7fc76667", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:27.576007 containerd[1568]: 2025-09-09 05:44:27.474 [INFO][4625] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.135/32] ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" 
WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" Sep 9 05:44:27.576007 containerd[1568]: 2025-09-09 05:44:27.474 [INFO][4625] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califcd7fc76667 ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" Sep 9 05:44:27.576007 containerd[1568]: 2025-09-09 05:44:27.518 [INFO][4625] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" Sep 9 05:44:27.576007 containerd[1568]: 2025-09-09 05:44:27.531 [INFO][4625] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a2a152ca-9105-4e9c-9026-f43cca5fbe8f", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3", Pod:"coredns-668d6bf9bc-2tqqm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.107.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"califcd7fc76667", MAC:"96:47:db:a9:b2:9a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:27.576007 containerd[1568]: 2025-09-09 05:44:27.555 [INFO][4625] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" Namespace="kube-system" Pod="coredns-668d6bf9bc-2tqqm" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-coredns--668d6bf9bc--2tqqm-eth0" Sep 9 05:44:27.595324 containerd[1568]: time="2025-09-09T05:44:27.592940051Z" level=info msg="connecting to shim 5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a" address="unix:///run/containerd/s/614c92acbd519863d97556d72e7f97e81fdcbfd8a4568702bc2cbdcbf12e1218" namespace=k8s.io protocol=ttrpc version=3 Sep 9 
05:44:27.615161 systemd-networkd[1468]: cali91deef55180: Gained IPv6LL Sep 9 05:44:27.665310 systemd[1]: Started cri-containerd-5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a.scope - libcontainer container 5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a. Sep 9 05:44:27.741949 systemd-networkd[1468]: calie09adfbe9bd: Gained IPv6LL Sep 9 05:44:27.845620 containerd[1568]: time="2025-09-09T05:44:27.845314343Z" level=info msg="connecting to shim a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3" address="unix:///run/containerd/s/3d17795c20ff8d984deda33aa3e25c24b084b2f762ef9355bde3a3a3d3caba00" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:27.969076 systemd[1]: Started cri-containerd-a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3.scope - libcontainer container a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3. Sep 9 05:44:28.038744 containerd[1568]: time="2025-09-09T05:44:28.038694022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8c89c8979-s4l22,Uid:005a5e53-ed87-45db-9af7-c3f891ffa294,Namespace:calico-system,Attempt:0,} returns sandbox id \"5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a\"" Sep 9 05:44:28.127844 containerd[1568]: time="2025-09-09T05:44:28.127047852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-2tqqm,Uid:a2a152ca-9105-4e9c-9026-f43cca5fbe8f,Namespace:kube-system,Attempt:0,} returns sandbox id \"a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3\"" Sep 9 05:44:28.137334 containerd[1568]: time="2025-09-09T05:44:28.137147999Z" level=info msg="CreateContainer within sandbox \"a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:44:28.165138 containerd[1568]: time="2025-09-09T05:44:28.165061433Z" level=info msg="Container e8a1061afd1c7b1309ed10437e2b23bb4ee7b51317560c8a4a22847ddddfc50b: 
CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:28.182964 containerd[1568]: time="2025-09-09T05:44:28.182294621Z" level=info msg="CreateContainer within sandbox \"a205e19fe5e3c7d610129498fe224e2b9d0c14e78d57b5bc4592734ab04641e3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e8a1061afd1c7b1309ed10437e2b23bb4ee7b51317560c8a4a22847ddddfc50b\"" Sep 9 05:44:28.185462 containerd[1568]: time="2025-09-09T05:44:28.185407577Z" level=info msg="StartContainer for \"e8a1061afd1c7b1309ed10437e2b23bb4ee7b51317560c8a4a22847ddddfc50b\"" Sep 9 05:44:28.191618 containerd[1568]: time="2025-09-09T05:44:28.191573018Z" level=info msg="connecting to shim e8a1061afd1c7b1309ed10437e2b23bb4ee7b51317560c8a4a22847ddddfc50b" address="unix:///run/containerd/s/3d17795c20ff8d984deda33aa3e25c24b084b2f762ef9355bde3a3a3d3caba00" protocol=ttrpc version=3 Sep 9 05:44:28.266956 systemd[1]: Started cri-containerd-e8a1061afd1c7b1309ed10437e2b23bb4ee7b51317560c8a4a22847ddddfc50b.scope - libcontainer container e8a1061afd1c7b1309ed10437e2b23bb4ee7b51317560c8a4a22847ddddfc50b. 
Sep 9 05:44:28.408125 containerd[1568]: time="2025-09-09T05:44:28.405673268Z" level=info msg="StartContainer for \"e8a1061afd1c7b1309ed10437e2b23bb4ee7b51317560c8a4a22847ddddfc50b\" returns successfully" Sep 9 05:44:28.637724 systemd-networkd[1468]: cali781694f10b4: Gained IPv6LL Sep 9 05:44:28.910196 containerd[1568]: time="2025-09-09T05:44:28.909566376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-mzmsc,Uid:8767d1d9-ea87-40f6-8f9c-15997478a3da,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:44:29.149230 systemd-networkd[1468]: califcd7fc76667: Gained IPv6LL Sep 9 05:44:29.163196 systemd-networkd[1468]: calie9dbc3581de: Link UP Sep 9 05:44:29.165593 systemd-networkd[1468]: calie9dbc3581de: Gained carrier Sep 9 05:44:29.188593 kubelet[2811]: I0909 05:44:29.187861 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-2tqqm" podStartSLOduration=44.187832532 podStartE2EDuration="44.187832532s" podCreationTimestamp="2025-09-09 05:43:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:44:28.478807673 +0000 UTC m=+49.763355699" watchObservedRunningTime="2025-09-09 05:44:29.187832532 +0000 UTC m=+50.472380556" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.012 [INFO][4814] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0 calico-apiserver-7d7857fcf5- calico-apiserver 8767d1d9-ea87-40f6-8f9c-15997478a3da 835 0 2025-09-09 05:43:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d7857fcf5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s 
ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023 calico-apiserver-7d7857fcf5-mzmsc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie9dbc3581de [] [] }} ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.012 [INFO][4814] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.068 [INFO][4827] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" HandleID="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.069 [INFO][4827] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" HandleID="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5120), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", "pod":"calico-apiserver-7d7857fcf5-mzmsc", "timestamp":"2025-09-09 
05:44:29.068679223 +0000 UTC"}, Hostname:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.069 [INFO][4827] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.070 [INFO][4827] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.070 [INFO][4827] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023' Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.084 [INFO][4827] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.092 [INFO][4827] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.099 [INFO][4827] ipam/ipam.go 511: Trying affinity for 192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.105 [INFO][4827] ipam/ipam.go 158: Attempting to load block cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.110 [INFO][4827] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.107.128/26 host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.110 [INFO][4827] ipam/ipam.go 1220: Attempting to assign 1 
addresses from block block=192.168.107.128/26 handle="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.113 [INFO][4827] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8 Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.127 [INFO][4827] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.107.128/26 handle="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.146 [INFO][4827] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.107.136/26] block=192.168.107.128/26 handle="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.146 [INFO][4827] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.107.136/26] handle="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" host="ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023" Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.146 [INFO][4827] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 05:44:29.194258 containerd[1568]: 2025-09-09 05:44:29.146 [INFO][4827] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.107.136/26] IPv6=[] ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" HandleID="k8s-pod-network.1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Workload="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" Sep 9 05:44:29.195783 containerd[1568]: 2025-09-09 05:44:29.152 [INFO][4814] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0", GenerateName:"calico-apiserver-7d7857fcf5-", Namespace:"calico-apiserver", SelfLink:"", UID:"8767d1d9-ea87-40f6-8f9c-15997478a3da", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d7857fcf5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", 
ContainerID:"", Pod:"calico-apiserver-7d7857fcf5-mzmsc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.107.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie9dbc3581de", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:29.195783 containerd[1568]: 2025-09-09 05:44:29.152 [INFO][4814] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.107.136/32] ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" Sep 9 05:44:29.195783 containerd[1568]: 2025-09-09 05:44:29.153 [INFO][4814] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie9dbc3581de ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" Sep 9 05:44:29.195783 containerd[1568]: 2025-09-09 05:44:29.167 [INFO][4814] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" Sep 9 05:44:29.195783 containerd[1568]: 2025-09-09 05:44:29.168 [INFO][4814] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" 
Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0", GenerateName:"calico-apiserver-7d7857fcf5-", Namespace:"calico-apiserver", SelfLink:"", UID:"8767d1d9-ea87-40f6-8f9c-15997478a3da", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 43, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d7857fcf5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-nightly-20250908-2100-d8bb7886c1bb21edc023", ContainerID:"1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8", Pod:"calico-apiserver-7d7857fcf5-mzmsc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.107.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie9dbc3581de", MAC:"ea:5d:f7:07:0b:d4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:44:29.195783 containerd[1568]: 2025-09-09 05:44:29.191 [INFO][4814] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" Namespace="calico-apiserver" Pod="calico-apiserver-7d7857fcf5-mzmsc" WorkloadEndpoint="ci--4452--0--0--nightly--20250908--2100--d8bb7886c1bb21edc023-k8s-calico--apiserver--7d7857fcf5--mzmsc-eth0" Sep 9 05:44:29.268143 containerd[1568]: time="2025-09-09T05:44:29.267730996Z" level=info msg="connecting to shim 1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8" address="unix:///run/containerd/s/31dfb5c68fb4f6e3d60e285c242c09735cbc8d96c320fca66ba2fae2ff0aacf9" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:44:29.335417 systemd[1]: Started cri-containerd-1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8.scope - libcontainer container 1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8. Sep 9 05:44:29.748016 containerd[1568]: time="2025-09-09T05:44:29.747959914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d7857fcf5-mzmsc,Uid:8767d1d9-ea87-40f6-8f9c-15997478a3da,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8\"" Sep 9 05:44:29.819596 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1321025530.mount: Deactivated successfully. 
Sep 9 05:44:30.556605 systemd-networkd[1468]: calie9dbc3581de: Gained IPv6LL Sep 9 05:44:30.784563 containerd[1568]: time="2025-09-09T05:44:30.784495212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:30.785862 containerd[1568]: time="2025-09-09T05:44:30.785728534Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 05:44:30.787372 containerd[1568]: time="2025-09-09T05:44:30.787329850Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:30.791596 containerd[1568]: time="2025-09-09T05:44:30.790372754Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:30.791596 containerd[1568]: time="2025-09-09T05:44:30.791444547Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 4.886141459s" Sep 9 05:44:30.791596 containerd[1568]: time="2025-09-09T05:44:30.791484218Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 05:44:30.795243 containerd[1568]: time="2025-09-09T05:44:30.794024589Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 05:44:30.797009 containerd[1568]: time="2025-09-09T05:44:30.796956604Z" level=info msg="CreateContainer within sandbox 
\"0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 05:44:30.810351 containerd[1568]: time="2025-09-09T05:44:30.808006905Z" level=info msg="Container 17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:30.820767 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1231421978.mount: Deactivated successfully. Sep 9 05:44:30.829332 containerd[1568]: time="2025-09-09T05:44:30.829261521Z" level=info msg="CreateContainer within sandbox \"0682bcb446f09f51f61eca6ba6b69906b79a4d76be1027c7a91c4eac60746f6e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\"" Sep 9 05:44:30.830553 containerd[1568]: time="2025-09-09T05:44:30.830165990Z" level=info msg="StartContainer for \"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\"" Sep 9 05:44:30.832972 containerd[1568]: time="2025-09-09T05:44:30.832831369Z" level=info msg="connecting to shim 17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271" address="unix:///run/containerd/s/ff386e4ab11f0b771e60bd6836183009743b18e027b6513642be270859d2fcb6" protocol=ttrpc version=3 Sep 9 05:44:30.871438 systemd[1]: Started cri-containerd-17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271.scope - libcontainer container 17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271. 
Sep 9 05:44:30.946624 containerd[1568]: time="2025-09-09T05:44:30.946553958Z" level=info msg="StartContainer for \"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" returns successfully" Sep 9 05:44:31.686701 containerd[1568]: time="2025-09-09T05:44:31.686635047Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"69bdb2c883ef132cf3bd56e8d76c67a8aa0166401298b8eff9533af5b5d53e0c\" pid:4944 exit_status:1 exited_at:{seconds:1757396671 nanos:685871332}" Sep 9 05:44:31.908620 containerd[1568]: time="2025-09-09T05:44:31.908463380Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:31.910009 containerd[1568]: time="2025-09-09T05:44:31.909963351Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 05:44:31.912375 containerd[1568]: time="2025-09-09T05:44:31.911511700Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:31.915429 containerd[1568]: time="2025-09-09T05:44:31.915377208Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:31.916421 containerd[1568]: time="2025-09-09T05:44:31.916379691Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.122308736s" Sep 9 05:44:31.916595 containerd[1568]: 
time="2025-09-09T05:44:31.916568738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 05:44:31.919390 containerd[1568]: time="2025-09-09T05:44:31.919344171Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:44:31.921212 containerd[1568]: time="2025-09-09T05:44:31.921166051Z" level=info msg="CreateContainer within sandbox \"7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 05:44:31.935391 containerd[1568]: time="2025-09-09T05:44:31.935334692Z" level=info msg="Container 3ba70fd72bc7bbb208d4d184824625bcd344b80f8f5f19c556e85d898a80a22a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:31.949663 containerd[1568]: time="2025-09-09T05:44:31.949064835Z" level=info msg="CreateContainer within sandbox \"7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"3ba70fd72bc7bbb208d4d184824625bcd344b80f8f5f19c556e85d898a80a22a\"" Sep 9 05:44:31.952204 containerd[1568]: time="2025-09-09T05:44:31.950371497Z" level=info msg="StartContainer for \"3ba70fd72bc7bbb208d4d184824625bcd344b80f8f5f19c556e85d898a80a22a\"" Sep 9 05:44:31.952788 containerd[1568]: time="2025-09-09T05:44:31.952737573Z" level=info msg="connecting to shim 3ba70fd72bc7bbb208d4d184824625bcd344b80f8f5f19c556e85d898a80a22a" address="unix:///run/containerd/s/26b07d6c09439393330f4690523f7183173bc6be4f6049c7a145411138762ab0" protocol=ttrpc version=3 Sep 9 05:44:31.990449 systemd[1]: Started cri-containerd-3ba70fd72bc7bbb208d4d184824625bcd344b80f8f5f19c556e85d898a80a22a.scope - libcontainer container 3ba70fd72bc7bbb208d4d184824625bcd344b80f8f5f19c556e85d898a80a22a. 
Sep 9 05:44:32.057913 containerd[1568]: time="2025-09-09T05:44:32.057856833Z" level=info msg="StartContainer for \"3ba70fd72bc7bbb208d4d184824625bcd344b80f8f5f19c556e85d898a80a22a\" returns successfully" Sep 9 05:44:32.610716 containerd[1568]: time="2025-09-09T05:44:32.610650307Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"29ee285e2469a2ff159619fb6686f503eb5f7a66ed939bbe64563b3629c3e1f3\" pid:5000 exit_status:1 exited_at:{seconds:1757396672 nanos:609859131}" Sep 9 05:44:33.547095 ntpd[1545]: Listen normally on 7 vxlan.calico 192.168.107.128:123 Sep 9 05:44:33.548638 ntpd[1545]: Listen normally on 8 califaa58d7f49d [fe80::ecee:eeff:feee:eeee%4]:123 Sep 9 05:44:33.548755 ntpd[1545]: Listen normally on 9 vxlan.calico [fe80::6425:25ff:fe1e:eff5%5]:123 Sep 9 05:44:33.548817 ntpd[1545]: Listen normally on 10 calie4fc33b6f0b [fe80::ecee:eeff:feee:eeee%8]:123 Sep 9 05:44:33.548873 ntpd[1545]: Listen normally on 11 califeeaba070c7 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 9 05:44:33.548944
ntpd[1545]: Listen normally on 12 cali91deef55180 [fe80::ecee:eeff:feee:eeee%10]:123 Sep 9 05:44:33.549108 ntpd[1545]: Listen normally on 13 calie09adfbe9bd [fe80::ecee:eeff:feee:eeee%11]:123 Sep 9 05:44:33.551285 ntpd[1545]: Listen normally on 14 cali781694f10b4 [fe80::ecee:eeff:feee:eeee%12]:123 Sep 9 05:44:33.551381 ntpd[1545]: Listen normally on 15 califcd7fc76667 [fe80::ecee:eeff:feee:eeee%13]:123 Sep 9 05:44:33.551447 ntpd[1545]: Listen normally on 16 calie9dbc3581de [fe80::ecee:eeff:feee:eeee%14]:123 Sep 9 05:44:33.625888 containerd[1568]: time="2025-09-09T05:44:33.625812944Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"799b186542c8422ed6f011458ec3801da3418aa92a2ca455b2d8ced17555ee05\" pid:5028 exit_status:1 exited_at:{seconds:1757396673 nanos:624607089}" Sep 9 05:44:35.098472 containerd[1568]: time="2025-09-09T05:44:35.098387313Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:35.099776 containerd[1568]: time="2025-09-09T05:44:35.099667170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 05:44:35.101637 containerd[1568]: time="2025-09-09T05:44:35.101076982Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:35.104442 containerd[1568]:
time="2025-09-09T05:44:35.104400261Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:35.105463 containerd[1568]: time="2025-09-09T05:44:35.105420796Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.186021132s" Sep 9 05:44:35.105566 containerd[1568]: time="2025-09-09T05:44:35.105469871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:44:35.108200 containerd[1568]: time="2025-09-09T05:44:35.107495868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 05:44:35.111073 containerd[1568]: time="2025-09-09T05:44:35.111008689Z" level=info msg="CreateContainer within sandbox \"65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:44:35.127206 containerd[1568]: time="2025-09-09T05:44:35.126401182Z" level=info msg="Container a4749f8c712c05eb1ec7539ae2f695274cb84df2a25d52eb746e10d3e68b4373: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:35.140552 containerd[1568]: time="2025-09-09T05:44:35.140481082Z" level=info msg="CreateContainer within sandbox \"65b06ce0a15f2a92008485272237888c5f402aba66ec0695c3512daaba869d65\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a4749f8c712c05eb1ec7539ae2f695274cb84df2a25d52eb746e10d3e68b4373\"" Sep 9 05:44:35.141298 containerd[1568]: 
time="2025-09-09T05:44:35.141258619Z" level=info msg="StartContainer for \"a4749f8c712c05eb1ec7539ae2f695274cb84df2a25d52eb746e10d3e68b4373\"" Sep 9 05:44:35.143512 containerd[1568]: time="2025-09-09T05:44:35.143460269Z" level=info msg="connecting to shim a4749f8c712c05eb1ec7539ae2f695274cb84df2a25d52eb746e10d3e68b4373" address="unix:///run/containerd/s/130f7444ef34b9d45bea3846c253d728d9e8cfb44c8a17e6050a90809f3642c9" protocol=ttrpc version=3 Sep 9 05:44:35.183468 systemd[1]: Started cri-containerd-a4749f8c712c05eb1ec7539ae2f695274cb84df2a25d52eb746e10d3e68b4373.scope - libcontainer container a4749f8c712c05eb1ec7539ae2f695274cb84df2a25d52eb746e10d3e68b4373. Sep 9 05:44:35.262953 containerd[1568]: time="2025-09-09T05:44:35.262904630Z" level=info msg="StartContainer for \"a4749f8c712c05eb1ec7539ae2f695274cb84df2a25d52eb746e10d3e68b4373\" returns successfully" Sep 9 05:44:35.519352 kubelet[2811]: I0909 05:44:35.518568 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-mvxm4" podStartSLOduration=30.558954071 podStartE2EDuration="35.518541749s" podCreationTimestamp="2025-09-09 05:44:00 +0000 UTC" firstStartedPulling="2025-09-09 05:44:25.8333293 +0000 UTC m=+47.117877317" lastFinishedPulling="2025-09-09 05:44:30.792916978 +0000 UTC m=+52.077464995" observedRunningTime="2025-09-09 05:44:31.503679737 +0000 UTC m=+52.788227762" watchObservedRunningTime="2025-09-09 05:44:35.518541749 +0000 UTC m=+56.803089765" Sep 9 05:44:36.500324 kubelet[2811]: I0909 05:44:36.500262 2811 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:44:37.945335 containerd[1568]: time="2025-09-09T05:44:37.945273613Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"cc0d323a6c2221956f71c98cec3421a444a0976f9d963b7294bd2bee9cc0b004\" pid:5103 exited_at:{seconds:1757396677 nanos:942950735}" Sep 9 05:44:38.449695 
containerd[1568]: time="2025-09-09T05:44:38.449569566Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:38.451292 containerd[1568]: time="2025-09-09T05:44:38.451009818Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 05:44:38.452555 containerd[1568]: time="2025-09-09T05:44:38.452502848Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:38.455870 containerd[1568]: time="2025-09-09T05:44:38.455828612Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:38.456850 containerd[1568]: time="2025-09-09T05:44:38.456810689Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.349270223s" Sep 9 05:44:38.457023 containerd[1568]: time="2025-09-09T05:44:38.456980688Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 05:44:38.458607 containerd[1568]: time="2025-09-09T05:44:38.458573112Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:44:38.483691 containerd[1568]: time="2025-09-09T05:44:38.483636238Z" level=info msg="CreateContainer within sandbox 
\"5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 05:44:38.506075 containerd[1568]: time="2025-09-09T05:44:38.502789354Z" level=info msg="Container e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:38.517923 containerd[1568]: time="2025-09-09T05:44:38.517856301Z" level=info msg="CreateContainer within sandbox \"5dc98456175b2d628107ac8a8ead7da40edb3e854369ab812060686efa34c52a\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\"" Sep 9 05:44:38.519056 containerd[1568]: time="2025-09-09T05:44:38.519010033Z" level=info msg="StartContainer for \"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\"" Sep 9 05:44:38.520653 containerd[1568]: time="2025-09-09T05:44:38.520611691Z" level=info msg="connecting to shim e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0" address="unix:///run/containerd/s/614c92acbd519863d97556d72e7f97e81fdcbfd8a4568702bc2cbdcbf12e1218" protocol=ttrpc version=3 Sep 9 05:44:38.557441 systemd[1]: Started cri-containerd-e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0.scope - libcontainer container e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0. 
Sep 9 05:44:38.630114 containerd[1568]: time="2025-09-09T05:44:38.630045516Z" level=info msg="StartContainer for \"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\" returns successfully" Sep 9 05:44:38.675215 containerd[1568]: time="2025-09-09T05:44:38.675012210Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:38.679340 containerd[1568]: time="2025-09-09T05:44:38.679289574Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 05:44:38.685912 containerd[1568]: time="2025-09-09T05:44:38.685868034Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 227.132626ms" Sep 9 05:44:38.686354 containerd[1568]: time="2025-09-09T05:44:38.686215354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:44:38.692636 containerd[1568]: time="2025-09-09T05:44:38.691607100Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 05:44:38.696879 containerd[1568]: time="2025-09-09T05:44:38.696813874Z" level=info msg="CreateContainer within sandbox \"1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:44:38.717124 containerd[1568]: time="2025-09-09T05:44:38.714692185Z" level=info msg="Container 361daf8f7559b1b9943203278d5e4426af64e63113a0de485239655feda28173: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:38.733012 containerd[1568]: 
time="2025-09-09T05:44:38.732958009Z" level=info msg="CreateContainer within sandbox \"1f7650bd0e1f204dc681abcf691f457f785ad98514c0d7dbb293dbbc8933fdc8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"361daf8f7559b1b9943203278d5e4426af64e63113a0de485239655feda28173\"" Sep 9 05:44:38.734267 containerd[1568]: time="2025-09-09T05:44:38.734212742Z" level=info msg="StartContainer for \"361daf8f7559b1b9943203278d5e4426af64e63113a0de485239655feda28173\"" Sep 9 05:44:38.737349 containerd[1568]: time="2025-09-09T05:44:38.737308113Z" level=info msg="connecting to shim 361daf8f7559b1b9943203278d5e4426af64e63113a0de485239655feda28173" address="unix:///run/containerd/s/31dfb5c68fb4f6e3d60e285c242c09735cbc8d96c320fca66ba2fae2ff0aacf9" protocol=ttrpc version=3 Sep 9 05:44:38.776329 systemd[1]: Started cri-containerd-361daf8f7559b1b9943203278d5e4426af64e63113a0de485239655feda28173.scope - libcontainer container 361daf8f7559b1b9943203278d5e4426af64e63113a0de485239655feda28173. 
Sep 9 05:44:38.882139 containerd[1568]: time="2025-09-09T05:44:38.882035092Z" level=info msg="StartContainer for \"361daf8f7559b1b9943203278d5e4426af64e63113a0de485239655feda28173\" returns successfully" Sep 9 05:44:39.553096 kubelet[2811]: I0909 05:44:39.552889 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d7857fcf5-gs95d" podStartSLOduration=35.397074785 podStartE2EDuration="43.552841117s" podCreationTimestamp="2025-09-09 05:43:56 +0000 UTC" firstStartedPulling="2025-09-09 05:44:26.951586971 +0000 UTC m=+48.236134987" lastFinishedPulling="2025-09-09 05:44:35.107353304 +0000 UTC m=+56.391901319" observedRunningTime="2025-09-09 05:44:35.521689177 +0000 UTC m=+56.806237203" watchObservedRunningTime="2025-09-09 05:44:39.552841117 +0000 UTC m=+60.837389143" Sep 9 05:44:39.555348 kubelet[2811]: I0909 05:44:39.553623 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d7857fcf5-mzmsc" podStartSLOduration=34.614337901 podStartE2EDuration="43.553603981s" podCreationTimestamp="2025-09-09 05:43:56 +0000 UTC" firstStartedPulling="2025-09-09 05:44:29.751061707 +0000 UTC m=+51.035609726" lastFinishedPulling="2025-09-09 05:44:38.690327785 +0000 UTC m=+59.974875806" observedRunningTime="2025-09-09 05:44:39.552311386 +0000 UTC m=+60.836859413" watchObservedRunningTime="2025-09-09 05:44:39.553603981 +0000 UTC m=+60.838152006" Sep 9 05:44:39.668344 containerd[1568]: time="2025-09-09T05:44:39.667106028Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\" id:\"ec5364ed4f855caa13d2c60162dbe444d8113d3140f47e0e7bcb604a7730937c\" pid:5210 exited_at:{seconds:1757396679 nanos:666326036}" Sep 9 05:44:39.701783 kubelet[2811]: I0909 05:44:39.699136 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8c89c8979-s4l22" 
podStartSLOduration=28.282955516 podStartE2EDuration="38.699110255s" podCreationTimestamp="2025-09-09 05:44:01 +0000 UTC" firstStartedPulling="2025-09-09 05:44:28.041979485 +0000 UTC m=+49.326527491" lastFinishedPulling="2025-09-09 05:44:38.458134215 +0000 UTC m=+59.742682230" observedRunningTime="2025-09-09 05:44:39.591404576 +0000 UTC m=+60.875952600" watchObservedRunningTime="2025-09-09 05:44:39.699110255 +0000 UTC m=+60.983658280" Sep 9 05:44:40.323211 containerd[1568]: time="2025-09-09T05:44:40.323079110Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:40.325750 containerd[1568]: time="2025-09-09T05:44:40.325687552Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 05:44:40.327081 containerd[1568]: time="2025-09-09T05:44:40.327041169Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:40.332462 containerd[1568]: time="2025-09-09T05:44:40.332404797Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:44:40.334415 containerd[1568]: time="2025-09-09T05:44:40.334377297Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.642013334s" Sep 9 05:44:40.334744 containerd[1568]: time="2025-09-09T05:44:40.334424792Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 05:44:40.341517 containerd[1568]: time="2025-09-09T05:44:40.341436778Z" level=info msg="CreateContainer within sandbox \"7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 05:44:40.359210 containerd[1568]: time="2025-09-09T05:44:40.355038054Z" level=info msg="Container 5cea420bd97eba95af6c2cdfbec2e6ae53ed7a67527b6f8cd37979768d91fd69: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:44:40.382546 containerd[1568]: time="2025-09-09T05:44:40.382477796Z" level=info msg="CreateContainer within sandbox \"7751be4bc3125c3ac070cd8c668835501514471333c27bff785d1b28a5e7db73\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5cea420bd97eba95af6c2cdfbec2e6ae53ed7a67527b6f8cd37979768d91fd69\"" Sep 9 05:44:40.384081 containerd[1568]: time="2025-09-09T05:44:40.383098754Z" level=info msg="StartContainer for \"5cea420bd97eba95af6c2cdfbec2e6ae53ed7a67527b6f8cd37979768d91fd69\"" Sep 9 05:44:40.387145 containerd[1568]: time="2025-09-09T05:44:40.387019102Z" level=info msg="connecting to shim 5cea420bd97eba95af6c2cdfbec2e6ae53ed7a67527b6f8cd37979768d91fd69" address="unix:///run/containerd/s/26b07d6c09439393330f4690523f7183173bc6be4f6049c7a145411138762ab0" protocol=ttrpc version=3 Sep 9 05:44:40.451471 systemd[1]: Started cri-containerd-5cea420bd97eba95af6c2cdfbec2e6ae53ed7a67527b6f8cd37979768d91fd69.scope - libcontainer container 5cea420bd97eba95af6c2cdfbec2e6ae53ed7a67527b6f8cd37979768d91fd69. 
Sep 9 05:44:40.543204 kubelet[2811]: I0909 05:44:40.542985 2811 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:44:40.562750 containerd[1568]: time="2025-09-09T05:44:40.562694931Z" level=info msg="StartContainer for \"5cea420bd97eba95af6c2cdfbec2e6ae53ed7a67527b6f8cd37979768d91fd69\" returns successfully" Sep 9 05:44:41.073393 kubelet[2811]: I0909 05:44:41.073330 2811 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 05:44:41.073393 kubelet[2811]: I0909 05:44:41.073379 2811 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 05:44:41.568524 kubelet[2811]: I0909 05:44:41.568433 2811 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-nt4z8" podStartSLOduration=26.979955042 podStartE2EDuration="40.568405862s" podCreationTimestamp="2025-09-09 05:44:01 +0000 UTC" firstStartedPulling="2025-09-09 05:44:26.748343181 +0000 UTC m=+48.032891184" lastFinishedPulling="2025-09-09 05:44:40.336793985 +0000 UTC m=+61.621342004" observedRunningTime="2025-09-09 05:44:41.566713514 +0000 UTC m=+62.851261539" watchObservedRunningTime="2025-09-09 05:44:41.568405862 +0000 UTC m=+62.852953887" Sep 9 05:44:47.399169 kubelet[2811]: I0909 05:44:47.398764 2811 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:44:51.239078 containerd[1568]: time="2025-09-09T05:44:51.238966645Z" level=info msg="TaskExit event in podsandbox handler container_id:\"847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658\" id:\"a9a24cf47528c8e413ac16bd6686bb37de35b3d16fd9cdb6ef29830bd71676a3\" pid:5285 exited_at:{seconds:1757396691 nanos:238576719}" Sep 9 05:44:51.373042 containerd[1568]: time="2025-09-09T05:44:51.372981363Z" level=info msg="TaskExit event in 
podsandbox handler container_id:\"847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658\" id:\"9a78936a3cd413fb16e29d4f1ae8539999e93452905d05bd67c56f767f2f62aa\" pid:5310 exited_at:{seconds:1757396691 nanos:372650477}" Sep 9 05:45:03.657464 containerd[1568]: time="2025-09-09T05:45:03.657407298Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"54420ff36fa5fd31725d035efca4f6437cbe9d26e609e0d1ce6d84944ecf8de8\" pid:5344 exited_at:{seconds:1757396703 nanos:656465566}" Sep 9 05:45:06.663605 systemd[1]: Started sshd@9-10.128.0.19:22-139.178.89.65:53856.service - OpenSSH per-connection server daemon (139.178.89.65:53856). Sep 9 05:45:07.001538 sshd[5360]: Accepted publickey for core from 139.178.89.65 port 53856 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw Sep 9 05:45:07.004070 sshd-session[5360]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:45:07.015250 systemd-logind[1555]: New session 10 of user core. Sep 9 05:45:07.023456 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 9 05:45:07.409682 sshd[5363]: Connection closed by 139.178.89.65 port 53856 Sep 9 05:45:07.411458 sshd-session[5360]: pam_unix(sshd:session): session closed for user core Sep 9 05:45:07.420387 systemd-logind[1555]: Session 10 logged out. Waiting for processes to exit. Sep 9 05:45:07.421512 systemd[1]: sshd@9-10.128.0.19:22-139.178.89.65:53856.service: Deactivated successfully. Sep 9 05:45:07.427029 systemd[1]: session-10.scope: Deactivated successfully. Sep 9 05:45:07.432118 systemd-logind[1555]: Removed session 10. 
Sep 9 05:45:08.450949 kubelet[2811]: I0909 05:45:08.450335 2811 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 05:45:09.618092 containerd[1568]: time="2025-09-09T05:45:09.617742956Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\" id:\"ab7477a4b165f37943484ab5bb328bebb6853418d6c1efef6a28b61bb55d7d7a\" pid:5392 exited_at:{seconds:1757396709 nanos:617373871}"
Sep 9 05:45:12.467480 systemd[1]: Started sshd@10-10.128.0.19:22-139.178.89.65:42974.service - OpenSSH per-connection server daemon (139.178.89.65:42974).
Sep 9 05:45:12.798006 sshd[5402]: Accepted publickey for core from 139.178.89.65 port 42974 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:12.800784 sshd-session[5402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:12.809130 systemd-logind[1555]: New session 11 of user core.
Sep 9 05:45:12.817613 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 05:45:13.172076 sshd[5405]: Connection closed by 139.178.89.65 port 42974
Sep 9 05:45:13.173322 sshd-session[5402]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:13.182207 systemd[1]: sshd@10-10.128.0.19:22-139.178.89.65:42974.service: Deactivated successfully.
Sep 9 05:45:13.187477 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 05:45:13.190147 systemd-logind[1555]: Session 11 logged out. Waiting for processes to exit.
Sep 9 05:45:13.193545 systemd-logind[1555]: Removed session 11.
Sep 9 05:45:18.230529 systemd[1]: Started sshd@11-10.128.0.19:22-139.178.89.65:42976.service - OpenSSH per-connection server daemon (139.178.89.65:42976).
Sep 9 05:45:18.560301 sshd[5420]: Accepted publickey for core from 139.178.89.65 port 42976 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:18.562803 sshd-session[5420]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:18.574479 systemd-logind[1555]: New session 12 of user core.
Sep 9 05:45:18.582810 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 05:45:18.914034 sshd[5423]: Connection closed by 139.178.89.65 port 42976
Sep 9 05:45:18.916460 sshd-session[5420]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:18.925303 systemd[1]: sshd@11-10.128.0.19:22-139.178.89.65:42976.service: Deactivated successfully.
Sep 9 05:45:18.930015 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 05:45:18.933427 systemd-logind[1555]: Session 12 logged out. Waiting for processes to exit.
Sep 9 05:45:18.936283 systemd-logind[1555]: Removed session 12.
Sep 9 05:45:21.392670 containerd[1568]: time="2025-09-09T05:45:21.392614289Z" level=info msg="TaskExit event in podsandbox handler container_id:\"847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658\" id:\"a298e6ae3a345a20a8e98eab3b25ae4278206b659df94ac334546072ee4e7c2c\" pid:5447 exited_at:{seconds:1757396721 nanos:391939706}"
Sep 9 05:45:23.974523 systemd[1]: Started sshd@12-10.128.0.19:22-139.178.89.65:36588.service - OpenSSH per-connection server daemon (139.178.89.65:36588).
Sep 9 05:45:24.306160 sshd[5459]: Accepted publickey for core from 139.178.89.65 port 36588 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:24.309951 sshd-session[5459]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:24.321964 systemd-logind[1555]: New session 13 of user core.
Sep 9 05:45:24.328475 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 05:45:24.653685 sshd[5462]: Connection closed by 139.178.89.65 port 36588
Sep 9 05:45:24.654535 sshd-session[5459]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:24.665148 systemd[1]: sshd@12-10.128.0.19:22-139.178.89.65:36588.service: Deactivated successfully.
Sep 9 05:45:24.670499 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 05:45:24.676425 systemd-logind[1555]: Session 13 logged out. Waiting for processes to exit.
Sep 9 05:45:24.678370 systemd-logind[1555]: Removed session 13.
Sep 9 05:45:24.712020 systemd[1]: Started sshd@13-10.128.0.19:22-139.178.89.65:36592.service - OpenSSH per-connection server daemon (139.178.89.65:36592).
Sep 9 05:45:25.046410 sshd[5475]: Accepted publickey for core from 139.178.89.65 port 36592 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:25.050128 sshd-session[5475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:25.062337 systemd-logind[1555]: New session 14 of user core.
Sep 9 05:45:25.067316 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 05:45:25.461338 sshd[5478]: Connection closed by 139.178.89.65 port 36592
Sep 9 05:45:25.460585 sshd-session[5475]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:25.471643 systemd[1]: sshd@13-10.128.0.19:22-139.178.89.65:36592.service: Deactivated successfully.
Sep 9 05:45:25.479584 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 05:45:25.481517 systemd-logind[1555]: Session 14 logged out. Waiting for processes to exit.
Sep 9 05:45:25.485613 systemd-logind[1555]: Removed session 14.
Sep 9 05:45:25.519900 systemd[1]: Started sshd@14-10.128.0.19:22-139.178.89.65:36596.service - OpenSSH per-connection server daemon (139.178.89.65:36596).
Sep 9 05:45:25.853919 sshd[5488]: Accepted publickey for core from 139.178.89.65 port 36596 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:25.856764 sshd-session[5488]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:25.869512 systemd-logind[1555]: New session 15 of user core.
Sep 9 05:45:25.877390 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 05:45:26.210549 sshd[5491]: Connection closed by 139.178.89.65 port 36596
Sep 9 05:45:26.212577 sshd-session[5488]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:26.221257 systemd-logind[1555]: Session 15 logged out. Waiting for processes to exit.
Sep 9 05:45:26.222838 systemd[1]: sshd@14-10.128.0.19:22-139.178.89.65:36596.service: Deactivated successfully.
Sep 9 05:45:26.228477 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 05:45:26.234797 systemd-logind[1555]: Removed session 15.
Sep 9 05:45:31.272120 systemd[1]: Started sshd@15-10.128.0.19:22-139.178.89.65:50726.service - OpenSSH per-connection server daemon (139.178.89.65:50726).
Sep 9 05:45:31.606988 sshd[5508]: Accepted publickey for core from 139.178.89.65 port 50726 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:31.609913 sshd-session[5508]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:31.618999 systemd-logind[1555]: New session 16 of user core.
Sep 9 05:45:31.628650 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 05:45:32.051267 sshd[5511]: Connection closed by 139.178.89.65 port 50726
Sep 9 05:45:32.050053 sshd-session[5508]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:32.060830 systemd[1]: sshd@15-10.128.0.19:22-139.178.89.65:50726.service: Deactivated successfully.
Sep 9 05:45:32.067464 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 05:45:32.070437 systemd-logind[1555]: Session 16 logged out. Waiting for processes to exit.
Sep 9 05:45:32.074867 systemd-logind[1555]: Removed session 16.
Sep 9 05:45:33.857693 containerd[1568]: time="2025-09-09T05:45:33.857634731Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"935148ff17b3a5f130a31acbb1bb1bcc52ba2910cc6402f10ae4ae7393681447\" pid:5534 exited_at:{seconds:1757396733 nanos:856223245}"
Sep 9 05:45:37.109415 systemd[1]: Started sshd@16-10.128.0.19:22-139.178.89.65:50742.service - OpenSSH per-connection server daemon (139.178.89.65:50742).
Sep 9 05:45:37.447962 sshd[5545]: Accepted publickey for core from 139.178.89.65 port 50742 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:37.449590 sshd-session[5545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:37.461533 systemd-logind[1555]: New session 17 of user core.
Sep 9 05:45:37.469429 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 05:45:37.867883 sshd[5548]: Connection closed by 139.178.89.65 port 50742
Sep 9 05:45:37.869522 sshd-session[5545]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:37.878961 systemd[1]: sshd@16-10.128.0.19:22-139.178.89.65:50742.service: Deactivated successfully.
Sep 9 05:45:37.881631 systemd-logind[1555]: Session 17 logged out. Waiting for processes to exit.
Sep 9 05:45:37.885612 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 05:45:37.892021 systemd-logind[1555]: Removed session 17.
Sep 9 05:45:37.973060 containerd[1568]: time="2025-09-09T05:45:37.973003357Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"70d1681359678dc1964a7c24792fb9c58393e4a98dbf3e9718c5e75fc114dede\" pid:5561 exited_at:{seconds:1757396737 nanos:972657512}"
Sep 9 05:45:39.597054 containerd[1568]: time="2025-09-09T05:45:39.596780337Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\" id:\"2e9d8c24572eb6fececf965fd8d950262594aefc1070492e16ce6a61f7bf73df\" pid:5596 exited_at:{seconds:1757396739 nanos:595865651}"
Sep 9 05:45:40.479512 containerd[1568]: time="2025-09-09T05:45:40.479421200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\" id:\"e1672daedcd572d18bc1d76ac526ae2b6269ad3ada443fadeaa6895f0e8aaef5\" pid:5617 exited_at:{seconds:1757396740 nanos:479034375}"
Sep 9 05:45:42.927561 systemd[1]: Started sshd@17-10.128.0.19:22-139.178.89.65:47248.service - OpenSSH per-connection server daemon (139.178.89.65:47248).
Sep 9 05:45:43.263305 sshd[5627]: Accepted publickey for core from 139.178.89.65 port 47248 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:43.265082 sshd-session[5627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:43.276879 systemd-logind[1555]: New session 18 of user core.
Sep 9 05:45:43.282637 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 05:45:43.626016 sshd[5630]: Connection closed by 139.178.89.65 port 47248
Sep 9 05:45:43.626521 sshd-session[5627]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:43.637327 systemd[1]: sshd@17-10.128.0.19:22-139.178.89.65:47248.service: Deactivated successfully.
Sep 9 05:45:43.642416 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 05:45:43.646171 systemd-logind[1555]: Session 18 logged out. Waiting for processes to exit.
Sep 9 05:45:43.648743 systemd-logind[1555]: Removed session 18.
Sep 9 05:45:48.691528 systemd[1]: Started sshd@18-10.128.0.19:22-139.178.89.65:47264.service - OpenSSH per-connection server daemon (139.178.89.65:47264).
Sep 9 05:45:49.021557 sshd[5650]: Accepted publickey for core from 139.178.89.65 port 47264 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:49.025385 sshd-session[5650]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:49.041801 systemd-logind[1555]: New session 19 of user core.
Sep 9 05:45:49.045404 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 05:45:49.416120 sshd[5653]: Connection closed by 139.178.89.65 port 47264
Sep 9 05:45:49.417605 sshd-session[5650]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:49.427753 systemd[1]: sshd@18-10.128.0.19:22-139.178.89.65:47264.service: Deactivated successfully.
Sep 9 05:45:49.432400 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 05:45:49.435140 systemd-logind[1555]: Session 19 logged out. Waiting for processes to exit.
Sep 9 05:45:49.438432 systemd-logind[1555]: Removed session 19.
Sep 9 05:45:49.477991 systemd[1]: Started sshd@19-10.128.0.19:22-139.178.89.65:47276.service - OpenSSH per-connection server daemon (139.178.89.65:47276).
Sep 9 05:45:49.802171 sshd[5666]: Accepted publickey for core from 139.178.89.65 port 47276 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:49.805262 sshd-session[5666]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:49.817811 systemd-logind[1555]: New session 20 of user core.
Sep 9 05:45:49.824657 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 05:45:50.235338 sshd[5669]: Connection closed by 139.178.89.65 port 47276
Sep 9 05:45:50.235140 sshd-session[5666]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:50.251030 systemd[1]: sshd@19-10.128.0.19:22-139.178.89.65:47276.service: Deactivated successfully.
Sep 9 05:45:50.257424 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 05:45:50.264579 systemd-logind[1555]: Session 20 logged out. Waiting for processes to exit.
Sep 9 05:45:50.267237 systemd-logind[1555]: Removed session 20.
Sep 9 05:45:50.300558 systemd[1]: Started sshd@20-10.128.0.19:22-139.178.89.65:33194.service - OpenSSH per-connection server daemon (139.178.89.65:33194).
Sep 9 05:45:50.645021 sshd[5681]: Accepted publickey for core from 139.178.89.65 port 33194 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:50.648788 sshd-session[5681]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:50.659952 systemd-logind[1555]: New session 21 of user core.
Sep 9 05:45:50.665640 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 9 05:45:51.555393 containerd[1568]: time="2025-09-09T05:45:51.555273581Z" level=info msg="TaskExit event in podsandbox handler container_id:\"847e1797d71f8ae6b2241861fb2ff6e8c236d141ebcd5027ad707c63d4efd658\" id:\"daa8385c5bc2fdc641bb68ee7749b261490bd1495a1506768c11cad941d3f692\" pid:5705 exited_at:{seconds:1757396751 nanos:554442670}"
Sep 9 05:45:51.913381 sshd[5684]: Connection closed by 139.178.89.65 port 33194
Sep 9 05:45:51.911949 sshd-session[5681]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:51.921133 systemd[1]: sshd@20-10.128.0.19:22-139.178.89.65:33194.service: Deactivated successfully.
Sep 9 05:45:51.921958 systemd-logind[1555]: Session 21 logged out. Waiting for processes to exit.
Sep 9 05:45:51.927122 systemd[1]: session-21.scope: Deactivated successfully.
Sep 9 05:45:51.936790 systemd-logind[1555]: Removed session 21.
Sep 9 05:45:51.972959 systemd[1]: Started sshd@21-10.128.0.19:22-139.178.89.65:33204.service - OpenSSH per-connection server daemon (139.178.89.65:33204).
Sep 9 05:45:52.314949 sshd[5724]: Accepted publickey for core from 139.178.89.65 port 33204 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:52.317822 sshd-session[5724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:52.329253 systemd-logind[1555]: New session 22 of user core.
Sep 9 05:45:52.333428 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 9 05:45:52.910788 sshd[5729]: Connection closed by 139.178.89.65 port 33204
Sep 9 05:45:52.912965 sshd-session[5724]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:52.930899 systemd[1]: sshd@21-10.128.0.19:22-139.178.89.65:33204.service: Deactivated successfully.
Sep 9 05:45:52.937484 systemd[1]: session-22.scope: Deactivated successfully.
Sep 9 05:45:52.942067 systemd-logind[1555]: Session 22 logged out. Waiting for processes to exit.
Sep 9 05:45:52.946857 systemd-logind[1555]: Removed session 22.
Sep 9 05:45:52.974347 systemd[1]: Started sshd@22-10.128.0.19:22-139.178.89.65:33206.service - OpenSSH per-connection server daemon (139.178.89.65:33206).
Sep 9 05:45:53.311857 sshd[5746]: Accepted publickey for core from 139.178.89.65 port 33206 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:53.314542 sshd-session[5746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:53.323247 systemd-logind[1555]: New session 23 of user core.
Sep 9 05:45:53.334656 systemd[1]: Started session-23.scope - Session 23 of User core.
Sep 9 05:45:53.654725 sshd[5749]: Connection closed by 139.178.89.65 port 33206
Sep 9 05:45:53.655829 sshd-session[5746]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:53.665355 systemd[1]: sshd@22-10.128.0.19:22-139.178.89.65:33206.service: Deactivated successfully.
Sep 9 05:45:53.669598 systemd[1]: session-23.scope: Deactivated successfully.
Sep 9 05:45:53.671837 systemd-logind[1555]: Session 23 logged out. Waiting for processes to exit.
Sep 9 05:45:53.675383 systemd-logind[1555]: Removed session 23.
Sep 9 05:45:58.715986 systemd[1]: Started sshd@23-10.128.0.19:22-139.178.89.65:33212.service - OpenSSH per-connection server daemon (139.178.89.65:33212).
Sep 9 05:45:59.051344 sshd[5775]: Accepted publickey for core from 139.178.89.65 port 33212 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:45:59.054299 sshd-session[5775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:45:59.069524 systemd-logind[1555]: New session 24 of user core.
Sep 9 05:45:59.076883 systemd[1]: Started session-24.scope - Session 24 of User core.
Sep 9 05:45:59.391769 sshd[5778]: Connection closed by 139.178.89.65 port 33212
Sep 9 05:45:59.392642 sshd-session[5775]: pam_unix(sshd:session): session closed for user core
Sep 9 05:45:59.405132 systemd[1]: sshd@23-10.128.0.19:22-139.178.89.65:33212.service: Deactivated successfully.
Sep 9 05:45:59.411021 systemd[1]: session-24.scope: Deactivated successfully.
Sep 9 05:45:59.413490 systemd-logind[1555]: Session 24 logged out. Waiting for processes to exit.
Sep 9 05:45:59.417547 systemd-logind[1555]: Removed session 24.
Sep 9 05:46:03.782257 containerd[1568]: time="2025-09-09T05:46:03.782198401Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17864bba112df883a28364bff6f07223fe1979d81a75b513e5dbdbecc9ec6271\" id:\"f4d19ba1b9795e4320d2a0ecb02637475d8277f1656e90a09c07fabec70beda2\" pid:5804 exited_at:{seconds:1757396763 nanos:780685130}"
Sep 9 05:46:04.451479 systemd[1]: Started sshd@24-10.128.0.19:22-139.178.89.65:59928.service - OpenSSH per-connection server daemon (139.178.89.65:59928).
Sep 9 05:46:04.781609 sshd[5815]: Accepted publickey for core from 139.178.89.65 port 59928 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:46:04.784123 sshd-session[5815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:46:04.798055 systemd-logind[1555]: New session 25 of user core.
Sep 9 05:46:04.806003 systemd[1]: Started session-25.scope - Session 25 of User core.
Sep 9 05:46:05.155606 sshd[5820]: Connection closed by 139.178.89.65 port 59928
Sep 9 05:46:05.157353 sshd-session[5815]: pam_unix(sshd:session): session closed for user core
Sep 9 05:46:05.167347 systemd[1]: sshd@24-10.128.0.19:22-139.178.89.65:59928.service: Deactivated successfully.
Sep 9 05:46:05.172572 systemd[1]: session-25.scope: Deactivated successfully.
Sep 9 05:46:05.175055 systemd-logind[1555]: Session 25 logged out. Waiting for processes to exit.
Sep 9 05:46:05.178536 systemd-logind[1555]: Removed session 25.
Sep 9 05:46:09.604344 containerd[1568]: time="2025-09-09T05:46:09.604165798Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e4321099ab15f1cd7e4f2b22fcd6e288efc9e2d9dde3f6990c2f70c3e715a7d0\" id:\"7bb43825bc019b7e4b2a4907eefb8fc31badd69b761e78d49053c08f2f19a441\" pid:5844 exited_at:{seconds:1757396769 nanos:603762401}"
Sep 9 05:46:10.217548 systemd[1]: Started sshd@25-10.128.0.19:22-139.178.89.65:50824.service - OpenSSH per-connection server daemon (139.178.89.65:50824).
Sep 9 05:46:10.561532 sshd[5854]: Accepted publickey for core from 139.178.89.65 port 50824 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:46:10.563389 sshd-session[5854]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:46:10.573156 systemd-logind[1555]: New session 26 of user core.
Sep 9 05:46:10.582405 systemd[1]: Started session-26.scope - Session 26 of User core.
Sep 9 05:46:10.956245 sshd[5858]: Connection closed by 139.178.89.65 port 50824
Sep 9 05:46:10.957003 sshd-session[5854]: pam_unix(sshd:session): session closed for user core
Sep 9 05:46:10.967104 systemd[1]: sshd@25-10.128.0.19:22-139.178.89.65:50824.service: Deactivated successfully.
Sep 9 05:46:10.973342 systemd[1]: session-26.scope: Deactivated successfully.
Sep 9 05:46:10.976000 systemd-logind[1555]: Session 26 logged out. Waiting for processes to exit.
Sep 9 05:46:10.981712 systemd-logind[1555]: Removed session 26.
Sep 9 05:46:16.014548 systemd[1]: Started sshd@26-10.128.0.19:22-139.178.89.65:50828.service - OpenSSH per-connection server daemon (139.178.89.65:50828).
Sep 9 05:46:16.342611 sshd[5872]: Accepted publickey for core from 139.178.89.65 port 50828 ssh2: RSA SHA256:QSDpUihtIai1/X8svdSqOld/LKc/E5lpY4TpkeXfmcw
Sep 9 05:46:16.344230 sshd-session[5872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:46:16.353161 systemd-logind[1555]: New session 27 of user core.
Sep 9 05:46:16.361437 systemd[1]: Started session-27.scope - Session 27 of User core.
Sep 9 05:46:16.732322 sshd[5875]: Connection closed by 139.178.89.65 port 50828
Sep 9 05:46:16.733277 sshd-session[5872]: pam_unix(sshd:session): session closed for user core
Sep 9 05:46:16.741941 systemd[1]: sshd@26-10.128.0.19:22-139.178.89.65:50828.service: Deactivated successfully.
Sep 9 05:46:16.747510 systemd[1]: session-27.scope: Deactivated successfully.
Sep 9 05:46:16.752335 systemd-logind[1555]: Session 27 logged out. Waiting for processes to exit.
Sep 9 05:46:16.754826 systemd-logind[1555]: Removed session 27.