Mar 6 02:59:23.146133 kernel: Linux version 6.12.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu Mar 5 23:16:40 -00 2026
Mar 6 02:59:23.146181 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=5bef16c10382b6f77f9493af2297475832ff2f09f1ada4155425ad9b32dd6e53
Mar 6 02:59:23.146207 kernel: BIOS-provided physical RAM map:
Mar 6 02:59:23.146223 kernel: BIOS-e820: [mem 0x0000000000000000-0x0000000000000fff] reserved
Mar 6 02:59:23.146246 kernel: BIOS-e820: [mem 0x0000000000001000-0x0000000000054fff] usable
Mar 6 02:59:23.146262 kernel: BIOS-e820: [mem 0x0000000000055000-0x000000000005ffff] reserved
Mar 6 02:59:23.146282 kernel: BIOS-e820: [mem 0x0000000000060000-0x0000000000097fff] usable
Mar 6 02:59:23.146299 kernel: BIOS-e820: [mem 0x0000000000098000-0x000000000009ffff] reserved
Mar 6 02:59:23.146315 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bd2e4fff] usable
Mar 6 02:59:23.146337 kernel: BIOS-e820: [mem 0x00000000bd2e5000-0x00000000bd2eefff] ACPI data
Mar 6 02:59:23.146354 kernel: BIOS-e820: [mem 0x00000000bd2ef000-0x00000000bf8ecfff] usable
Mar 6 02:59:23.146370 kernel: BIOS-e820: [mem 0x00000000bf8ed000-0x00000000bfb6cfff] reserved
Mar 6 02:59:23.146387 kernel: BIOS-e820: [mem 0x00000000bfb6d000-0x00000000bfb7efff] ACPI data
Mar 6 02:59:23.146404 kernel: BIOS-e820: [mem 0x00000000bfb7f000-0x00000000bfbfefff] ACPI NVS
Mar 6 02:59:23.146425 kernel: BIOS-e820: [mem 0x00000000bfbff000-0x00000000bffdffff] usable
Mar 6 02:59:23.146471 kernel: BIOS-e820: [mem 0x00000000bffe0000-0x00000000bfffffff] reserved
Mar 6 02:59:23.146486 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000021fffffff] usable
Mar 6 02:59:23.146501 kernel: NX (Execute Disable) protection: active
Mar 6 02:59:23.146516 kernel: APIC: Static calls initialized
Mar 6 02:59:23.146532 kernel: efi: EFI v2.7 by EDK II
Mar 6 02:59:23.146549 kernel: efi: TPMFinalLog=0xbfbf7000 ACPI=0xbfb7e000 ACPI 2.0=0xbfb7e014 SMBIOS=0xbf9ca000 MEMATTR=0xbd2ef018 RNG=0xbfb73018 TPMEventLog=0xbd2e5018
Mar 6 02:59:23.146566 kernel: random: crng init done
Mar 6 02:59:23.146581 kernel: secureboot: Secure boot disabled
Mar 6 02:59:23.146599 kernel: SMBIOS 2.4 present.
Mar 6 02:59:23.146616 kernel: DMI: Google Google Compute Engine/Google Compute Engine, BIOS Google 02/12/2026
Mar 6 02:59:23.146638 kernel: DMI: Memory slots populated: 1/1
Mar 6 02:59:23.146653 kernel: Hypervisor detected: KVM
Mar 6 02:59:23.146668 kernel: last_pfn = 0xbffe0 max_arch_pfn = 0x400000000
Mar 6 02:59:23.146683 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 6 02:59:23.146700 kernel: kvm-clock: using sched offset of 15493773183 cycles
Mar 6 02:59:23.146717 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 6 02:59:23.146733 kernel: tsc: Detected 2299.998 MHz processor
Mar 6 02:59:23.146750 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 6 02:59:23.146767 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 6 02:59:23.146783 kernel: last_pfn = 0x220000 max_arch_pfn = 0x400000000
Mar 6 02:59:23.146806 kernel: MTRR map: 3 entries (2 fixed + 1 variable; max 18), built from 8 variable MTRRs
Mar 6 02:59:23.146822 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 6 02:59:23.146839 kernel: last_pfn = 0xbffe0 max_arch_pfn = 0x400000000
Mar 6 02:59:23.146876 kernel: Using GB pages for direct mapping
Mar 6 02:59:23.146893 kernel: ACPI: Early table checksum verification disabled
Mar 6 02:59:23.146918 kernel: ACPI: RSDP 0x00000000BFB7E014 000024 (v02 Google)
Mar 6 02:59:23.146935 kernel: ACPI: XSDT 0x00000000BFB7D0E8 00005C (v01 Google GOOGFACP 00000001 01000013)
Mar 6 02:59:23.146956 kernel: ACPI: FACP 0x00000000BFB78000 0000F4 (v02 Google GOOGFACP 00000001 GOOG 00000001)
Mar 6 02:59:23.146973 kernel: ACPI: DSDT 0x00000000BFB79000 001A64 (v01 Google GOOGDSDT 00000001 GOOG 00000001)
Mar 6 02:59:23.146990 kernel: ACPI: FACS 0x00000000BFBF2000 000040
Mar 6 02:59:23.147009 kernel: ACPI: SSDT 0x00000000BFB7C000 000316 (v02 GOOGLE Tpm2Tabl 00001000 INTL 20250404)
Mar 6 02:59:23.147028 kernel: ACPI: TPM2 0x00000000BFB7B000 000034 (v04 GOOGLE 00000001 GOOG 00000001)
Mar 6 02:59:23.147048 kernel: ACPI: SRAT 0x00000000BFB77000 0000C8 (v03 Google GOOGSRAT 00000001 GOOG 00000001)
Mar 6 02:59:23.147067 kernel: ACPI: APIC 0x00000000BFB76000 000076 (v05 Google GOOGAPIC 00000001 GOOG 00000001)
Mar 6 02:59:23.147092 kernel: ACPI: SSDT 0x00000000BFB75000 000980 (v01 Google GOOGSSDT 00000001 GOOG 00000001)
Mar 6 02:59:23.147112 kernel: ACPI: WAET 0x00000000BFB74000 000028 (v01 Google GOOGWAET 00000001 GOOG 00000001)
Mar 6 02:59:23.147132 kernel: ACPI: Reserving FACP table memory at [mem 0xbfb78000-0xbfb780f3]
Mar 6 02:59:23.147152 kernel: ACPI: Reserving DSDT table memory at [mem 0xbfb79000-0xbfb7aa63]
Mar 6 02:59:23.147172 kernel: ACPI: Reserving FACS table memory at [mem 0xbfbf2000-0xbfbf203f]
Mar 6 02:59:23.147191 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb7c000-0xbfb7c315]
Mar 6 02:59:23.147211 kernel: ACPI: Reserving TPM2 table memory at [mem 0xbfb7b000-0xbfb7b033]
Mar 6 02:59:23.147238 kernel: ACPI: Reserving SRAT table memory at [mem 0xbfb77000-0xbfb770c7]
Mar 6 02:59:23.147257 kernel: ACPI: Reserving APIC table memory at [mem 0xbfb76000-0xbfb76075]
Mar 6 02:59:23.147282 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb75000-0xbfb7597f]
Mar 6 02:59:23.147301 kernel: ACPI: Reserving WAET table memory at [mem 0xbfb74000-0xbfb74027]
Mar 6 02:59:23.147321 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Mar 6 02:59:23.147341 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0xbfffffff]
Mar 6 02:59:23.147361 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x21fffffff]
Mar 6 02:59:23.147381 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0xbfffffff] -> [mem 0x00001000-0xbfffffff]
Mar 6 02:59:23.147401 kernel: NUMA: Node 0 [mem 0x00001000-0xbfffffff] + [mem 0x100000000-0x21fffffff] -> [mem 0x00001000-0x21fffffff]
Mar 6 02:59:23.147421 kernel: NODE_DATA(0) allocated [mem 0x21fff6dc0-0x21fffdfff]
Mar 6 02:59:23.147458 kernel: Zone ranges:
Mar 6 02:59:23.147484 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 6 02:59:23.147504 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Mar 6 02:59:23.147524 kernel: Normal [mem 0x0000000100000000-0x000000021fffffff]
Mar 6 02:59:23.147543 kernel: Device empty
Mar 6 02:59:23.147562 kernel: Movable zone start for each node
Mar 6 02:59:23.147581 kernel: Early memory node ranges
Mar 6 02:59:23.147601 kernel: node 0: [mem 0x0000000000001000-0x0000000000054fff]
Mar 6 02:59:23.147621 kernel: node 0: [mem 0x0000000000060000-0x0000000000097fff]
Mar 6 02:59:23.147641 kernel: node 0: [mem 0x0000000000100000-0x00000000bd2e4fff]
Mar 6 02:59:23.147660 kernel: node 0: [mem 0x00000000bd2ef000-0x00000000bf8ecfff]
Mar 6 02:59:23.147684 kernel: node 0: [mem 0x00000000bfbff000-0x00000000bffdffff]
Mar 6 02:59:23.147704 kernel: node 0: [mem 0x0000000100000000-0x000000021fffffff]
Mar 6 02:59:23.147723 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000021fffffff]
Mar 6 02:59:23.147743 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 6 02:59:23.147762 kernel: On node 0, zone DMA: 11 pages in unavailable ranges
Mar 6 02:59:23.147781 kernel: On node 0, zone DMA: 104 pages in unavailable ranges
Mar 6 02:59:23.147801 kernel: On node 0, zone DMA32: 10 pages in unavailable ranges
Mar 6 02:59:23.147821 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Mar 6 02:59:23.147840 kernel: On node 0, zone Normal: 32 pages in unavailable ranges
Mar 6 02:59:23.147865 kernel: ACPI: PM-Timer IO Port: 0xb008
Mar 6 02:59:23.147885 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 6 02:59:23.147905 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 6 02:59:23.147925 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 6 02:59:23.147944 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 6 02:59:23.147964 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 6 02:59:23.147983 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 6 02:59:23.148003 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 6 02:59:23.148023 kernel: CPU topo: Max. logical packages: 1
Mar 6 02:59:23.148047 kernel: CPU topo: Max. logical dies: 1
Mar 6 02:59:23.148067 kernel: CPU topo: Max. dies per package: 1
Mar 6 02:59:23.148087 kernel: CPU topo: Max. threads per core: 2
Mar 6 02:59:23.148107 kernel: CPU topo: Num. cores per package: 1
Mar 6 02:59:23.148127 kernel: CPU topo: Num. threads per package: 2
Mar 6 02:59:23.148145 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Mar 6 02:59:23.148165 kernel: [mem 0xc0000000-0xffffffff] available for PCI devices
Mar 6 02:59:23.148185 kernel: Booting paravirtualized kernel on KVM
Mar 6 02:59:23.148205 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 6 02:59:23.148237 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 6 02:59:23.148257 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Mar 6 02:59:23.148277 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Mar 6 02:59:23.148296 kernel: pcpu-alloc: [0] 0 1
Mar 6 02:59:23.148315 kernel: kvm-guest: PV spinlocks enabled
Mar 6 02:59:23.148334 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Mar 6 02:59:23.148356 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=5bef16c10382b6f77f9493af2297475832ff2f09f1ada4155425ad9b32dd6e53
Mar 6 02:59:23.148376 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Mar 6 02:59:23.148402 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 6 02:59:23.148422 kernel: Fallback order for Node 0: 0
Mar 6 02:59:23.150618 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1965136
Mar 6 02:59:23.150647 kernel: Policy zone: Normal
Mar 6 02:59:23.150666 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 6 02:59:23.150684 kernel: software IO TLB: area num 2.
Mar 6 02:59:23.150725 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 6 02:59:23.150751 kernel: Kernel/User page tables isolation: enabled
Mar 6 02:59:23.150768 kernel: ftrace: allocating 40099 entries in 157 pages
Mar 6 02:59:23.150788 kernel: ftrace: allocated 157 pages with 5 groups
Mar 6 02:59:23.150806 kernel: Dynamic Preempt: voluntary
Mar 6 02:59:23.150824 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 6 02:59:23.150850 kernel: rcu: RCU event tracing is enabled.
Mar 6 02:59:23.150869 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 6 02:59:23.150890 kernel: Trampoline variant of Tasks RCU enabled.
Mar 6 02:59:23.150910 kernel: Rude variant of Tasks RCU enabled.
Mar 6 02:59:23.150929 kernel: Tracing variant of Tasks RCU enabled.
Mar 6 02:59:23.150955 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 6 02:59:23.150975 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 6 02:59:23.150993 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 6 02:59:23.151011 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 6 02:59:23.151031 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 6 02:59:23.151052 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 6 02:59:23.151072 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 6 02:59:23.151091 kernel: Console: colour dummy device 80x25
Mar 6 02:59:23.151110 kernel: printk: legacy console [ttyS0] enabled
Mar 6 02:59:23.151135 kernel: ACPI: Core revision 20240827
Mar 6 02:59:23.151155 kernel: APIC: Switch to symmetric I/O mode setup
Mar 6 02:59:23.151175 kernel: x2apic enabled
Mar 6 02:59:23.151193 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 6 02:59:23.151211 kernel: ..TIMER: vector=0x30 apic1=0 pin1=0 apic2=-1 pin2=-1
Mar 6 02:59:23.151239 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Mar 6 02:59:23.151258 kernel: Calibrating delay loop (skipped) preset value.. 4599.99 BogoMIPS (lpj=2299998)
Mar 6 02:59:23.151278 kernel: Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 1024
Mar 6 02:59:23.151298 kernel: Last level dTLB entries: 4KB 1024, 2MB 1024, 4MB 1024, 1GB 4
Mar 6 02:59:23.151324 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 6 02:59:23.151343 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Mar 6 02:59:23.151362 kernel: Spectre V2 : Mitigation: IBRS
Mar 6 02:59:23.151381 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Mar 6 02:59:23.151399 kernel: RETBleed: Mitigation: IBRS
Mar 6 02:59:23.151460 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 6 02:59:23.151482 kernel: Spectre V2 : User space: Mitigation: STIBP via prctl
Mar 6 02:59:23.151503 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 6 02:59:23.151524 kernel: MDS: Mitigation: Clear CPU buffers
Mar 6 02:59:23.151551 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Mar 6 02:59:23.151571 kernel: active return thunk: its_return_thunk
Mar 6 02:59:23.151592 kernel: ITS: Mitigation: Aligned branch/return thunks
Mar 6 02:59:23.151613 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 6 02:59:23.151634 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 6 02:59:23.151654 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 6 02:59:23.151674 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 6 02:59:23.151696 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Mar 6 02:59:23.151722 kernel: Freeing SMP alternatives memory: 32K
Mar 6 02:59:23.151743 kernel: pid_max: default: 32768 minimum: 301
Mar 6 02:59:23.151762 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Mar 6 02:59:23.151783 kernel: landlock: Up and running.
Mar 6 02:59:23.151804 kernel: SELinux: Initializing.
Mar 6 02:59:23.151825 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 6 02:59:23.151846 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 6 02:59:23.151866 kernel: smpboot: CPU0: Intel(R) Xeon(R) CPU @ 2.30GHz (family: 0x6, model: 0x3f, stepping: 0x0)
Mar 6 02:59:23.151886 kernel: Performance Events: unsupported p6 CPU model 63 no PMU driver, software events only.
Mar 6 02:59:23.151913 kernel: signal: max sigframe size: 1776
Mar 6 02:59:23.151933 kernel: rcu: Hierarchical SRCU implementation.
Mar 6 02:59:23.151955 kernel: rcu: Max phase no-delay instances is 400.
Mar 6 02:59:23.151976 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Mar 6 02:59:23.151996 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Mar 6 02:59:23.152016 kernel: smp: Bringing up secondary CPUs ...
Mar 6 02:59:23.152037 kernel: smpboot: x86: Booting SMP configuration:
Mar 6 02:59:23.152057 kernel: .... node #0, CPUs: #1
Mar 6 02:59:23.152079 kernel: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details.
Mar 6 02:59:23.152107 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Mar 6 02:59:23.152127 kernel: smp: Brought up 1 node, 2 CPUs
Mar 6 02:59:23.152148 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS)
Mar 6 02:59:23.152169 kernel: Memory: 7556056K/7860544K available (14336K kernel code, 2445K rwdata, 26064K rodata, 46196K init, 2564K bss, 298908K reserved, 0K cma-reserved)
Mar 6 02:59:23.152190 kernel: devtmpfs: initialized
Mar 6 02:59:23.152211 kernel: x86/mm: Memory block size: 128MB
Mar 6 02:59:23.152238 kernel: ACPI: PM: Registering ACPI NVS region [mem 0xbfb7f000-0xbfbfefff] (524288 bytes)
Mar 6 02:59:23.152259 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 6 02:59:23.152280 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 6 02:59:23.152305 kernel: pinctrl core: initialized pinctrl subsystem
Mar 6 02:59:23.152326 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 6 02:59:23.152347 kernel: audit: initializing netlink subsys (disabled)
Mar 6 02:59:23.152368 kernel: audit: type=2000 audit(1772765958.478:1): state=initialized audit_enabled=0 res=1
Mar 6 02:59:23.152388 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 6 02:59:23.152409 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 6 02:59:23.152430 kernel: cpuidle: using governor menu
Mar 6 02:59:23.152508 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 6 02:59:23.152528 kernel: dca service started, version 1.12.1
Mar 6 02:59:23.152553 kernel: PCI: Using configuration type 1 for base access
Mar 6 02:59:23.152573 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 6 02:59:23.152594 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 6 02:59:23.152615 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 6 02:59:23.152636 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 6 02:59:23.152657 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 6 02:59:23.152678 kernel: ACPI: Added _OSI(Module Device)
Mar 6 02:59:23.152698 kernel: ACPI: Added _OSI(Processor Device)
Mar 6 02:59:23.152719 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 6 02:59:23.152746 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded
Mar 6 02:59:23.152767 kernel: ACPI: Interpreter enabled
Mar 6 02:59:23.152787 kernel: ACPI: PM: (supports S0 S3 S5)
Mar 6 02:59:23.152807 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 6 02:59:23.152827 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 6 02:59:23.152848 kernel: PCI: Ignoring E820 reservations for host bridge windows
Mar 6 02:59:23.152870 kernel: ACPI: Enabled 16 GPEs in block 00 to 0F
Mar 6 02:59:23.152890 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 6 02:59:23.153284 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Mar 6 02:59:23.153615 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Mar 6 02:59:23.153848 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Mar 6 02:59:23.153873 kernel: PCI host bridge to bus 0000:00
Mar 6 02:59:23.154096 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 6 02:59:23.154317 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 6 02:59:23.154554 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 6 02:59:23.154771 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfefff window]
Mar 6 02:59:23.154976 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 6 02:59:23.155224 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Mar 6 02:59:23.155512 kernel: pci 0000:00:01.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Mar 6 02:59:23.155752 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Mar 6 02:59:23.155977 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI
Mar 6 02:59:23.156224 kernel: pci 0000:00:03.0: [1af4:1004] type 00 class 0x000000 conventional PCI endpoint
Mar 6 02:59:23.156476 kernel: pci 0000:00:03.0: BAR 0 [io 0xc040-0xc07f]
Mar 6 02:59:23.156706 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc0001000-0xc000107f]
Mar 6 02:59:23.156941 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Mar 6 02:59:23.157172 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc03f]
Mar 6 02:59:23.157414 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc0000000-0xc000007f]
Mar 6 02:59:23.157687 kernel: pci 0000:00:05.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Mar 6 02:59:23.157914 kernel: pci 0000:00:05.0: BAR 0 [io 0xc080-0xc09f]
Mar 6 02:59:23.158141 kernel: pci 0000:00:05.0: BAR 1 [mem 0xc0002000-0xc000203f]
Mar 6 02:59:23.158166 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 6 02:59:23.158187 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 6 02:59:23.158208 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 6 02:59:23.158236 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 6 02:59:23.158258 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Mar 6 02:59:23.158285 kernel: iommu: Default domain type: Translated
Mar 6 02:59:23.158306 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 6 02:59:23.158327 kernel: efivars: Registered efivars operations
Mar 6 02:59:23.158347 kernel: PCI: Using ACPI for IRQ routing
Mar 6 02:59:23.158368 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 6 02:59:23.158388 kernel: e820: reserve RAM buffer [mem 0x00055000-0x0005ffff]
Mar 6 02:59:23.158409 kernel: e820: reserve RAM buffer [mem 0x00098000-0x0009ffff]
Mar 6 02:59:23.158429 kernel: e820: reserve RAM buffer [mem 0xbd2e5000-0xbfffffff]
Mar 6 02:59:23.158467 kernel: e820: reserve RAM buffer [mem 0xbf8ed000-0xbfffffff]
Mar 6 02:59:23.158492 kernel: e820: reserve RAM buffer [mem 0xbffe0000-0xbfffffff]
Mar 6 02:59:23.158513 kernel: vgaarb: loaded
Mar 6 02:59:23.158534 kernel: clocksource: Switched to clocksource kvm-clock
Mar 6 02:59:23.158555 kernel: VFS: Disk quotas dquot_6.6.0
Mar 6 02:59:23.158575 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 6 02:59:23.158596 kernel: pnp: PnP ACPI init
Mar 6 02:59:23.158616 kernel: pnp: PnP ACPI: found 7 devices
Mar 6 02:59:23.158638 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 6 02:59:23.158658 kernel: NET: Registered PF_INET protocol family
Mar 6 02:59:23.158679 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 6 02:59:23.158705 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Mar 6 02:59:23.158726 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 6 02:59:23.158747 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 6 02:59:23.158768 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Mar 6 02:59:23.158789 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Mar 6 02:59:23.158810 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Mar 6 02:59:23.158831 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Mar 6 02:59:23.158851 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 6 02:59:23.158876 kernel: NET: Registered PF_XDP protocol family
Mar 6 02:59:23.159094 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 6 02:59:23.159311 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 6 02:59:23.159545 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 6 02:59:23.159755 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfefff window]
Mar 6 02:59:23.159988 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Mar 6 02:59:23.160015 kernel: PCI: CLS 0 bytes, default 64
Mar 6 02:59:23.160043 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Mar 6 02:59:23.160064 kernel: software IO TLB: mapped [mem 0x00000000b7f7f000-0x00000000bbf7f000] (64MB)
Mar 6 02:59:23.160085 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Mar 6 02:59:23.160107 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Mar 6 02:59:23.160129 kernel: clocksource: Switched to clocksource tsc
Mar 6 02:59:23.160149 kernel: Initialise system trusted keyrings
Mar 6 02:59:23.160170 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Mar 6 02:59:23.160191 kernel: Key type asymmetric registered
Mar 6 02:59:23.160212 kernel: Asymmetric key parser 'x509' registered
Mar 6 02:59:23.160244 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 6 02:59:23.160265 kernel: io scheduler mq-deadline registered
Mar 6 02:59:23.160285 kernel: io scheduler kyber registered
Mar 6 02:59:23.160306 kernel: io scheduler bfq registered
Mar 6 02:59:23.160327 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 6 02:59:23.160349 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Mar 6 02:59:23.160600 kernel: virtio-pci 0000:00:03.0: virtio_pci: leaving for legacy driver
Mar 6 02:59:23.160627 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 10
Mar 6 02:59:23.160856 kernel: virtio-pci 0000:00:04.0: virtio_pci: leaving for legacy driver
Mar 6 02:59:23.160887 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Mar 6 02:59:23.161114 kernel: virtio-pci 0000:00:05.0: virtio_pci: leaving for legacy driver
Mar 6 02:59:23.161135 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 6 02:59:23.161152 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 6 02:59:23.161170 kernel: 00:04: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Mar 6 02:59:23.161188 kernel: 00:05: ttyS2 at I/O 0x3e8 (irq = 6, base_baud = 115200) is a 16550A
Mar 6 02:59:23.161208 kernel: 00:06: ttyS3 at I/O 0x2e8 (irq = 7, base_baud = 115200) is a 16550A
Mar 6 02:59:23.161485 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x9009, rev-id 0)
Mar 6 02:59:23.161525 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 6 02:59:23.161547 kernel: i8042: Warning: Keylock active
Mar 6 02:59:23.161563 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 6 02:59:23.162497 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Mar 6 02:59:23.162786 kernel: rtc_cmos 00:00: RTC can wake from S4
Mar 6 02:59:23.165679 kernel: rtc_cmos 00:00: registered as rtc0
Mar 6 02:59:23.165910 kernel: rtc_cmos 00:00: setting system clock to 2026-03-06T02:59:22 UTC (1772765962)
Mar 6 02:59:23.166122 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram
Mar 6 02:59:23.166155 kernel: intel_pstate: CPU model not supported
Mar 6 02:59:23.166177 kernel: pstore: Using crash dump compression: deflate
Mar 6 02:59:23.166198 kernel: pstore: Registered efi_pstore as persistent store backend
Mar 6 02:59:23.166220 kernel: NET: Registered PF_INET6 protocol family
Mar 6 02:59:23.166249 kernel: Segment Routing with IPv6
Mar 6 02:59:23.166270 kernel: In-situ OAM (IOAM) with IPv6
Mar 6 02:59:23.166291 kernel: NET: Registered PF_PACKET protocol family
Mar 6 02:59:23.166312 kernel: Key type dns_resolver registered
Mar 6 02:59:23.166334 kernel: IPI shorthand broadcast: enabled
Mar 6 02:59:23.166360 kernel: sched_clock: Marking stable (3899004059, 149236289)->(4298420482, -250180134)
Mar 6 02:59:23.166381 kernel: registered taskstats version 1
Mar 6 02:59:23.166402 kernel: Loading compiled-in X.509 certificates
Mar 6 02:59:23.166423 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.74-flatcar: 30893fe9fd219d26109af079e6493e1c8b1c00af'
Mar 6 02:59:23.166459 kernel: Demotion targets for Node 0: null
Mar 6 02:59:23.166480 kernel: Key type .fscrypt registered
Mar 6 02:59:23.166500 kernel: Key type fscrypt-provisioning registered
Mar 6 02:59:23.166521 kernel: ima: Allocated hash algorithm: sha1
Mar 6 02:59:23.166542 kernel: ima: No architecture policies found
Mar 6 02:59:23.166568 kernel: clk: Disabling unused clocks
Mar 6 02:59:23.166590 kernel: Warning: unable to open an initial console.
Mar 6 02:59:23.166611 kernel: Freeing unused kernel image (initmem) memory: 46196K
Mar 6 02:59:23.166649 kernel: Write protecting the kernel read-only data: 40960k
Mar 6 02:59:23.166670 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Mar 6 02:59:23.166692 kernel: Freeing unused kernel image (rodata/data gap) memory: 560K
Mar 6 02:59:23.166713 kernel: Run /init as init process
Mar 6 02:59:23.166734 kernel: with arguments:
Mar 6 02:59:23.166755 kernel: /init
Mar 6 02:59:23.166780 kernel: with environment:
Mar 6 02:59:23.166801 kernel: HOME=/
Mar 6 02:59:23.166822 kernel: TERM=linux
Mar 6 02:59:23.166843 systemd[1]: Successfully made /usr/ read-only.
Mar 6 02:59:23.166888 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 6 02:59:23.166911 systemd[1]: Detected virtualization google.
Mar 6 02:59:23.166933 systemd[1]: Detected architecture x86-64.
Mar 6 02:59:23.166959 systemd[1]: Running in initrd.
Mar 6 02:59:23.166979 systemd[1]: No hostname configured, using default hostname.
Mar 6 02:59:23.167001 systemd[1]: Hostname set to .
Mar 6 02:59:23.167023 systemd[1]: Initializing machine ID from random generator.
Mar 6 02:59:23.167044 systemd[1]: Queued start job for default target initrd.target.
Mar 6 02:59:23.167066 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 02:59:23.167109 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:59:23.167137 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 6 02:59:23.167164 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 02:59:23.167192 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 6 02:59:23.167221 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 6 02:59:23.167254 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 6 02:59:23.167277 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 6 02:59:23.167306 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 02:59:23.167330 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 02:59:23.167353 systemd[1]: Reached target paths.target - Path Units.
Mar 6 02:59:23.167377 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 02:59:23.167400 systemd[1]: Reached target swap.target - Swaps.
Mar 6 02:59:23.167423 systemd[1]: Reached target timers.target - Timer Units.
Mar 6 02:59:23.168155 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 6 02:59:23.168181 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 6 02:59:23.168211 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 6 02:59:23.168350 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 6 02:59:23.168384 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 6 02:59:23.168407 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 6 02:59:23.168429 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 6 02:59:23.172011 systemd[1]: Reached target sockets.target - Socket Units. Mar 6 02:59:23.172036 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 6 02:59:23.172057 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 6 02:59:23.172079 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 6 02:59:23.172110 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Mar 6 02:59:23.172131 systemd[1]: Starting systemd-fsck-usr.service... Mar 6 02:59:23.172152 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 6 02:59:23.172174 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 6 02:59:23.172196 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 02:59:23.172218 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 6 02:59:23.172251 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 6 02:59:23.172272 systemd[1]: Finished systemd-fsck-usr.service. 
Mar 6 02:59:23.172294 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 6 02:59:23.172357 systemd-journald[192]: Collecting audit messages is disabled. Mar 6 02:59:23.172407 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 02:59:23.172430 systemd-journald[192]: Journal started Mar 6 02:59:23.172515 systemd-journald[192]: Runtime Journal (/run/log/journal/e4edf2540e364edd8cdd1233ae2f6328) is 8M, max 148.6M, 140.6M free. Mar 6 02:59:23.127493 systemd-modules-load[193]: Inserted module 'overlay' Mar 6 02:59:23.182468 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 6 02:59:23.182529 systemd[1]: Started systemd-journald.service - Journal Service. Mar 6 02:59:23.185699 kernel: Bridge firewalling registered Mar 6 02:59:23.184929 systemd-modules-load[193]: Inserted module 'br_netfilter' Mar 6 02:59:23.186143 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 6 02:59:23.191989 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 6 02:59:23.200993 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 6 02:59:23.211617 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 6 02:59:23.216791 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 6 02:59:23.221260 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 6 02:59:23.247620 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 6 02:59:23.251766 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 6 02:59:23.258061 systemd-tmpfiles[213]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Mar 6 02:59:23.260867 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 6 02:59:23.268912 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 6 02:59:23.274587 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 6 02:59:23.281009 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 6 02:59:23.315474 dracut-cmdline[230]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=5bef16c10382b6f77f9493af2297475832ff2f09f1ada4155425ad9b32dd6e53 Mar 6 02:59:23.349626 systemd-resolved[231]: Positive Trust Anchors: Mar 6 02:59:23.349652 systemd-resolved[231]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 6 02:59:23.349723 systemd-resolved[231]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 6 02:59:23.357193 systemd-resolved[231]: Defaulting to hostname 'linux'. Mar 6 02:59:23.361411 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 6 02:59:23.370707 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Mar 6 02:59:23.448476 kernel: SCSI subsystem initialized Mar 6 02:59:23.461465 kernel: Loading iSCSI transport class v2.0-870. Mar 6 02:59:23.473488 kernel: iscsi: registered transport (tcp) Mar 6 02:59:23.499483 kernel: iscsi: registered transport (qla4xxx) Mar 6 02:59:23.499570 kernel: QLogic iSCSI HBA Driver Mar 6 02:59:23.523590 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 6 02:59:23.543281 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 6 02:59:23.550525 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 6 02:59:23.613155 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 6 02:59:23.616550 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 6 02:59:23.679480 kernel: raid6: avx2x4 gen() 17973 MB/s Mar 6 02:59:23.696469 kernel: raid6: avx2x2 gen() 18008 MB/s Mar 6 02:59:23.713833 kernel: raid6: avx2x1 gen() 13870 MB/s Mar 6 02:59:23.713903 kernel: raid6: using algorithm avx2x2 gen() 18008 MB/s Mar 6 02:59:23.732023 kernel: raid6: .... xor() 18572 MB/s, rmw enabled Mar 6 02:59:23.732084 kernel: raid6: using avx2x2 recovery algorithm Mar 6 02:59:23.755487 kernel: xor: automatically using best checksumming function avx Mar 6 02:59:23.939491 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 6 02:59:23.948833 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 6 02:59:23.951573 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 6 02:59:23.988187 systemd-udevd[440]: Using default interface naming scheme 'v255'. Mar 6 02:59:23.997007 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 6 02:59:24.001357 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 6 02:59:24.034373 dracut-pre-trigger[447]: rd.md=0: removing MD RAID activation Mar 6 02:59:24.067607 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 6 02:59:24.074514 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 6 02:59:24.176217 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 6 02:59:24.183476 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 6 02:59:24.304785 kernel: virtio_scsi virtio0: 1/0/0 default/read/poll queues Mar 6 02:59:24.315523 kernel: scsi host0: Virtio SCSI HBA Mar 6 02:59:24.315634 kernel: blk-mq: reduced tag depth to 10240 Mar 6 02:59:24.323486 kernel: scsi 0:0:1:0: Direct-Access Google PersistentDisk 1 PQ: 0 ANSI: 6 Mar 6 02:59:24.323698 kernel: cryptd: max_cpu_qlen set to 1000 Mar 6 02:59:24.369486 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Mar 6 02:59:24.377484 kernel: AES CTR mode by8 optimization enabled Mar 6 02:59:24.431631 kernel: sd 0:0:1:0: [sda] 33554432 512-byte logical blocks: (17.2 GB/16.0 GiB) Mar 6 02:59:24.435649 kernel: sd 0:0:1:0: [sda] 4096-byte physical blocks Mar 6 02:59:24.414803 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 6 02:59:24.436715 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 02:59:24.454618 kernel: sd 0:0:1:0: [sda] Write Protect is off Mar 6 02:59:24.454940 kernel: sd 0:0:1:0: [sda] Mode Sense: 1f 00 00 08 Mar 6 02:59:24.455219 kernel: sd 0:0:1:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 6 02:59:24.448923 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 02:59:24.451176 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 02:59:24.458979 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Mar 6 02:59:24.471044 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 6 02:59:24.471082 kernel: GPT:17805311 != 33554431 Mar 6 02:59:24.471115 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 6 02:59:24.471138 kernel: GPT:17805311 != 33554431 Mar 6 02:59:24.471169 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 6 02:59:24.471192 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:59:24.471216 kernel: sd 0:0:1:0: [sda] Attached SCSI disk Mar 6 02:59:24.516312 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 02:59:24.582027 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - PersistentDisk ROOT. Mar 6 02:59:24.583291 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 6 02:59:24.601178 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - PersistentDisk EFI-SYSTEM. Mar 6 02:59:24.622840 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM. Mar 6 02:59:24.633904 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - PersistentDisk USR-A. Mar 6 02:59:24.634196 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - PersistentDisk USR-A. Mar 6 02:59:24.641658 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 6 02:59:24.646552 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 6 02:59:24.651562 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 6 02:59:24.656830 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 6 02:59:24.673646 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 6 02:59:24.684544 disk-uuid[595]: Primary Header is updated. Mar 6 02:59:24.684544 disk-uuid[595]: Secondary Entries is updated. Mar 6 02:59:24.684544 disk-uuid[595]: Secondary Header is updated. 
Mar 6 02:59:24.700474 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:59:24.701290 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 6 02:59:25.731572 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 6 02:59:25.731656 disk-uuid[596]: The operation has completed successfully. Mar 6 02:59:25.812415 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 6 02:59:25.812633 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 6 02:59:25.867764 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 6 02:59:25.896243 sh[617]: Success Mar 6 02:59:25.919996 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 6 02:59:25.920125 kernel: device-mapper: uevent: version 1.0.3 Mar 6 02:59:25.920157 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Mar 6 02:59:25.933466 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" Mar 6 02:59:26.015030 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 6 02:59:26.021558 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 6 02:59:26.042752 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 6 02:59:26.063472 kernel: BTRFS: device fsid 1235dd15-5252-4928-9c6c-372370c6bfca devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (629) Mar 6 02:59:26.066547 kernel: BTRFS info (device dm-0): first mount of filesystem 1235dd15-5252-4928-9c6c-372370c6bfca Mar 6 02:59:26.066610 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Mar 6 02:59:26.092973 kernel: BTRFS info (device dm-0 state E): enabling ssd optimizations Mar 6 02:59:26.093064 kernel: BTRFS info (device dm-0 state E): disabling log replay at mount time Mar 6 02:59:26.093090 kernel: BTRFS info (device dm-0 state E): enabling free space tree Mar 6 02:59:26.096648 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 6 02:59:26.097457 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Mar 6 02:59:26.102573 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 6 02:59:26.103829 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 6 02:59:26.113340 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 6 02:59:26.151504 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (660) Mar 6 02:59:26.154942 kernel: BTRFS info (device sda6): first mount of filesystem 993ea71e-e97d-4f5e-b5c7-fdac31a53b6b Mar 6 02:59:26.155012 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 6 02:59:26.166661 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 6 02:59:26.166738 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:59:26.166764 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:59:26.175513 kernel: BTRFS info (device sda6): last unmount of filesystem 993ea71e-e97d-4f5e-b5c7-fdac31a53b6b Mar 6 02:59:26.177310 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Mar 6 02:59:26.187638 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 6 02:59:26.274695 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 6 02:59:26.289269 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 6 02:59:26.427014 systemd-networkd[798]: lo: Link UP Mar 6 02:59:26.427597 systemd-networkd[798]: lo: Gained carrier Mar 6 02:59:26.430225 systemd-networkd[798]: Enumeration completed Mar 6 02:59:26.430883 systemd-networkd[798]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:59:26.430890 systemd-networkd[798]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 6 02:59:26.431166 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 6 02:59:26.434795 systemd-networkd[798]: eth0: Link UP Mar 6 02:59:26.435064 systemd-networkd[798]: eth0: Gained carrier Mar 6 02:59:26.435086 systemd-networkd[798]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:59:26.444522 systemd[1]: Reached target network.target - Network. Mar 6 02:59:26.447510 systemd-networkd[798]: eth0: Overlong DHCP hostname received, shortened from 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce.c.flatcar-212911.internal' to 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce' Mar 6 02:59:26.455712 ignition[728]: Ignition 2.22.0 Mar 6 02:59:26.447529 systemd-networkd[798]: eth0: DHCPv4 address 10.128.0.87/32, gateway 10.128.0.1 acquired from 169.254.169.254 Mar 6 02:59:26.455720 ignition[728]: Stage: fetch-offline Mar 6 02:59:26.458952 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Mar 6 02:59:26.455756 ignition[728]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:59:26.467490 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 6 02:59:26.455766 ignition[728]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Mar 6 02:59:26.455882 ignition[728]: parsed url from cmdline: "" Mar 6 02:59:26.455886 ignition[728]: no config URL provided Mar 6 02:59:26.455892 ignition[728]: reading system config file "/usr/lib/ignition/user.ign" Mar 6 02:59:26.455902 ignition[728]: no config at "/usr/lib/ignition/user.ign" Mar 6 02:59:26.455910 ignition[728]: failed to fetch config: resource requires networking Mar 6 02:59:26.456948 ignition[728]: Ignition finished successfully Mar 6 02:59:26.510043 ignition[808]: Ignition 2.22.0 Mar 6 02:59:26.510530 ignition[808]: Stage: fetch Mar 6 02:59:26.510764 ignition[808]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:59:26.510777 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Mar 6 02:59:26.510905 ignition[808]: parsed url from cmdline: "" Mar 6 02:59:26.510911 ignition[808]: no config URL provided Mar 6 02:59:26.510922 ignition[808]: reading system config file "/usr/lib/ignition/user.ign" Mar 6 02:59:26.510935 ignition[808]: no config at "/usr/lib/ignition/user.ign" Mar 6 02:59:26.524583 unknown[808]: fetched base config from "system" Mar 6 02:59:26.510974 ignition[808]: GET http://169.254.169.254/computeMetadata/v1/instance/attributes/user-data: attempt #1 Mar 6 02:59:26.524604 unknown[808]: fetched base config from "system" Mar 6 02:59:26.515822 ignition[808]: GET result: OK Mar 6 02:59:26.524614 unknown[808]: fetched user config from "gcp" Mar 6 02:59:26.516025 ignition[808]: parsing config with SHA512: 94de79a31e7dbc4a1e8a142d9ac291ade3d7282cc132f6a7996165400cb4be3c47c18f37be744076f11339cab657f8df3913abf66df3fa0b349aa9fe7d41b1e9 Mar 6 02:59:26.529007 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). 
Mar 6 02:59:26.525743 ignition[808]: fetch: fetch complete Mar 6 02:59:26.532068 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 6 02:59:26.525752 ignition[808]: fetch: fetch passed Mar 6 02:59:26.525819 ignition[808]: Ignition finished successfully Mar 6 02:59:26.574087 ignition[815]: Ignition 2.22.0 Mar 6 02:59:26.574104 ignition[815]: Stage: kargs Mar 6 02:59:26.574324 ignition[815]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:59:26.577918 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 6 02:59:26.574341 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Mar 6 02:59:26.581592 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 6 02:59:26.575539 ignition[815]: kargs: kargs passed Mar 6 02:59:26.575604 ignition[815]: Ignition finished successfully Mar 6 02:59:26.629510 ignition[822]: Ignition 2.22.0 Mar 6 02:59:26.629534 ignition[822]: Stage: disks Mar 6 02:59:26.629790 ignition[822]: no configs at "/usr/lib/ignition/base.d" Mar 6 02:59:26.633702 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 6 02:59:26.629808 ignition[822]: no config dir at "/usr/lib/ignition/base.platform.d/gcp" Mar 6 02:59:26.638043 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 6 02:59:26.631668 ignition[822]: disks: disks passed Mar 6 02:59:26.640788 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 6 02:59:26.631757 ignition[822]: Ignition finished successfully Mar 6 02:59:26.647748 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 6 02:59:26.650791 systemd[1]: Reached target sysinit.target - System Initialization. Mar 6 02:59:26.655845 systemd[1]: Reached target basic.target - Basic System. Mar 6 02:59:26.662249 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Mar 6 02:59:26.711855 systemd-fsck[831]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Mar 6 02:59:26.723602 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 6 02:59:26.728773 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 6 02:59:26.918455 kernel: EXT4-fs (sda9): mounted filesystem 16ab7223-a8af-43d2-ad40-7e1bf0ff2a89 r/w with ordered data mode. Quota mode: none. Mar 6 02:59:26.919950 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 6 02:59:26.924156 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 6 02:59:26.929036 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 6 02:59:26.943850 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 6 02:59:26.951185 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 6 02:59:26.951279 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 6 02:59:26.962513 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (839) Mar 6 02:59:26.951325 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 6 02:59:26.967747 kernel: BTRFS info (device sda6): first mount of filesystem 993ea71e-e97d-4f5e-b5c7-fdac31a53b6b Mar 6 02:59:26.967963 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 6 02:59:26.963028 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 6 02:59:26.972868 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Mar 6 02:59:26.980225 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 6 02:59:26.980276 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:59:26.980303 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:59:26.984216 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 6 02:59:27.098610 initrd-setup-root[865]: cut: /sysroot/etc/passwd: No such file or directory Mar 6 02:59:27.106861 initrd-setup-root[872]: cut: /sysroot/etc/group: No such file or directory Mar 6 02:59:27.114359 initrd-setup-root[879]: cut: /sysroot/etc/shadow: No such file or directory Mar 6 02:59:27.121579 initrd-setup-root[886]: cut: /sysroot/etc/gshadow: No such file or directory Mar 6 02:59:27.299152 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 6 02:59:27.306420 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 6 02:59:27.307739 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 6 02:59:27.338119 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 6 02:59:27.341940 kernel: BTRFS info (device sda6): last unmount of filesystem 993ea71e-e97d-4f5e-b5c7-fdac31a53b6b Mar 6 02:59:27.380000 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 6 02:59:27.391589 ignition[953]: INFO : Ignition 2.22.0 Mar 6 02:59:27.391589 ignition[953]: INFO : Stage: mount Mar 6 02:59:27.396148 ignition[953]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:59:27.396148 ignition[953]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Mar 6 02:59:27.396148 ignition[953]: INFO : mount: mount passed Mar 6 02:59:27.396148 ignition[953]: INFO : Ignition finished successfully Mar 6 02:59:27.395475 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 6 02:59:27.399902 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 6 02:59:27.427207 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Mar 6 02:59:27.461508 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (965) Mar 6 02:59:27.466068 kernel: BTRFS info (device sda6): first mount of filesystem 993ea71e-e97d-4f5e-b5c7-fdac31a53b6b Mar 6 02:59:27.466192 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 6 02:59:27.477755 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 6 02:59:27.477874 kernel: BTRFS info (device sda6): turning on async discard Mar 6 02:59:27.477915 kernel: BTRFS info (device sda6): enabling free space tree Mar 6 02:59:27.482610 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 6 02:59:27.523972 ignition[982]: INFO : Ignition 2.22.0 Mar 6 02:59:27.523972 ignition[982]: INFO : Stage: files Mar 6 02:59:27.530705 ignition[982]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 6 02:59:27.530705 ignition[982]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Mar 6 02:59:27.530705 ignition[982]: DEBUG : files: compiled without relabeling support, skipping Mar 6 02:59:27.530705 ignition[982]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 6 02:59:27.530705 ignition[982]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 6 02:59:27.546596 ignition[982]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 6 02:59:27.546596 ignition[982]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 6 02:59:27.546596 ignition[982]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 6 02:59:27.546596 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 6 02:59:27.546596 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Mar 6 02:59:27.536148 unknown[982]: 
wrote ssh authorized keys file for user: core Mar 6 02:59:27.663456 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 6 02:59:27.825751 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Mar 6 02:59:27.825751 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" 
-> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 6 02:59:27.835778 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.35.1-x86-64.raw: attempt #1 Mar 6 02:59:28.290762 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 6 02:59:28.336746 systemd-networkd[798]: eth0: Gained IPv6LL Mar 6 02:59:28.930994 ignition[982]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.35.1-x86-64.raw" Mar 6 02:59:28.930994 ignition[982]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 6 02:59:28.940596 ignition[982]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 6 02:59:28.940596 ignition[982]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 6 02:59:28.940596 ignition[982]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 6 02:59:28.940596 ignition[982]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Mar 6 02:59:28.940596 ignition[982]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Mar 6 02:59:28.940596 ignition[982]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 6 02:59:28.940596 ignition[982]: INFO : 
files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 6 02:59:28.940596 ignition[982]: INFO : files: files passed Mar 6 02:59:28.940596 ignition[982]: INFO : Ignition finished successfully Mar 6 02:59:28.940634 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 6 02:59:28.951042 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 6 02:59:28.963899 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 6 02:59:28.987142 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 6 02:59:28.987334 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 6 02:59:29.001474 initrd-setup-root-after-ignition[1012]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:59:29.001474 initrd-setup-root-after-ignition[1012]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:59:29.014659 initrd-setup-root-after-ignition[1016]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 6 02:59:29.007158 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 6 02:59:29.011536 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 6 02:59:29.019992 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 6 02:59:29.093698 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 6 02:59:29.094052 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 6 02:59:29.100471 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 6 02:59:29.102897 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 6 02:59:29.108062 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. 
Mar 6 02:59:29.110778 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 6 02:59:29.141648 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 6 02:59:29.145344 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 6 02:59:29.178084 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 6 02:59:29.182032 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 6 02:59:29.185856 systemd[1]: Stopped target timers.target - Timer Units.
Mar 6 02:59:29.191930 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 6 02:59:29.192172 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 6 02:59:29.201739 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 6 02:59:29.202342 systemd[1]: Stopped target basic.target - Basic System.
Mar 6 02:59:29.207137 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 6 02:59:29.211094 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 6 02:59:29.216166 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 6 02:59:29.222158 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Mar 6 02:59:29.227168 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 6 02:59:29.232071 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 6 02:59:29.237132 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 6 02:59:29.242231 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 6 02:59:29.251815 systemd[1]: Stopped target swap.target - Swaps.
Mar 6 02:59:29.252186 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 6 02:59:29.252637 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 6 02:59:29.262162 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 6 02:59:29.265086 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 02:59:29.270070 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 6 02:59:29.270541 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 02:59:29.275661 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 6 02:59:29.276098 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 6 02:59:29.291725 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 6 02:59:29.292166 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 6 02:59:29.296360 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 6 02:59:29.296642 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 6 02:59:29.303642 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 6 02:59:29.312707 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 6 02:59:29.313000 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:59:29.320338 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 6 02:59:29.329650 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 6 02:59:29.330420 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 02:59:29.340280 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 6 02:59:29.340753 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 6 02:59:29.363280 ignition[1036]: INFO : Ignition 2.22.0
Mar 6 02:59:29.363280 ignition[1036]: INFO : Stage: umount
Mar 6 02:59:29.363280 ignition[1036]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 6 02:59:29.363280 ignition[1036]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Mar 6 02:59:29.366723 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 6 02:59:29.384740 ignition[1036]: INFO : umount: umount passed
Mar 6 02:59:29.384740 ignition[1036]: INFO : Ignition finished successfully
Mar 6 02:59:29.366885 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 6 02:59:29.368321 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 6 02:59:29.368765 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 6 02:59:29.383407 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 6 02:59:29.384180 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 6 02:59:29.384294 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 6 02:59:29.392557 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 6 02:59:29.392714 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 6 02:59:29.395707 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 6 02:59:29.395777 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 6 02:59:29.400733 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 6 02:59:29.400850 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 6 02:59:29.406670 systemd[1]: Stopped target network.target - Network.
Mar 6 02:59:29.411793 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 6 02:59:29.412009 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 6 02:59:29.418761 systemd[1]: Stopped target paths.target - Path Units.
Mar 6 02:59:29.423619 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 6 02:59:29.427597 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:59:29.433630 systemd[1]: Stopped target slices.target - Slice Units.
Mar 6 02:59:29.437636 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 6 02:59:29.441734 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 6 02:59:29.441850 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 6 02:59:29.446517 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 6 02:59:29.446635 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 6 02:59:29.452684 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 6 02:59:29.452803 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 6 02:59:29.457754 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 6 02:59:29.457862 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 6 02:59:29.463661 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 6 02:59:29.463757 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 6 02:59:29.467016 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 6 02:59:29.472835 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 6 02:59:29.477945 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 6 02:59:29.478113 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 6 02:59:29.484514 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 6 02:59:29.484831 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 6 02:59:29.484949 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 6 02:59:29.493007 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 6 02:59:29.494464 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Mar 6 02:59:29.498704 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 6 02:59:29.498780 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:59:29.507051 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 6 02:59:29.514853 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 6 02:59:29.515198 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 6 02:59:29.522736 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 6 02:59:29.522837 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:59:29.528990 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 6 02:59:29.529169 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:59:29.530356 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 6 02:59:29.530610 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:59:29.538245 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:59:29.550762 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 6 02:59:29.550885 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 6 02:59:29.555042 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 6 02:59:29.555318 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 6 02:59:29.565381 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 6 02:59:29.565513 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:59:29.572862 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 6 02:59:29.573089 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:59:29.577857 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 6 02:59:29.578054 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 6 02:59:29.589993 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 6 02:59:29.590108 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 6 02:59:29.597878 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 6 02:59:29.598041 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 6 02:59:29.607311 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 6 02:59:29.618649 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Mar 6 02:59:29.618954 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:59:29.625901 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 6 02:59:29.626017 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:59:29.631651 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 6 02:59:29.738722 systemd-journald[192]: Received SIGTERM from PID 1 (systemd).
Mar 6 02:59:29.631843 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 6 02:59:29.641030 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Mar 6 02:59:29.641182 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Mar 6 02:59:29.641339 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 6 02:59:29.642232 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 6 02:59:29.642380 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 6 02:59:29.650513 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 6 02:59:29.650676 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 6 02:59:29.656284 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 6 02:59:29.662703 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 6 02:59:29.694266 systemd[1]: Switching root.
Mar 6 02:59:29.771601 systemd-journald[192]: Journal stopped
Mar 6 02:59:31.945207 kernel: SELinux: policy capability network_peer_controls=1
Mar 6 02:59:31.945281 kernel: SELinux: policy capability open_perms=1
Mar 6 02:59:31.945331 kernel: SELinux: policy capability extended_socket_class=1
Mar 6 02:59:31.945350 kernel: SELinux: policy capability always_check_network=0
Mar 6 02:59:31.945369 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 6 02:59:31.945393 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 6 02:59:31.945414 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 6 02:59:31.945450 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 6 02:59:31.945495 kernel: SELinux: policy capability userspace_initial_context=0
Mar 6 02:59:31.945515 kernel: audit: type=1403 audit(1772765970.344:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 6 02:59:31.945538 systemd[1]: Successfully loaded SELinux policy in 73.007ms.
Mar 6 02:59:31.945567 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.024ms.
Mar 6 02:59:31.945590 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 6 02:59:31.945610 systemd[1]: Detected virtualization google.
Mar 6 02:59:31.945636 systemd[1]: Detected architecture x86-64.
Mar 6 02:59:31.945656 systemd[1]: Detected first boot.
Mar 6 02:59:31.945677 systemd[1]: Initializing machine ID from random generator.
Mar 6 02:59:31.945698 zram_generator::config[1078]: No configuration found.
Mar 6 02:59:31.945720 kernel: Guest personality initialized and is inactive
Mar 6 02:59:31.945740 kernel: VMCI host device registered (name=vmci, major=10, minor=258)
Mar 6 02:59:31.945764 kernel: Initialized host personality
Mar 6 02:59:31.945784 kernel: NET: Registered PF_VSOCK protocol family
Mar 6 02:59:31.945810 systemd[1]: Populated /etc with preset unit settings.
Mar 6 02:59:31.945833 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 6 02:59:31.945853 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 6 02:59:31.945893 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 6 02:59:31.945930 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 6 02:59:31.945956 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 6 02:59:31.945977 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 6 02:59:31.946005 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 6 02:59:31.946027 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 6 02:59:31.946049 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 6 02:59:31.946070 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 6 02:59:31.946199 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 6 02:59:31.946234 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 6 02:59:31.946255 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 6 02:59:31.952541 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 6 02:59:31.952566 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 6 02:59:31.952589 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 6 02:59:31.952612 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 6 02:59:31.952644 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 6 02:59:31.952667 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 6 02:59:31.952690 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 6 02:59:31.952716 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 6 02:59:31.952739 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 6 02:59:31.952761 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 6 02:59:31.952782 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 6 02:59:31.952804 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 6 02:59:31.952825 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 6 02:59:31.952847 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 6 02:59:31.952873 systemd[1]: Reached target slices.target - Slice Units.
Mar 6 02:59:31.952895 systemd[1]: Reached target swap.target - Swaps.
Mar 6 02:59:31.952917 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 6 02:59:31.952939 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 6 02:59:31.952961 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 6 02:59:31.952985 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 6 02:59:31.953011 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 6 02:59:31.953034 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 6 02:59:31.953056 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 6 02:59:31.953079 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 6 02:59:31.953102 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 6 02:59:31.953124 systemd[1]: Mounting media.mount - External Media Directory...
Mar 6 02:59:31.953147 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 6 02:59:31.953173 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 6 02:59:31.953195 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 6 02:59:31.953218 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 6 02:59:31.953240 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 6 02:59:31.953263 systemd[1]: Reached target machines.target - Containers.
Mar 6 02:59:31.953286 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 6 02:59:31.953316 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 6 02:59:31.953338 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 6 02:59:31.953364 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 6 02:59:31.953386 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 6 02:59:31.953408 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 6 02:59:31.953430 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 6 02:59:31.953467 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 6 02:59:31.953489 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 6 02:59:31.953513 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 6 02:59:31.953535 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 6 02:59:31.953558 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 6 02:59:31.953584 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 6 02:59:31.953606 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 6 02:59:31.953629 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 6 02:59:31.953651 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 6 02:59:31.953673 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 6 02:59:31.953695 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 6 02:59:31.953717 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 6 02:59:31.953739 kernel: fuse: init (API version 7.41)
Mar 6 02:59:31.953764 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 6 02:59:31.953787 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 6 02:59:31.953809 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 6 02:59:31.953831 systemd[1]: Stopped verity-setup.service.
Mar 6 02:59:31.953853 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 6 02:59:31.953874 kernel: loop: module loaded
Mar 6 02:59:31.953895 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 6 02:59:31.953918 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 6 02:59:31.953940 systemd[1]: Mounted media.mount - External Media Directory.
Mar 6 02:59:31.953967 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 6 02:59:31.953989 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 6 02:59:31.954011 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 6 02:59:31.954033 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 6 02:59:31.954054 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 6 02:59:31.958970 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 6 02:59:31.959030 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 6 02:59:31.959055 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 6 02:59:31.959085 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 6 02:59:31.959108 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 6 02:59:31.959132 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 6 02:59:31.959154 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 6 02:59:31.959177 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 6 02:59:31.959199 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 6 02:59:31.959221 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 6 02:59:31.959244 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 6 02:59:31.959266 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 6 02:59:31.959293 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 6 02:59:31.959323 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 6 02:59:31.959347 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 6 02:59:31.959371 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 6 02:59:31.959406 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 6 02:59:31.959451 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 6 02:59:31.959485 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 6 02:59:31.959568 systemd-journald[1145]: Collecting audit messages is disabled.
Mar 6 02:59:31.959614 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 6 02:59:31.959640 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 6 02:59:31.959669 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 6 02:59:31.959690 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 6 02:59:31.959710 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 6 02:59:31.959734 systemd-journald[1145]: Journal started
Mar 6 02:59:31.959775 systemd-journald[1145]: Runtime Journal (/run/log/journal/e5ce27bf54984664a90a8b595eb95151) is 8M, max 148.6M, 140.6M free.
Mar 6 02:59:31.966518 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 6 02:59:31.327677 systemd[1]: Queued start job for default target multi-user.target.
Mar 6 02:59:31.354424 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 6 02:59:31.355152 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 6 02:59:31.983764 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 6 02:59:32.001470 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 6 02:59:32.001567 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 6 02:59:32.007355 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 6 02:59:32.010780 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 6 02:59:32.014002 kernel: ACPI: bus type drm_connector registered
Mar 6 02:59:32.022941 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 6 02:59:32.023250 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 6 02:59:32.041876 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 6 02:59:32.068934 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 6 02:59:32.073404 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 6 02:59:32.077513 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 6 02:59:32.086668 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 6 02:59:32.087687 kernel: loop0: detected capacity change from 0 to 50736
Mar 6 02:59:32.094651 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 6 02:59:32.169186 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 6 02:59:32.171383 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 6 02:59:32.183630 systemd-journald[1145]: Time spent on flushing to /var/log/journal/e5ce27bf54984664a90a8b595eb95151 is 87.727ms for 966 entries.
Mar 6 02:59:32.183630 systemd-journald[1145]: System Journal (/var/log/journal/e5ce27bf54984664a90a8b595eb95151) is 8M, max 584.8M, 576.8M free.
Mar 6 02:59:32.313657 systemd-journald[1145]: Received client request to flush runtime journal.
Mar 6 02:59:32.313725 kernel: loop1: detected capacity change from 0 to 217752
Mar 6 02:59:32.224563 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 6 02:59:32.244537 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 6 02:59:32.253965 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 6 02:59:32.279836 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 6 02:59:32.317386 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 6 02:59:32.347245 systemd-tmpfiles[1216]: ACLs are not supported, ignoring.
Mar 6 02:59:32.348224 systemd-tmpfiles[1216]: ACLs are not supported, ignoring.
Mar 6 02:59:32.359423 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 6 02:59:32.366697 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 6 02:59:32.374477 kernel: loop2: detected capacity change from 0 to 128560
Mar 6 02:59:32.464506 kernel: loop3: detected capacity change from 0 to 110984
Mar 6 02:59:32.556519 kernel: loop4: detected capacity change from 0 to 50736
Mar 6 02:59:32.592951 kernel: loop5: detected capacity change from 0 to 217752
Mar 6 02:59:32.634479 kernel: loop6: detected capacity change from 0 to 128560
Mar 6 02:59:32.690489 kernel: loop7: detected capacity change from 0 to 110984
Mar 6 02:59:32.741009 (sd-merge)[1226]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-gce'.
Mar 6 02:59:32.743113 (sd-merge)[1226]: Merged extensions into '/usr'.
Mar 6 02:59:32.762619 systemd[1]: Reload requested from client PID 1181 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 6 02:59:32.762642 systemd[1]: Reloading...
Mar 6 02:59:32.947476 zram_generator::config[1251]: No configuration found.
Mar 6 02:59:33.212302 ldconfig[1174]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 6 02:59:33.416880 systemd[1]: Reloading finished in 652 ms.
Mar 6 02:59:33.431237 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 6 02:59:33.435106 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 6 02:59:33.453337 systemd[1]: Starting ensure-sysext.service...
Mar 6 02:59:33.465042 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 6 02:59:33.493992 systemd[1]: Reload requested from client PID 1292 ('systemctl') (unit ensure-sysext.service)...
Mar 6 02:59:33.494015 systemd[1]: Reloading...
Mar 6 02:59:33.520480 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Mar 6 02:59:33.520538 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Mar 6 02:59:33.521041 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 6 02:59:33.521591 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 6 02:59:33.527148 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 6 02:59:33.527714 systemd-tmpfiles[1293]: ACLs are not supported, ignoring.
Mar 6 02:59:33.527846 systemd-tmpfiles[1293]: ACLs are not supported, ignoring.
Mar 6 02:59:33.537784 systemd-tmpfiles[1293]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 02:59:33.537809 systemd-tmpfiles[1293]: Skipping /boot
Mar 6 02:59:33.568283 systemd-tmpfiles[1293]: Detected autofs mount point /boot during canonicalization of boot.
Mar 6 02:59:33.568503 systemd-tmpfiles[1293]: Skipping /boot
Mar 6 02:59:33.632466 zram_generator::config[1318]: No configuration found.
Mar 6 02:59:33.868812 systemd[1]: Reloading finished in 374 ms.
Mar 6 02:59:33.894493 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 6 02:59:33.917563 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 6 02:59:33.931223 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 6 02:59:33.947377 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 6 02:59:33.952254 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 6 02:59:33.960062 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 6 02:59:33.968930 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 6 02:59:33.977618 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 6 02:59:33.988280 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 6 02:59:33.988644 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 6 02:59:33.993249 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 6 02:59:34.000578 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 6 02:59:34.005185 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 6 02:59:34.007918 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 6 02:59:34.008127 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 6 02:59:34.023696 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 6 02:59:34.027545 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 6 02:59:34.031164 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 6 02:59:34.032128 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 6 02:59:34.051508 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 6 02:59:34.051918 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 6 02:59:34.056348 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Mar 6 02:59:34.058282 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 6 02:59:34.058509 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 6 02:59:34.058691 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 6 02:59:34.061520 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 6 02:59:34.081745 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 6 02:59:34.082448 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 6 02:59:34.093941 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 6 02:59:34.099697 systemd[1]: Starting setup-oem.service - Setup OEM... Mar 6 02:59:34.100523 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 6 02:59:34.100762 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 6 02:59:34.101083 systemd[1]: Reached target time-set.target - System Time Set. Mar 6 02:59:34.101291 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 6 02:59:34.112903 systemd[1]: Finished ensure-sysext.service. Mar 6 02:59:34.135383 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
Mar 6 02:59:34.142290 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 6 02:59:34.149726 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 6 02:59:34.150060 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 6 02:59:34.153962 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 6 02:59:34.160730 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 6 02:59:34.166213 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 6 02:59:34.167565 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 6 02:59:34.171301 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 6 02:59:34.172839 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 6 02:59:34.188894 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 6 02:59:34.197153 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 6 02:59:34.204822 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 6 02:59:34.224080 systemd-udevd[1365]: Using default interface naming scheme 'v255'. Mar 6 02:59:34.238556 systemd[1]: Finished setup-oem.service - Setup OEM. Mar 6 02:59:34.247735 systemd[1]: Starting oem-gce-enable-oslogin.service - Enable GCE OS Login... Mar 6 02:59:34.251689 augenrules[1411]: No rules Mar 6 02:59:34.255045 systemd[1]: audit-rules.service: Deactivated successfully. Mar 6 02:59:34.256053 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 6 02:59:34.261014 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 6 02:59:34.296197 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Mar 6 02:59:34.299943 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 6 02:59:34.329637 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 6 02:59:34.341243 systemd[1]: Finished oem-gce-enable-oslogin.service - Enable GCE OS Login. Mar 6 02:59:34.363299 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 6 02:59:34.411575 systemd-resolved[1364]: Positive Trust Anchors: Mar 6 02:59:34.411607 systemd-resolved[1364]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 6 02:59:34.411685 systemd-resolved[1364]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 6 02:59:34.432414 systemd-resolved[1364]: Defaulting to hostname 'linux'. Mar 6 02:59:34.443373 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 6 02:59:34.453810 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 6 02:59:34.464736 systemd[1]: Reached target sysinit.target - System Initialization. Mar 6 02:59:34.474853 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 6 02:59:34.485738 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. 
Mar 6 02:59:34.496661 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Mar 6 02:59:34.506974 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 6 02:59:34.516873 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 6 02:59:34.527639 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 6 02:59:34.538633 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 6 02:59:34.538694 systemd[1]: Reached target paths.target - Path Units. Mar 6 02:59:34.546636 systemd[1]: Reached target timers.target - Timer Units. Mar 6 02:59:34.557563 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 6 02:59:34.570325 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 6 02:59:34.583383 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 6 02:59:34.594906 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 6 02:59:34.605657 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 6 02:59:34.627595 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 6 02:59:34.636760 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 6 02:59:34.681712 systemd-networkd[1442]: lo: Link UP Mar 6 02:59:34.681731 systemd-networkd[1442]: lo: Gained carrier Mar 6 02:59:34.685303 systemd-networkd[1442]: Enumeration completed Mar 6 02:59:34.724344 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 6 02:59:34.733905 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 6 02:59:34.751967 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Mar 6 02:59:34.755222 systemd[1]: Condition check resulted in dev-tpmrm0.device - /dev/tpmrm0 being skipped. Mar 6 02:59:34.756153 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Mar 6 02:59:34.756319 systemd-networkd[1442]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 6 02:59:34.758913 systemd-networkd[1442]: eth0: Link UP Mar 6 02:59:34.759161 systemd-networkd[1442]: eth0: Gained carrier Mar 6 02:59:34.759198 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 6 02:59:34.761079 systemd[1]: Reached target network.target - Network. Mar 6 02:59:34.773522 kernel: mousedev: PS/2 mouse device common for all mice Mar 6 02:59:34.774916 systemd[1]: Reached target sockets.target - Socket Units. Mar 6 02:59:34.776556 systemd-networkd[1442]: eth0: Overlong DHCP hostname received, shortened from 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce.c.flatcar-212911.internal' to 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce' Mar 6 02:59:34.776605 systemd-networkd[1442]: eth0: DHCPv4 address 10.128.0.87/32, gateway 10.128.0.1 acquired from 169.254.169.254 Mar 6 02:59:34.783634 systemd[1]: Reached target basic.target - Basic System. Mar 6 02:59:34.791675 systemd[1]: Reached target tpm2.target - Trusted Platform Module. Mar 6 02:59:34.800659 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 6 02:59:34.800715 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 6 02:59:34.804220 systemd[1]: Starting containerd.service - containerd container runtime... Mar 6 02:59:34.823498 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 6 02:59:34.835741 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Mar 6 02:59:34.845254 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 6 02:59:34.857676 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 6 02:59:34.874628 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Mar 6 02:59:34.888062 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 6 02:59:34.896845 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 6 02:59:34.902764 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Mar 6 02:59:34.918869 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 6 02:59:34.935485 systemd[1]: Started ntpd.service - Network Time Service. Mar 6 02:59:34.965945 jq[1476]: false Mar 6 02:59:34.990467 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Refreshing passwd entry cache Mar 6 02:59:34.989370 oslogin_cache_refresh[1480]: Refreshing passwd entry cache Mar 6 02:59:34.994219 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 6 02:59:35.008000 extend-filesystems[1477]: Found /dev/sda6 Mar 6 02:59:35.056842 extend-filesystems[1477]: Found /dev/sda9 Mar 6 02:59:35.052308 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 6 02:59:35.063013 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Failure getting users, quitting Mar 6 02:59:35.063013 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Mar 6 02:59:35.063013 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Refreshing group entry cache Mar 6 02:59:35.063013 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Failure getting groups, quitting Mar 6 02:59:35.063013 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Mar 6 02:59:35.011880 oslogin_cache_refresh[1480]: Failure getting users, quitting Mar 6 02:59:35.063331 coreos-metadata[1473]: Mar 06 02:59:35.060 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/hostname: Attempt #1 Mar 6 02:59:35.063331 coreos-metadata[1473]: Mar 06 02:59:35.062 INFO Fetch successful Mar 6 02:59:35.011938 oslogin_cache_refresh[1480]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Mar 6 02:59:35.063898 coreos-metadata[1473]: Mar 06 02:59:35.062 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/access-configs/0/external-ip: Attempt #1 Mar 6 02:59:35.012019 oslogin_cache_refresh[1480]: Refreshing group entry cache Mar 6 02:59:35.013911 oslogin_cache_refresh[1480]: Failure getting groups, quitting Mar 6 02:59:35.013930 oslogin_cache_refresh[1480]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
Mar 6 02:59:35.064552 extend-filesystems[1477]: Checking size of /dev/sda9
Mar 6 02:59:35.081832 coreos-metadata[1473]: Mar 06 02:59:35.064 INFO Fetch successful
Mar 6 02:59:35.081832 coreos-metadata[1473]: Mar 06 02:59:35.064 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/ip: Attempt #1
Mar 6 02:59:35.081832 coreos-metadata[1473]: Mar 06 02:59:35.064 INFO Fetch successful
Mar 6 02:59:35.081832 coreos-metadata[1473]: Mar 06 02:59:35.064 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/machine-type: Attempt #1
Mar 6 02:59:35.081832 coreos-metadata[1473]: Mar 06 02:59:35.068 INFO Fetch successful
Mar 6 02:59:35.076720 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 6 02:59:35.090231 ntpd[1483]: ntpd 4.2.8p18@1.4062-o Thu Mar 5 21:56:02 UTC 2026 (1): Starting
Mar 6 02:59:35.090325 ntpd[1483]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: ntpd 4.2.8p18@1.4062-o Thu Mar 5 21:56:02 UTC 2026 (1): Starting
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: ----------------------------------------------------
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: ntp-4 is maintained by Network Time Foundation,
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: corporation. Support and training for ntp-4 are
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: available at https://www.nwtime.org/support
Mar 6 02:59:35.091001 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: ----------------------------------------------------
Mar 6 02:59:35.090342 ntpd[1483]: ----------------------------------------------------
Mar 6 02:59:35.090355 ntpd[1483]: ntp-4 is maintained by Network Time Foundation,
Mar 6 02:59:35.090368 ntpd[1483]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
Mar 6 02:59:35.090381 ntpd[1483]: corporation. Support and training for ntp-4 are
Mar 6 02:59:35.090393 ntpd[1483]: available at https://www.nwtime.org/support
Mar 6 02:59:35.090407 ntpd[1483]: ----------------------------------------------------
Mar 6 02:59:35.099430 kernel: ACPI: button: Power Button [PWRF]
Mar 6 02:59:35.099924 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 6 02:59:35.104151 ntpd[1483]: proto: precision = 0.086 usec (-23)
Mar 6 02:59:35.104339 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: proto: precision = 0.086 usec (-23)
Mar 6 02:59:35.119056 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr
Mar 6 02:59:35.124623 ntpd[1483]: basedate set to 2026-02-21
Mar 6 02:59:35.125250 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: basedate set to 2026-02-21
Mar 6 02:59:35.125250 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: gps base set to 2026-02-22 (week 2407)
Mar 6 02:59:35.125250 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: Listen and drop on 0 v6wildcard [::]:123
Mar 6 02:59:35.125250 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: Listen and drop on 1 v4wildcard 0.0.0.0:123
Mar 6 02:59:35.124662 ntpd[1483]: gps base set to 2026-02-22 (week 2407)
Mar 6 02:59:35.124866 ntpd[1483]: Listen and drop on 0 v6wildcard [::]:123
Mar 6 02:59:35.124913 ntpd[1483]: Listen and drop on 1 v4wildcard 0.0.0.0:123
Mar 6 02:59:35.147321 kernel: ntpd[1483]: segfault at 24 ip 000055b959b4caeb sp 00007ffe930eecf0 error 4 in ntpd[68aeb,55b959aea000+80000] likely on CPU 0 (core 0, socket 0)
Mar 6 02:59:35.147483 kernel: Code: 0f 1e fa 41 56 41 55 41 54 55 53 48 89 fb e8 8c eb f9 ff 44 8b 28 49 89 c4 e8 51 6b ff ff 48 89 c5 48 85 db 0f 84 a5 00 00 00 <0f> b7 0b 66 83 f9 02 0f 84 c0 00 00 00 66 83 f9 0a 74 32 66 85 c9
Mar 6 02:59:35.126361 ntpd[1483]: Listen normally on 2 lo 127.0.0.1:123
Mar 6 02:59:35.126695 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 6 02:59:35.147731 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: Listen normally on 2 lo 127.0.0.1:123
Mar 6 02:59:35.147731 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: Listen normally on 3 eth0 10.128.0.87:123
Mar 6 02:59:35.147731 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: Listen normally on 4 lo [::1]:123
Mar 6 02:59:35.147731 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: bind(21) AF_INET6 [fe80::4001:aff:fe80:57%2]:123 flags 0x811 failed: Cannot assign requested address
Mar 6 02:59:35.147731 ntpd[1483]: 6 Mar 02:59:35 ntpd[1483]: unable to create socket on eth0 (5) for [fe80::4001:aff:fe80:57%2]:123
Mar 6 02:59:35.126415 ntpd[1483]: Listen normally on 3 eth0 10.128.0.87:123
Mar 6 02:59:35.129504 ntpd[1483]: Listen normally on 4 lo [::1]:123
Mar 6 02:59:35.129582 ntpd[1483]: bind(21) AF_INET6 [fe80::4001:aff:fe80:57%2]:123 flags 0x811 failed: Cannot assign requested address
Mar 6 02:59:35.129619 ntpd[1483]: unable to create socket on eth0 (5) for [fe80::4001:aff:fe80:57%2]:123
Mar 6 02:59:35.152594 extend-filesystems[1477]: Resized partition /dev/sda9
Mar 6 02:59:35.253635 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input4
Mar 6 02:59:35.253716 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 3587067 blocks
Mar 6 02:59:35.253749 kernel: ACPI: button: Sleep Button [SLPF]
Mar 6 02:59:35.253777 kernel: EDAC MC: Ver: 3.0.0
Mar 6 02:59:35.254005 extend-filesystems[1516]: resize2fs 1.47.3 (8-Jul-2025)
Mar 6 02:59:35.193319 systemd-coredump[1520]: Process 1483 (ntpd) of user 0 terminated abnormally with signal 11/SEGV, processing...
Mar 6 02:59:35.245823 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 6 02:59:35.264997 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionSecurity=!tpm2).
Mar 6 02:59:35.266201 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 6 02:59:35.270500 systemd[1]: Starting update-engine.service - Update Engine...
Mar 6 02:59:35.283189 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 6 02:59:35.306052 kernel: EXT4-fs (sda9): resized filesystem to 3587067
Mar 6 02:59:35.312522 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 6 02:59:35.324150 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 6 02:59:35.324516 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 6 02:59:35.325157 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Mar 6 02:59:35.326661 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Mar 6 02:59:35.332627 extend-filesystems[1516]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 6 02:59:35.332627 extend-filesystems[1516]: old_desc_blocks = 1, new_desc_blocks = 2
Mar 6 02:59:35.332627 extend-filesystems[1516]: The filesystem on /dev/sda9 is now 3587067 (4k) blocks long.
Mar 6 02:59:35.399676 extend-filesystems[1477]: Resized filesystem in /dev/sda9
Mar 6 02:59:35.408634 update_engine[1524]: I20260306 02:59:35.354962 1524 main.cc:92] Flatcar Update Engine starting
Mar 6 02:59:35.336601 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 6 02:59:35.409112 jq[1525]: true Mar 6 02:59:35.345733 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 6 02:59:35.355612 systemd[1]: motdgen.service: Deactivated successfully. Mar 6 02:59:35.355937 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 6 02:59:35.367171 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 6 02:59:35.367556 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 6 02:59:35.475728 (ntainerd)[1532]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 6 02:59:35.492954 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM. Mar 6 02:59:35.504576 jq[1531]: true Mar 6 02:59:35.509769 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 6 02:59:35.541172 systemd[1]: Created slice system-systemd\x2dcoredump.slice - Slice /system/systemd-coredump. Mar 6 02:59:35.551931 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 6 02:59:35.558538 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 6 02:59:35.574749 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 6 02:59:35.586623 systemd[1]: Started systemd-coredump@0-1520-0.service - Process Core Dump (PID 1520/UID 0). Mar 6 02:59:35.594469 tar[1530]: linux-amd64/LICENSE Mar 6 02:59:35.600547 tar[1530]: linux-amd64/helm Mar 6 02:59:35.646839 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 6 02:59:35.767925 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Mar 6 02:59:35.824615 systemd-networkd[1442]: eth0: Gained IPv6LL Mar 6 02:59:35.840030 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 6 02:59:35.840372 systemd[1]: Reached target network-online.target - Network is Online. Mar 6 02:59:35.847646 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:59:35.852326 bash[1572]: Updated "/home/core/.ssh/authorized_keys" Mar 6 02:59:35.852759 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 6 02:59:35.861379 systemd[1]: Starting oem-gce.service - GCE Linux Agent... Mar 6 02:59:35.862575 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 6 02:59:35.874258 systemd[1]: Starting sshkeys.service... Mar 6 02:59:35.884080 dbus-daemon[1474]: [system] SELinux support is enabled Mar 6 02:59:35.884399 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 6 02:59:35.893239 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 6 02:59:35.893284 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 6 02:59:35.893413 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 6 02:59:35.894481 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 6 02:59:35.937680 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Mar 6 02:59:35.945791 dbus-daemon[1474]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1442 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 6 02:59:35.948583 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 6 02:59:35.972413 update_engine[1524]: I20260306 02:59:35.972091 1524 update_check_scheduler.cc:74] Next update check in 4m34s Mar 6 02:59:35.975529 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Mar 6 02:59:35.980687 systemd[1]: Started update-engine.service - Update Engine. Mar 6 02:59:36.012232 init.sh[1582]: + '[' -e /etc/default/instance_configs.cfg.template ']' Mar 6 02:59:36.013474 init.sh[1582]: + echo -e '[InstanceSetup]\nset_host_keys = false' Mar 6 02:59:36.014502 init.sh[1582]: + /usr/bin/google_instance_setup Mar 6 02:59:36.084059 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 6 02:59:36.181005 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Mar 6 02:59:36.274494 sshd_keygen[1521]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 6 02:59:36.282764 coreos-metadata[1585]: Mar 06 02:59:36.282 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/sshKeys: Attempt #1 Mar 6 02:59:36.288005 coreos-metadata[1585]: Mar 06 02:59:36.287 INFO Fetch failed with 404: resource not found Mar 6 02:59:36.288005 coreos-metadata[1585]: Mar 06 02:59:36.287 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/ssh-keys: Attempt #1 Mar 6 02:59:36.304130 coreos-metadata[1585]: Mar 06 02:59:36.303 INFO Fetch successful Mar 6 02:59:36.304130 coreos-metadata[1585]: Mar 06 02:59:36.303 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/block-project-ssh-keys: Attempt #1 Mar 6 02:59:36.304237 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 6 02:59:36.304770 coreos-metadata[1585]: Mar 06 02:59:36.304 INFO Fetch failed with 404: resource not found Mar 6 02:59:36.304770 coreos-metadata[1585]: Mar 06 02:59:36.304 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/sshKeys: Attempt #1 Mar 6 02:59:36.308998 coreos-metadata[1585]: Mar 06 02:59:36.307 INFO Fetch failed with 404: resource not found Mar 6 02:59:36.308998 coreos-metadata[1585]: Mar 06 02:59:36.307 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/ssh-keys: Attempt #1 Mar 6 02:59:36.311950 coreos-metadata[1585]: Mar 06 02:59:36.311 INFO Fetch successful Mar 6 02:59:36.322521 unknown[1585]: wrote ssh authorized keys file for user: core Mar 6 02:59:36.372570 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 6 02:59:36.391407 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Mar 6 02:59:36.439602 containerd[1532]: time="2026-03-06T02:59:36Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 6 02:59:36.440914 update-ssh-keys[1611]: Updated "/home/core/.ssh/authorized_keys"
Mar 6 02:59:36.460281 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 6 02:59:36.470464 containerd[1532]: time="2026-03-06T02:59:36.470355274Z" level=info msg="starting containerd" revision=4ac6c20c7bbf8177f29e46bbdc658fec02ffb8ad version=v2.0.7
Mar 6 02:59:36.472760 systemd[1]: issuegen.service: Deactivated successfully.
Mar 6 02:59:36.473547 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 6 02:59:36.484175 systemd[1]: Finished sshkeys.service.
Mar 6 02:59:36.507625 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 6 02:59:36.529006 containerd[1532]: time="2026-03-06T02:59:36.528941067Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="15.328µs"
Mar 6 02:59:36.529238 containerd[1532]: time="2026-03-06T02:59:36.529206269Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 6 02:59:36.529375 containerd[1532]: time="2026-03-06T02:59:36.529351926Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 6 02:59:36.530203 containerd[1532]: time="2026-03-06T02:59:36.530168959Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 6 02:59:36.530780 containerd[1532]: time="2026-03-06T02:59:36.530748485Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 6 02:59:36.530940 containerd[1532]: time="2026-03-06T02:59:36.530917077Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 6 02:59:36.531294 containerd[1532]: time="2026-03-06T02:59:36.531262356Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 6 02:59:36.531651 containerd[1532]: time="2026-03-06T02:59:36.531621336Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 6 02:59:36.532836 containerd[1532]: time="2026-03-06T02:59:36.532799258Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 6 02:59:36.532971 containerd[1532]: time="2026-03-06T02:59:36.532947787Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 6 02:59:36.533498 containerd[1532]: time="2026-03-06T02:59:36.533472846Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 6 02:59:36.533583 containerd[1532]: time="2026-03-06T02:59:36.533568965Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 6 02:59:36.533736 containerd[1532]: time="2026-03-06T02:59:36.533719702Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 6 02:59:36.534167 containerd[1532]: time="2026-03-06T02:59:36.534138289Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 6 02:59:36.535676 containerd[1532]: time="2026-03-06T02:59:36.535510203Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 6 02:59:36.535676 containerd[1532]: time="2026-03-06T02:59:36.535540090Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 6 02:59:36.535676 containerd[1532]: time="2026-03-06T02:59:36.535623122Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 6 02:59:36.538981 containerd[1532]: time="2026-03-06T02:59:36.538628559Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 6 02:59:36.538981 containerd[1532]: time="2026-03-06T02:59:36.538746314Z" level=info msg="metadata content store policy set" policy=shared
Mar 6 02:59:36.555583 containerd[1532]: time="2026-03-06T02:59:36.555482786Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 6 02:59:36.555721 containerd[1532]: time="2026-03-06T02:59:36.555673185Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 6 02:59:36.555834 containerd[1532]: time="2026-03-06T02:59:36.555804595Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 6 02:59:36.555892 containerd[1532]: time="2026-03-06T02:59:36.555844265Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 6 02:59:36.555892 containerd[1532]: time="2026-03-06T02:59:36.555867818Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 6 02:59:36.555892 containerd[1532]: time="2026-03-06T02:59:36.555886712Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 6 02:59:36.556012 containerd[1532]: time="2026-03-06T02:59:36.555906049Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 6 02:59:36.556012 containerd[1532]: time="2026-03-06T02:59:36.555926697Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 6 02:59:36.556012 containerd[1532]: time="2026-03-06T02:59:36.555946856Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 6 02:59:36.556012 containerd[1532]: time="2026-03-06T02:59:36.555967470Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 6 02:59:36.556012 containerd[1532]: time="2026-03-06T02:59:36.555983973Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 6 02:59:36.556012 containerd[1532]: time="2026-03-06T02:59:36.556005106Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 6 02:59:36.556243 containerd[1532]: time="2026-03-06T02:59:36.556195321Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 6 02:59:36.556243 containerd[1532]: time="2026-03-06T02:59:36.556233102Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 6 02:59:36.556336 containerd[1532]: time="2026-03-06T02:59:36.556259545Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 6 02:59:36.556336 containerd[1532]: time="2026-03-06T02:59:36.556279450Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 6 02:59:36.556336 containerd[1532]: time="2026-03-06T02:59:36.556297247Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 6 02:59:36.556336 containerd[1532]: time="2026-03-06T02:59:36.556314369Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 6 02:59:36.556573
containerd[1532]: time="2026-03-06T02:59:36.556334250Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 6 02:59:36.556573 containerd[1532]: time="2026-03-06T02:59:36.556353686Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 6 02:59:36.556573 containerd[1532]: time="2026-03-06T02:59:36.556382329Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 6 02:59:36.556573 containerd[1532]: time="2026-03-06T02:59:36.556401062Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 6 02:59:36.556573 containerd[1532]: time="2026-03-06T02:59:36.556420641Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 6 02:59:36.559615 containerd[1532]: time="2026-03-06T02:59:36.558868289Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 6 02:59:36.559615 containerd[1532]: time="2026-03-06T02:59:36.558911756Z" level=info msg="Start snapshots syncer" Mar 6 02:59:36.559763 containerd[1532]: time="2026-03-06T02:59:36.559667054Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 6 02:59:36.560157 containerd[1532]: time="2026-03-06T02:59:36.560092363Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 6 02:59:36.560384 containerd[1532]: time="2026-03-06T02:59:36.560179677Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 6 02:59:36.560384 containerd[1532]: time="2026-03-06T02:59:36.560244835Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 6 02:59:36.564084 containerd[1532]: time="2026-03-06T02:59:36.560420272Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 6 02:59:36.564185 containerd[1532]: time="2026-03-06T02:59:36.564110165Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 6 02:59:36.564185 containerd[1532]: time="2026-03-06T02:59:36.564142171Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 6 02:59:36.564185 containerd[1532]: time="2026-03-06T02:59:36.564160027Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 6 02:59:36.564326 containerd[1532]: time="2026-03-06T02:59:36.564186029Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 6 02:59:36.564326 containerd[1532]: time="2026-03-06T02:59:36.564204590Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 6 02:59:36.564326 containerd[1532]: time="2026-03-06T02:59:36.564230465Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 6 02:59:36.564326 containerd[1532]: time="2026-03-06T02:59:36.564281215Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 6 02:59:36.564326 containerd[1532]: time="2026-03-06T02:59:36.564312306Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 6 02:59:36.564526 containerd[1532]: time="2026-03-06T02:59:36.564332831Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568147101Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568203082Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568222201Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568241302Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568256557Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568277216Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568308710Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568334837Z" level=info msg="runtime interface created" Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568344390Z" level=info msg="created NRI interface" Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568358904Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568382671Z" level=info msg="Connect containerd service" Mar 6 02:59:36.569470 containerd[1532]: time="2026-03-06T02:59:36.568427337Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 6 02:59:36.576857 containerd[1532]: 
time="2026-03-06T02:59:36.576770918Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 6 02:59:36.666850 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 6 02:59:36.694884 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 6 02:59:36.704825 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 6 02:59:36.713703 systemd[1]: Reached target getty.target - Login Prompts. Mar 6 02:59:36.789386 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 6 02:59:36.802605 systemd[1]: Started sshd@0-10.128.0.87:22-20.161.92.111:51508.service - OpenSSH per-connection server daemon (20.161.92.111:51508). Mar 6 02:59:36.875959 systemd-logind[1505]: Watching system buttons on /dev/input/event2 (Power Button) Mar 6 02:59:36.876001 systemd-logind[1505]: Watching system buttons on /dev/input/event3 (Sleep Button) Mar 6 02:59:36.876034 systemd-logind[1505]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 6 02:59:36.882200 systemd-logind[1505]: New seat seat0. Mar 6 02:59:36.898532 systemd[1]: Started systemd-logind.service - User Login Management. Mar 6 02:59:36.908248 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Mar 6 02:59:36.914095 dbus-daemon[1474]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 6 02:59:36.919188 dbus-daemon[1474]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1587 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 6 02:59:36.936164 systemd[1]: Starting polkit.service - Authorization Manager... 
Mar 6 02:59:36.994536 locksmithd[1590]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 6 02:59:37.217352 containerd[1532]: time="2026-03-06T02:59:37.217237461Z" level=info msg="Start subscribing containerd event" Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217568385Z" level=info msg="Start recovering state" Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217737147Z" level=info msg="Start event monitor" Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217763763Z" level=info msg="Start cni network conf syncer for default" Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217776774Z" level=info msg="Start streaming server" Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217793333Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217808480Z" level=info msg="runtime interface starting up..." Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217819635Z" level=info msg="starting plugins..." Mar 6 02:59:37.222179 containerd[1532]: time="2026-03-06T02:59:37.217840842Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 6 02:59:37.226131 systemd-coredump[1553]: Process 1483 (ntpd) of user 0 dumped core. Module libnss_usrfiles.so.2 without build-id. Module libgcc_s.so.1 without build-id. Module ld-linux-x86-64.so.2 without build-id. Module libc.so.6 without build-id. Module libcrypto.so.3 without build-id. Module libm.so.6 without build-id. Module libcap.so.2 without build-id. Module ntpd without build-id. 
Stack trace of thread 1483: #0 0x000055b959b4caeb n/a (ntpd + 0x68aeb) #1 0x000055b959af5cdf n/a (ntpd + 0x11cdf) #2 0x000055b959af6575 n/a (ntpd + 0x12575) #3 0x000055b959af1d8a n/a (ntpd + 0xdd8a) #4 0x000055b959af35d3 n/a (ntpd + 0xf5d3) #5 0x000055b959afbfd1 n/a (ntpd + 0x17fd1) #6 0x000055b959aecc2d n/a (ntpd + 0x8c2d) #7 0x00007efc21cb716c n/a (libc.so.6 + 0x2716c) #8 0x00007efc21cb7229 __libc_start_main (libc.so.6 + 0x27229) #9 0x000055b959aecc55 n/a (ntpd + 0x8c55) ELF object binary architecture: AMD x86-64 Mar 6 02:59:37.229839 containerd[1532]: time="2026-03-06T02:59:37.227826492Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 6 02:59:37.229839 containerd[1532]: time="2026-03-06T02:59:37.227931884Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 6 02:59:37.240830 containerd[1532]: time="2026-03-06T02:59:37.233609755Z" level=info msg="containerd successfully booted in 0.798873s" Mar 6 02:59:37.232772 systemd[1]: ntpd.service: Main process exited, code=dumped, status=11/SEGV Mar 6 02:59:37.232969 systemd[1]: ntpd.service: Failed with result 'core-dump'. Mar 6 02:59:37.244222 systemd[1]: Started containerd.service - containerd container runtime. Mar 6 02:59:37.254370 systemd[1]: systemd-coredump@0-1520-0.service: Deactivated successfully. Mar 6 02:59:37.352045 systemd[1]: ntpd.service: Scheduled restart job, restart counter is at 1. Mar 6 02:59:37.360464 systemd[1]: Started ntpd.service - Network Time Service. 
Mar 6 02:59:37.416975 polkitd[1640]: Started polkitd version 126 Mar 6 02:59:37.446400 polkitd[1640]: Loading rules from directory /etc/polkit-1/rules.d Mar 6 02:59:37.449369 ntpd[1656]: ntpd 4.2.8p18@1.4062-o Thu Mar 5 21:56:02 UTC 2026 (1): Starting Mar 6 02:59:37.449862 sshd[1637]: Accepted publickey for core from 20.161.92.111 port 51508 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:37.450822 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: ntpd 4.2.8p18@1.4062-o Thu Mar 5 21:56:02 UTC 2026 (1): Starting Mar 6 02:59:37.451117 ntpd[1656]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: ---------------------------------------------------- Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: ntp-4 is maintained by Network Time Foundation, Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: corporation. 
Support and training for ntp-4 are Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: available at https://www.nwtime.org/support Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: ---------------------------------------------------- Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: proto: precision = 0.108 usec (-23) Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: basedate set to 2026-02-21 Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: gps base set to 2026-02-22 (week 2407) Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Listen and drop on 0 v6wildcard [::]:123 Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Listen normally on 2 lo 127.0.0.1:123 Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Listen normally on 3 eth0 10.128.0.87:123 Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Listen normally on 4 lo [::1]:123 Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Listen normally on 5 eth0 [fe80::4001:aff:fe80:57%2]:123 Mar 6 02:59:37.455596 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: Listening on routing socket on fd #22 for interface updates Mar 6 02:59:37.451143 ntpd[1656]: ---------------------------------------------------- Mar 6 02:59:37.451157 ntpd[1656]: ntp-4 is maintained by Network Time Foundation, Mar 6 02:59:37.451171 ntpd[1656]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Mar 6 02:59:37.451184 ntpd[1656]: corporation. 
Support and training for ntp-4 are Mar 6 02:59:37.451197 ntpd[1656]: available at https://www.nwtime.org/support Mar 6 02:59:37.451209 ntpd[1656]: ---------------------------------------------------- Mar 6 02:59:37.451714 polkitd[1640]: Loading rules from directory /run/polkit-1/rules.d Mar 6 02:59:37.451793 polkitd[1640]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Mar 6 02:59:37.452193 ntpd[1656]: proto: precision = 0.108 usec (-23) Mar 6 02:59:37.452386 polkitd[1640]: Loading rules from directory /usr/local/share/polkit-1/rules.d Mar 6 02:59:37.452424 polkitd[1640]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Mar 6 02:59:37.452510 polkitd[1640]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 6 02:59:37.452518 ntpd[1656]: basedate set to 2026-02-21 Mar 6 02:59:37.452537 ntpd[1656]: gps base set to 2026-02-22 (week 2407) Mar 6 02:59:37.452655 ntpd[1656]: Listen and drop on 0 v6wildcard [::]:123 Mar 6 02:59:37.452704 ntpd[1656]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Mar 6 02:59:37.452936 ntpd[1656]: Listen normally on 2 lo 127.0.0.1:123 Mar 6 02:59:37.452974 ntpd[1656]: Listen normally on 3 eth0 10.128.0.87:123 Mar 6 02:59:37.453015 ntpd[1656]: Listen normally on 4 lo [::1]:123 Mar 6 02:59:37.453056 ntpd[1656]: Listen normally on 5 eth0 [fe80::4001:aff:fe80:57%2]:123 Mar 6 02:59:37.453096 ntpd[1656]: Listening on routing socket on fd #22 for interface updates Mar 6 02:59:37.456154 sshd-session[1637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:37.463046 ntpd[1656]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 6 02:59:37.463483 systemd[1]: Started polkit.service - Authorization Manager. 
Mar 6 02:59:37.463082 polkitd[1640]: Finished loading, compiling and executing 2 rules Mar 6 02:59:37.464892 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 6 02:59:37.464989 ntpd[1656]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 6 02:59:37.465099 ntpd[1656]: 6 Mar 02:59:37 ntpd[1656]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Mar 6 02:59:37.466144 dbus-daemon[1474]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 6 02:59:37.468786 polkitd[1640]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 6 02:59:37.489154 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 6 02:59:37.501845 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 6 02:59:37.551807 systemd-logind[1505]: New session 1 of user core. Mar 6 02:59:37.572910 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 6 02:59:37.581255 tar[1530]: linux-amd64/README.md Mar 6 02:59:37.595233 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 6 02:59:37.597271 systemd-hostnamed[1587]: Hostname set to (transient) Mar 6 02:59:37.600281 systemd-resolved[1364]: System hostname changed to 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce'. Mar 6 02:59:37.625653 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 6 02:59:37.628193 (systemd)[1670]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 6 02:59:37.642286 systemd-logind[1505]: New session c1 of user core. Mar 6 02:59:37.925319 instance-setup[1591]: INFO Running google_set_multiqueue. Mar 6 02:59:37.965750 systemd[1670]: Queued start job for default target default.target. Mar 6 02:59:37.970493 instance-setup[1591]: INFO Set channels for eth0 to 2. Mar 6 02:59:37.974261 systemd[1670]: Created slice app.slice - User Application Slice. 
Mar 6 02:59:37.974525 systemd[1670]: Reached target paths.target - Paths. Mar 6 02:59:37.974746 systemd[1670]: Reached target timers.target - Timers. Mar 6 02:59:37.979579 systemd[1670]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 6 02:59:37.980588 instance-setup[1591]: INFO Setting /proc/irq/31/smp_affinity_list to 0 for device virtio1. Mar 6 02:59:37.984785 instance-setup[1591]: INFO /proc/irq/31/smp_affinity_list: real affinity 0 Mar 6 02:59:37.985311 instance-setup[1591]: INFO Setting /proc/irq/32/smp_affinity_list to 0 for device virtio1. Mar 6 02:59:37.987402 instance-setup[1591]: INFO /proc/irq/32/smp_affinity_list: real affinity 0 Mar 6 02:59:37.987837 instance-setup[1591]: INFO Setting /proc/irq/33/smp_affinity_list to 1 for device virtio1. Mar 6 02:59:37.989597 instance-setup[1591]: INFO /proc/irq/33/smp_affinity_list: real affinity 1 Mar 6 02:59:37.989915 instance-setup[1591]: INFO Setting /proc/irq/34/smp_affinity_list to 1 for device virtio1. Mar 6 02:59:37.993217 instance-setup[1591]: INFO /proc/irq/34/smp_affinity_list: real affinity 1 Mar 6 02:59:38.004651 instance-setup[1591]: INFO /usr/sbin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Mar 6 02:59:38.009990 systemd[1670]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 6 02:59:38.010173 systemd[1670]: Reached target sockets.target - Sockets. Mar 6 02:59:38.010854 systemd[1670]: Reached target basic.target - Basic System. Mar 6 02:59:38.010937 systemd[1670]: Reached target default.target - Main User Target. Mar 6 02:59:38.010989 systemd[1670]: Startup finished in 353ms. Mar 6 02:59:38.011235 systemd[1]: Started user@500.service - User Manager for UID 500. 
Mar 6 02:59:38.015741 instance-setup[1591]: INFO /usr/sbin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Mar 6 02:59:38.017992 instance-setup[1591]: INFO Queue 0 XPS=1 for /sys/class/net/eth0/queues/tx-0/xps_cpus Mar 6 02:59:38.018060 instance-setup[1591]: INFO Queue 1 XPS=2 for /sys/class/net/eth0/queues/tx-1/xps_cpus Mar 6 02:59:38.029766 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 6 02:59:38.051124 init.sh[1582]: + /usr/bin/google_metadata_script_runner --script-type startup Mar 6 02:59:38.171964 systemd[1]: Started sshd@1-10.128.0.87:22-20.161.92.111:51520.service - OpenSSH per-connection server daemon (20.161.92.111:51520). Mar 6 02:59:38.278017 startup-script[1711]: INFO Starting startup scripts. Mar 6 02:59:38.284676 startup-script[1711]: INFO No startup scripts found in metadata. Mar 6 02:59:38.284778 startup-script[1711]: INFO Finished running startup scripts. Mar 6 02:59:38.311297 init.sh[1582]: + trap 'stopping=1 ; kill "${daemon_pids[@]}" || :' SIGTERM Mar 6 02:59:38.311601 init.sh[1582]: + daemon_pids=() Mar 6 02:59:38.311601 init.sh[1582]: + for d in accounts clock_skew network Mar 6 02:59:38.311754 init.sh[1582]: + daemon_pids+=($!) Mar 6 02:59:38.311848 init.sh[1582]: + for d in accounts clock_skew network Mar 6 02:59:38.312408 init.sh[1582]: + daemon_pids+=($!) Mar 6 02:59:38.312408 init.sh[1582]: + for d in accounts clock_skew network Mar 6 02:59:38.312540 init.sh[1720]: + /usr/bin/google_clock_skew_daemon Mar 6 02:59:38.313095 init.sh[1719]: + /usr/bin/google_accounts_daemon Mar 6 02:59:38.313398 init.sh[1721]: + /usr/bin/google_network_daemon Mar 6 02:59:38.313721 init.sh[1582]: + daemon_pids+=($!) Mar 6 02:59:38.313721 init.sh[1582]: + NOTIFY_SOCKET=/run/systemd/notify Mar 6 02:59:38.313721 init.sh[1582]: + /usr/bin/systemd-notify --ready Mar 6 02:59:38.327723 systemd[1]: Started oem-gce.service - GCE Linux Agent. 
Mar 6 02:59:38.338331 init.sh[1582]: + wait -n 1719 1720 1721 Mar 6 02:59:38.479667 sshd[1714]: Accepted publickey for core from 20.161.92.111 port 51520 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:38.484768 sshd-session[1714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:38.498819 systemd-logind[1505]: New session 2 of user core. Mar 6 02:59:38.504091 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 6 02:59:38.607191 google-clock-skew[1720]: INFO Starting Google Clock Skew daemon. Mar 6 02:59:38.614477 sshd[1725]: Connection closed by 20.161.92.111 port 51520 Mar 6 02:59:38.613152 sshd-session[1714]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:38.627931 systemd[1]: sshd@1-10.128.0.87:22-20.161.92.111:51520.service: Deactivated successfully. Mar 6 02:59:38.634605 systemd[1]: session-2.scope: Deactivated successfully. Mar 6 02:59:38.639998 systemd-logind[1505]: Session 2 logged out. Waiting for processes to exit. Mar 6 02:59:38.643490 systemd-logind[1505]: Removed session 2. Mar 6 02:59:38.644561 google-clock-skew[1720]: INFO Clock drift token has changed: 0. Mar 6 02:59:38.665825 systemd[1]: Started sshd@2-10.128.0.87:22-20.161.92.111:51524.service - OpenSSH per-connection server daemon (20.161.92.111:51524). Mar 6 02:59:38.821331 google-networking[1721]: INFO Starting Google Networking daemon. Mar 6 02:59:38.829957 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 02:59:38.842053 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 6 02:59:38.851387 systemd[1]: Startup finished in 4.066s (kernel) + 7.546s (initrd) + 8.576s (userspace) = 20.189s. 
Mar 6 02:59:38.858320 (kubelet)[1744]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:59:38.906947 groupadd[1746]: group added to /etc/group: name=google-sudoers, GID=1000 Mar 6 02:59:38.911326 groupadd[1746]: group added to /etc/gshadow: name=google-sudoers Mar 6 02:59:38.952341 sshd[1732]: Accepted publickey for core from 20.161.92.111 port 51524 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:38.954322 sshd-session[1732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:38.964318 systemd-logind[1505]: New session 3 of user core. Mar 6 02:59:38.969674 groupadd[1746]: new group: name=google-sudoers, GID=1000 Mar 6 02:59:38.971592 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 6 02:59:39.003971 google-accounts[1719]: INFO Starting Google Accounts daemon. Mar 6 02:59:39.018597 google-accounts[1719]: WARNING OS Login not installed. Mar 6 02:59:39.020712 google-accounts[1719]: INFO Creating a new user account for 0. Mar 6 02:59:39.027106 init.sh[1759]: useradd: invalid user name '0': use --badname to ignore Mar 6 02:59:39.027965 google-accounts[1719]: WARNING Could not create user 0. Command '['useradd', '-m', '-s', '/bin/bash', '-p', '*', '0']' returned non-zero exit status 3.. Mar 6 02:59:39.077821 sshd[1756]: Connection closed by 20.161.92.111 port 51524 Mar 6 02:59:39.079739 sshd-session[1732]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:39.088190 systemd[1]: sshd@2-10.128.0.87:22-20.161.92.111:51524.service: Deactivated successfully. Mar 6 02:59:39.091972 systemd[1]: session-3.scope: Deactivated successfully. Mar 6 02:59:39.094310 systemd-logind[1505]: Session 3 logged out. Waiting for processes to exit. Mar 6 02:59:39.096742 systemd-logind[1505]: Removed session 3. Mar 6 02:59:39.001110 systemd-resolved[1364]: Clock change detected. Flushing caches. 
Mar 6 02:59:39.017811 systemd-journald[1145]: Time jumped backwards, rotating. Mar 6 02:59:39.003068 google-clock-skew[1720]: INFO Synced system time with hardware clock. Mar 6 02:59:39.602201 kubelet[1744]: E0306 02:59:39.602106 1744 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:59:39.605147 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:59:39.605424 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:59:39.606058 systemd[1]: kubelet.service: Consumed 1.258s CPU time, 257.5M memory peak. Mar 6 02:59:49.027575 systemd[1]: Started sshd@3-10.128.0.87:22-20.161.92.111:41594.service - OpenSSH per-connection server daemon (20.161.92.111:41594). Mar 6 02:59:49.285119 sshd[1773]: Accepted publickey for core from 20.161.92.111 port 41594 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:49.286640 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:49.294428 systemd-logind[1505]: New session 4 of user core. Mar 6 02:59:49.301028 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 6 02:59:49.398899 sshd[1776]: Connection closed by 20.161.92.111 port 41594 Mar 6 02:59:49.400123 sshd-session[1773]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:49.406708 systemd[1]: sshd@3-10.128.0.87:22-20.161.92.111:41594.service: Deactivated successfully. Mar 6 02:59:49.409169 systemd[1]: session-4.scope: Deactivated successfully. Mar 6 02:59:49.410362 systemd-logind[1505]: Session 4 logged out. Waiting for processes to exit. Mar 6 02:59:49.412485 systemd-logind[1505]: Removed session 4. 
Mar 6 02:59:49.449089 systemd[1]: Started sshd@4-10.128.0.87:22-20.161.92.111:41608.service - OpenSSH per-connection server daemon (20.161.92.111:41608). Mar 6 02:59:49.641572 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 6 02:59:49.646065 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 02:59:49.695436 sshd[1782]: Accepted publickey for core from 20.161.92.111 port 41608 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:49.697266 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:49.707222 systemd-logind[1505]: New session 5 of user core. Mar 6 02:59:49.714122 systemd[1]: Started session-5.scope - Session 5 of User core. Mar 6 02:59:49.802332 sshd[1788]: Connection closed by 20.161.92.111 port 41608 Mar 6 02:59:49.803198 sshd-session[1782]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:49.809921 systemd-logind[1505]: Session 5 logged out. Waiting for processes to exit. Mar 6 02:59:49.810475 systemd[1]: sshd@4-10.128.0.87:22-20.161.92.111:41608.service: Deactivated successfully. Mar 6 02:59:49.814339 systemd[1]: session-5.scope: Deactivated successfully. Mar 6 02:59:49.818843 systemd-logind[1505]: Removed session 5. Mar 6 02:59:49.856023 systemd[1]: Started sshd@5-10.128.0.87:22-20.161.92.111:41616.service - OpenSSH per-connection server daemon (20.161.92.111:41616). Mar 6 02:59:50.053053 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 6 02:59:50.064460 (kubelet)[1802]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 02:59:50.103010 sshd[1794]: Accepted publickey for core from 20.161.92.111 port 41616 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:50.106583 sshd-session[1794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:50.116452 systemd-logind[1505]: New session 6 of user core. Mar 6 02:59:50.123033 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 6 02:59:50.133276 kubelet[1802]: E0306 02:59:50.133211 1802 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 02:59:50.137695 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 02:59:50.137978 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 02:59:50.138538 systemd[1]: kubelet.service: Consumed 229ms CPU time, 110.1M memory peak. Mar 6 02:59:50.218942 sshd[1809]: Connection closed by 20.161.92.111 port 41616 Mar 6 02:59:50.220146 sshd-session[1794]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:50.225984 systemd[1]: sshd@5-10.128.0.87:22-20.161.92.111:41616.service: Deactivated successfully. Mar 6 02:59:50.228411 systemd[1]: session-6.scope: Deactivated successfully. Mar 6 02:59:50.229577 systemd-logind[1505]: Session 6 logged out. Waiting for processes to exit. Mar 6 02:59:50.231689 systemd-logind[1505]: Removed session 6. Mar 6 02:59:50.263569 systemd[1]: Started sshd@6-10.128.0.87:22-20.161.92.111:50490.service - OpenSSH per-connection server daemon (20.161.92.111:50490). 
Mar 6 02:59:50.493235 sshd[1816]: Accepted publickey for core from 20.161.92.111 port 50490 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:50.494523 sshd-session[1816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:50.502836 systemd-logind[1505]: New session 7 of user core. Mar 6 02:59:50.509071 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 6 02:59:50.584557 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 6 02:59:50.585110 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 6 02:59:50.602417 sudo[1820]: pam_unix(sudo:session): session closed for user root Mar 6 02:59:50.633991 sshd[1819]: Connection closed by 20.161.92.111 port 50490 Mar 6 02:59:50.636150 sshd-session[1816]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:50.641507 systemd[1]: sshd@6-10.128.0.87:22-20.161.92.111:50490.service: Deactivated successfully. Mar 6 02:59:50.644613 systemd[1]: session-7.scope: Deactivated successfully. Mar 6 02:59:50.647661 systemd-logind[1505]: Session 7 logged out. Waiting for processes to exit. Mar 6 02:59:50.649292 systemd-logind[1505]: Removed session 7. Mar 6 02:59:50.682152 systemd[1]: Started sshd@7-10.128.0.87:22-20.161.92.111:50494.service - OpenSSH per-connection server daemon (20.161.92.111:50494). Mar 6 02:59:50.928415 sshd[1826]: Accepted publickey for core from 20.161.92.111 port 50494 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:50.929178 sshd-session[1826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:50.938195 systemd-logind[1505]: New session 8 of user core. Mar 6 02:59:50.946090 systemd[1]: Started session-8.scope - Session 8 of User core. 
Mar 6 02:59:51.012516 sudo[1831]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 6 02:59:51.013148 sudo[1831]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 6 02:59:51.020706 sudo[1831]: pam_unix(sudo:session): session closed for user root Mar 6 02:59:51.035359 sudo[1830]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 6 02:59:51.035866 sudo[1830]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 6 02:59:51.049566 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 6 02:59:51.116859 augenrules[1853]: No rules Mar 6 02:59:51.118559 systemd[1]: audit-rules.service: Deactivated successfully. Mar 6 02:59:51.118955 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 6 02:59:51.121037 sudo[1830]: pam_unix(sudo:session): session closed for user root Mar 6 02:59:51.157490 sshd[1829]: Connection closed by 20.161.92.111 port 50494 Mar 6 02:59:51.158342 sshd-session[1826]: pam_unix(sshd:session): session closed for user core Mar 6 02:59:51.165157 systemd[1]: sshd@7-10.128.0.87:22-20.161.92.111:50494.service: Deactivated successfully. Mar 6 02:59:51.167938 systemd[1]: session-8.scope: Deactivated successfully. Mar 6 02:59:51.169270 systemd-logind[1505]: Session 8 logged out. Waiting for processes to exit. Mar 6 02:59:51.171414 systemd-logind[1505]: Removed session 8. Mar 6 02:59:51.208600 systemd[1]: Started sshd@8-10.128.0.87:22-20.161.92.111:50500.service - OpenSSH per-connection server daemon (20.161.92.111:50500). Mar 6 02:59:51.455627 sshd[1862]: Accepted publickey for core from 20.161.92.111 port 50500 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg Mar 6 02:59:51.456544 sshd-session[1862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 6 02:59:51.464622 systemd-logind[1505]: New session 9 of user core. 
Mar 6 02:59:51.473127 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 6 02:59:51.539436 sudo[1866]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 6 02:59:51.540018 sudo[1866]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 6 02:59:52.069023 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 6 02:59:52.085544 (dockerd)[1884]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 6 02:59:52.470584 dockerd[1884]: time="2026-03-06T02:59:52.469558582Z" level=info msg="Starting up" Mar 6 02:59:52.471584 dockerd[1884]: time="2026-03-06T02:59:52.471548356Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 6 02:59:52.488885 dockerd[1884]: time="2026-03-06T02:59:52.488817708Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Mar 6 02:59:52.544434 dockerd[1884]: time="2026-03-06T02:59:52.544373215Z" level=info msg="Loading containers: start." Mar 6 02:59:52.561841 kernel: Initializing XFRM netlink socket Mar 6 02:59:52.941494 systemd-networkd[1442]: docker0: Link UP Mar 6 02:59:52.950052 dockerd[1884]: time="2026-03-06T02:59:52.949923002Z" level=info msg="Loading containers: done." 
Mar 6 02:59:52.972660 dockerd[1884]: time="2026-03-06T02:59:52.972494864Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 6 02:59:52.972660 dockerd[1884]: time="2026-03-06T02:59:52.972620250Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Mar 6 02:59:52.973037 dockerd[1884]: time="2026-03-06T02:59:52.972829374Z" level=info msg="Initializing buildkit" Mar 6 02:59:53.008680 dockerd[1884]: time="2026-03-06T02:59:53.008606795Z" level=info msg="Completed buildkit initialization" Mar 6 02:59:53.019270 dockerd[1884]: time="2026-03-06T02:59:53.019177652Z" level=info msg="Daemon has completed initialization" Mar 6 02:59:53.019935 dockerd[1884]: time="2026-03-06T02:59:53.019404757Z" level=info msg="API listen on /run/docker.sock" Mar 6 02:59:53.019524 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 6 02:59:53.847834 containerd[1532]: time="2026-03-06T02:59:53.847755987Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\"" Mar 6 02:59:54.275551 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount295498033.mount: Deactivated successfully. 
Mar 6 02:59:55.786799 containerd[1532]: time="2026-03-06T02:59:55.786700081Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:55.788458 containerd[1532]: time="2026-03-06T02:59:55.788412141Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.35.2: active requests=0, bytes read=27697898" Mar 6 02:59:55.789932 containerd[1532]: time="2026-03-06T02:59:55.789861939Z" level=info msg="ImageCreate event name:\"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:55.795514 containerd[1532]: time="2026-03-06T02:59:55.795417091Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:55.796915 containerd[1532]: time="2026-03-06T02:59:55.796871230Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.35.2\" with image id \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\", repo tag \"registry.k8s.io/kube-apiserver:v1.35.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:68cdc586f13b13edb7aa30a18155be530136a39cfd5ef8672aad8ccc98f0a7f7\", size \"27693066\" in 1.949035854s" Mar 6 02:59:55.797019 containerd[1532]: time="2026-03-06T02:59:55.796926695Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.35.2\" returns image reference \"sha256:66108468ce51257077e642f2f509cd61d470029036a7954a1a47ca15b2706dda\"" Mar 6 02:59:55.798128 containerd[1532]: time="2026-03-06T02:59:55.797959216Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\"" Mar 6 02:59:57.070959 containerd[1532]: time="2026-03-06T02:59:57.070889557Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.35.2\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:57.072419 containerd[1532]: time="2026-03-06T02:59:57.072340536Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.35.2: active requests=0, bytes read=21450946" Mar 6 02:59:57.073660 containerd[1532]: time="2026-03-06T02:59:57.073574908Z" level=info msg="ImageCreate event name:\"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:57.077190 containerd[1532]: time="2026-03-06T02:59:57.077099299Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:57.078508 containerd[1532]: time="2026-03-06T02:59:57.078352267Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.35.2\" with image id \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.35.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d9784320a41dd1b155c0ad8fdb5823d60c475870f3dd23865edde36b585748f2\", size \"23142311\" in 1.28035214s" Mar 6 02:59:57.078508 containerd[1532]: time="2026-03-06T02:59:57.078395731Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.35.2\" returns image reference \"sha256:0f2dd35011c05b55a97c9304ae1d36cfd58499cc1fd3dd8ccdf6efef1144e36a\"" Mar 6 02:59:57.078925 containerd[1532]: time="2026-03-06T02:59:57.078895483Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\"" Mar 6 02:59:58.059666 containerd[1532]: time="2026-03-06T02:59:58.059589927Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:58.060968 containerd[1532]: time="2026-03-06T02:59:58.060909236Z" level=info 
msg="stop pulling image registry.k8s.io/kube-scheduler:v1.35.2: active requests=0, bytes read=15548657" Mar 6 02:59:58.062100 containerd[1532]: time="2026-03-06T02:59:58.061991792Z" level=info msg="ImageCreate event name:\"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:58.066145 containerd[1532]: time="2026-03-06T02:59:58.065849559Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:58.067700 containerd[1532]: time="2026-03-06T02:59:58.067487605Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.35.2\" with image id \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\", repo tag \"registry.k8s.io/kube-scheduler:v1.35.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:5833e2c4b779215efe7a48126c067de199e86aa5a86518693adeef16db0ff943\", size \"17240058\" in 988.454013ms" Mar 6 02:59:58.067700 containerd[1532]: time="2026-03-06T02:59:58.067532868Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.35.2\" returns image reference \"sha256:ee83c410d7938aa1752b4e79a8d51f03710b4becc23b2e095fba471049fb2914\"" Mar 6 02:59:58.068846 containerd[1532]: time="2026-03-06T02:59:58.068545426Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\"" Mar 6 02:59:59.153485 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1637755647.mount: Deactivated successfully. 
Mar 6 02:59:59.602007 containerd[1532]: time="2026-03-06T02:59:59.600725785Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.35.2: active requests=0, bytes read=25685639" Mar 6 02:59:59.602007 containerd[1532]: time="2026-03-06T02:59:59.601809626Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.35.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:59.603490 containerd[1532]: time="2026-03-06T02:59:59.603452006Z" level=info msg="ImageCreate event name:\"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 02:59:59.604527 containerd[1532]: time="2026-03-06T02:59:59.604481821Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.35.2\" with image id \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\", repo tag \"registry.k8s.io/kube-proxy:v1.35.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\", size \"25684331\" in 1.535894288s" Mar 6 02:59:59.604622 containerd[1532]: time="2026-03-06T02:59:59.604531063Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.35.2\" returns image reference \"sha256:3c471cf273e44f68c91b48985c27627d581915b9ee5e72f7227bbf2146008b5e\"" Mar 6 02:59:59.605135 containerd[1532]: time="2026-03-06T02:59:59.605098940Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\"" Mar 6 02:59:59.605342 containerd[1532]: time="2026-03-06T02:59:59.605310883Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:015265214cc874b593a7adccdcfe4ac15d2b8e9ae89881bdcd5bcb99d42e1862\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:00.025194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2518152371.mount: Deactivated successfully. Mar 6 03:00:00.153175 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Mar 6 03:00:00.159041 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 03:00:00.554499 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 03:00:00.568967 (kubelet)[2193]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 6 03:00:00.665244 kubelet[2193]: E0306 03:00:00.665123 2193 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 6 03:00:00.670636 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 6 03:00:00.671051 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 6 03:00:00.672166 systemd[1]: kubelet.service: Consumed 266ms CPU time, 108.9M memory peak. 
Mar 6 03:00:01.487606 containerd[1532]: time="2026-03-06T03:00:01.487448107Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.13.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:01.489541 containerd[1532]: time="2026-03-06T03:00:01.489195663Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.13.1: active requests=0, bytes read=23558108" Mar 6 03:00:01.491224 containerd[1532]: time="2026-03-06T03:00:01.491169207Z" level=info msg="ImageCreate event name:\"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:01.495413 containerd[1532]: time="2026-03-06T03:00:01.495333538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:01.497071 containerd[1532]: time="2026-03-06T03:00:01.497004577Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.13.1\" with image id \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\", repo tag \"registry.k8s.io/coredns/coredns:v1.13.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9b9128672209474da07c91439bf15ed704ae05ad918dd6454e5b6ae14e35fee6\", size \"23553139\" in 1.891855976s" Mar 6 03:00:01.497071 containerd[1532]: time="2026-03-06T03:00:01.497069297Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.13.1\" returns image reference \"sha256:aa5e3ebc0dfed0566805186b9e47110d8f9122291d8bad1497e78873ad291139\"" Mar 6 03:00:01.498362 containerd[1532]: time="2026-03-06T03:00:01.498101057Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Mar 6 03:00:01.860623 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3220016303.mount: Deactivated successfully. 
Mar 6 03:00:01.867552 containerd[1532]: time="2026-03-06T03:00:01.867497968Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:01.868783 containerd[1532]: time="2026-03-06T03:00:01.868706075Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=321428" Mar 6 03:00:01.870245 containerd[1532]: time="2026-03-06T03:00:01.870177686Z" level=info msg="ImageCreate event name:\"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:01.873259 containerd[1532]: time="2026-03-06T03:00:01.873193527Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:01.874459 containerd[1532]: time="2026-03-06T03:00:01.874150146Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"320448\" in 376.00485ms" Mar 6 03:00:01.874459 containerd[1532]: time="2026-03-06T03:00:01.874194342Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:cd073f4c5f6a8e9dc6f3125ba00cf60819cae95c1ec84a1f146ee4a9cf9e803f\"" Mar 6 03:00:01.875267 containerd[1532]: time="2026-03-06T03:00:01.875238389Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\"" Mar 6 03:00:02.271724 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1389298800.mount: Deactivated successfully. 
Mar 6 03:00:03.382636 containerd[1532]: time="2026-03-06T03:00:03.382550274Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.6-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:03.384250 containerd[1532]: time="2026-03-06T03:00:03.384182049Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.6-0: active requests=0, bytes read=23631199" Mar 6 03:00:03.385537 containerd[1532]: time="2026-03-06T03:00:03.385490687Z" level=info msg="ImageCreate event name:\"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:03.391789 containerd[1532]: time="2026-03-06T03:00:03.391475538Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:03.397134 containerd[1532]: time="2026-03-06T03:00:03.397078566Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.6-0\" with image id \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\", repo tag \"registry.k8s.io/etcd:3.6.6-0\", repo digest \"registry.k8s.io/etcd@sha256:60a30b5d81b2217555e2cfb9537f655b7ba97220b99c39ee2e162a7127225890\", size \"23641797\" in 1.52179876s" Mar 6 03:00:03.397134 containerd[1532]: time="2026-03-06T03:00:03.397131562Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.6-0\" returns image reference \"sha256:0a108f7189562e99793bdecab61fdf1a7c9d913af3385de9da17fb9d6ff430e2\"" Mar 6 03:00:05.974822 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 03:00:05.975131 systemd[1]: kubelet.service: Consumed 266ms CPU time, 108.9M memory peak. Mar 6 03:00:05.978724 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 03:00:06.029241 systemd[1]: Reload requested from client PID 2331 ('systemctl') (unit session-9.scope)... 
Mar 6 03:00:06.029262 systemd[1]: Reloading... Mar 6 03:00:06.203808 zram_generator::config[2379]: No configuration found. Mar 6 03:00:06.534199 systemd[1]: Reloading finished in 504 ms. Mar 6 03:00:06.611752 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 6 03:00:06.611904 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 6 03:00:06.612306 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 03:00:06.612384 systemd[1]: kubelet.service: Consumed 174ms CPU time, 98.3M memory peak. Mar 6 03:00:06.614688 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 03:00:06.930236 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 03:00:06.955688 (kubelet)[2427]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 6 03:00:07.016361 kubelet[2427]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 6 03:00:07.319564 kubelet[2427]: I0306 03:00:07.319371 2427 server.go:525] "Kubelet version" kubeletVersion="v1.35.1" Mar 6 03:00:07.319564 kubelet[2427]: I0306 03:00:07.319435 2427 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 6 03:00:07.319564 kubelet[2427]: I0306 03:00:07.319461 2427 watchdog_linux.go:95] "Systemd watchdog is not enabled" Mar 6 03:00:07.319564 kubelet[2427]: I0306 03:00:07.319472 2427 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 6 03:00:07.320323 kubelet[2427]: I0306 03:00:07.320272 2427 server.go:951] "Client rotation is on, will bootstrap in background" Mar 6 03:00:07.328796 kubelet[2427]: E0306 03:00:07.328729 2427 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.128.0.87:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.128.0.87:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Mar 6 03:00:07.330249 kubelet[2427]: I0306 03:00:07.329494 2427 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 6 03:00:07.334390 kubelet[2427]: I0306 03:00:07.334352 2427 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 6 03:00:07.340790 kubelet[2427]: I0306 03:00:07.340696 2427 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Mar 6 03:00:07.343214 kubelet[2427]: I0306 03:00:07.342623 2427 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 6 03:00:07.343214 kubelet[2427]: I0306 03:00:07.342912 2427 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 6 03:00:07.343214 kubelet[2427]: I0306 03:00:07.343158 2427 topology_manager.go:143] "Creating topology manager 
with none policy" Mar 6 03:00:07.343214 kubelet[2427]: I0306 03:00:07.343184 2427 container_manager_linux.go:308] "Creating device plugin manager" Mar 6 03:00:07.343606 kubelet[2427]: I0306 03:00:07.343336 2427 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager" Mar 6 03:00:07.345724 kubelet[2427]: I0306 03:00:07.345674 2427 state_mem.go:41] "Initialized" logger="CPUManager state memory" Mar 6 03:00:07.345977 kubelet[2427]: I0306 03:00:07.345957 2427 kubelet.go:482] "Attempting to sync node with API server" Mar 6 03:00:07.346067 kubelet[2427]: I0306 03:00:07.345983 2427 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 6 03:00:07.346067 kubelet[2427]: I0306 03:00:07.346019 2427 kubelet.go:394] "Adding apiserver pod source" Mar 6 03:00:07.346067 kubelet[2427]: I0306 03:00:07.346034 2427 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 6 03:00:07.352364 kubelet[2427]: I0306 03:00:07.352322 2427 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1" Mar 6 03:00:07.356547 kubelet[2427]: I0306 03:00:07.355945 2427 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Mar 6 03:00:07.356547 kubelet[2427]: I0306 03:00:07.356003 2427 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Mar 6 03:00:07.356547 kubelet[2427]: W0306 03:00:07.356088 2427 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 6 03:00:07.371724 kubelet[2427]: I0306 03:00:07.371688 2427 server.go:1257] "Started kubelet" Mar 6 03:00:07.374660 kubelet[2427]: I0306 03:00:07.374633 2427 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer" Mar 6 03:00:07.375553 kubelet[2427]: I0306 03:00:07.375482 2427 server.go:182] "Starting to listen" address="0.0.0.0" port=10250 Mar 6 03:00:07.377816 kubelet[2427]: I0306 03:00:07.377793 2427 server.go:317] "Adding debug handlers to kubelet server" Mar 6 03:00:07.381446 kubelet[2427]: E0306 03:00:07.378981 2427 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.128.0.87:6443/api/v1/namespaces/default/events\": dial tcp 10.128.0.87:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce.189a214a8b45fc1b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,UID:,APIVersion:v1,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,},FirstTimestamp:2026-03-06 03:00:07.371627547 +0000 UTC m=+0.409677203,LastTimestamp:2026-03-06 03:00:07.371627547 +0000 UTC m=+0.409677203,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,}" Mar 6 03:00:07.384966 kubelet[2427]: I0306 03:00:07.384867 2427 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 6 03:00:07.385077 kubelet[2427]: I0306 03:00:07.385021 2427 server_v1.go:49] "podresources" method="list" useActivePods=true Mar 6 03:00:07.385432 kubelet[2427]: I0306 03:00:07.385411 2427 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" 
Mar 6 03:00:07.385982 kubelet[2427]: I0306 03:00:07.385943 2427 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 6 03:00:07.388815 kubelet[2427]: I0306 03:00:07.388430 2427 volume_manager.go:311] "Starting Kubelet Volume Manager" Mar 6 03:00:07.388815 kubelet[2427]: I0306 03:00:07.388543 2427 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 6 03:00:07.388815 kubelet[2427]: I0306 03:00:07.388608 2427 reconciler.go:29] "Reconciler: start to sync state" Mar 6 03:00:07.389268 kubelet[2427]: E0306 03:00:07.389234 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:07.389639 kubelet[2427]: E0306 03:00:07.389594 2427 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce?timeout=10s\": dial tcp 10.128.0.87:6443: connect: connection refused" interval="200ms" Mar 6 03:00:07.391224 kubelet[2427]: I0306 03:00:07.391187 2427 factory.go:223] Registration of the systemd container factory successfully Mar 6 03:00:07.391454 kubelet[2427]: I0306 03:00:07.391430 2427 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 6 03:00:07.393315 kubelet[2427]: E0306 03:00:07.393286 2427 kubelet.go:1656] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 6 03:00:07.393577 kubelet[2427]: I0306 03:00:07.393553 2427 factory.go:223] Registration of the containerd container factory successfully Mar 6 03:00:07.398553 kubelet[2427]: I0306 03:00:07.398361 2427 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Mar 6 03:00:07.435340 kubelet[2427]: I0306 03:00:07.435277 2427 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Mar 6 03:00:07.435558 kubelet[2427]: I0306 03:00:07.435540 2427 status_manager.go:249] "Starting to sync pod status with apiserver" Mar 6 03:00:07.436572 kubelet[2427]: I0306 03:00:07.435951 2427 kubelet.go:2501] "Starting kubelet main sync loop" Mar 6 03:00:07.436572 kubelet[2427]: E0306 03:00:07.436037 2427 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 6 03:00:07.439978 kubelet[2427]: I0306 03:00:07.439955 2427 cpu_manager.go:225] "Starting" policy="none" Mar 6 03:00:07.440490 kubelet[2427]: I0306 03:00:07.440461 2427 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 6 03:00:07.440583 kubelet[2427]: I0306 03:00:07.440501 2427 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory" Mar 6 03:00:07.442994 kubelet[2427]: I0306 03:00:07.442946 2427 policy_none.go:50] "Start" Mar 6 03:00:07.442994 kubelet[2427]: I0306 03:00:07.442979 2427 memory_manager.go:187] "Starting memorymanager" policy="None" Mar 6 03:00:07.442994 kubelet[2427]: I0306 03:00:07.442997 2427 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Mar 6 03:00:07.445010 kubelet[2427]: I0306 03:00:07.444987 2427 policy_none.go:44] "Start" Mar 6 03:00:07.452397 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
Mar 6 03:00:07.468615 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 6 03:00:07.479881 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 6 03:00:07.482797 kubelet[2427]: E0306 03:00:07.482184 2427 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Mar 6 03:00:07.482797 kubelet[2427]: I0306 03:00:07.482385 2427 eviction_manager.go:194] "Eviction manager: starting control loop" Mar 6 03:00:07.482797 kubelet[2427]: I0306 03:00:07.482398 2427 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 6 03:00:07.483438 kubelet[2427]: I0306 03:00:07.483413 2427 plugin_manager.go:121] "Starting Kubelet Plugin Manager" Mar 6 03:00:07.486084 kubelet[2427]: E0306 03:00:07.486053 2427 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Mar 6 03:00:07.486190 kubelet[2427]: E0306 03:00:07.486117 2427 eviction_manager.go:297] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:07.508476 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 6 03:00:07.560132 systemd[1]: Created slice kubepods-burstable-podcd7e45805ec9f31d3ae5f036c4ff52e9.slice - libcontainer container kubepods-burstable-podcd7e45805ec9f31d3ae5f036c4ff52e9.slice. 
Mar 6 03:00:07.579308 kubelet[2427]: E0306 03:00:07.579107 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.586576 systemd[1]: Created slice kubepods-burstable-pod685af48754a2db68a85300d4b435c9f6.slice - libcontainer container kubepods-burstable-pod685af48754a2db68a85300d4b435c9f6.slice. Mar 6 03:00:07.590710 kubelet[2427]: I0306 03:00:07.590250 2427 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.590864 kubelet[2427]: E0306 03:00:07.590810 2427 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.128.0.87:6443/api/v1/nodes\": dial tcp 10.128.0.87:6443: connect: connection refused" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.590920 kubelet[2427]: E0306 03:00:07.590891 2427 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce?timeout=10s\": dial tcp 10.128.0.87:6443: connect: connection refused" interval="400ms" Mar 6 03:00:07.592523 kubelet[2427]: E0306 03:00:07.592493 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.595393 systemd[1]: Created slice kubepods-burstable-pod47bf9fb3beb350154ee96edcc6a219ac.slice - libcontainer container kubepods-burstable-pod47bf9fb3beb350154ee96edcc6a219ac.slice. 
Mar 6 03:00:07.598034 kubelet[2427]: E0306 03:00:07.598002 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689344 kubelet[2427]: I0306 03:00:07.689212 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cd7e45805ec9f31d3ae5f036c4ff52e9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"cd7e45805ec9f31d3ae5f036c4ff52e9\") " pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689344 kubelet[2427]: I0306 03:00:07.689286 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-ca-certs\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689344 kubelet[2427]: I0306 03:00:07.689317 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689344 kubelet[2427]: I0306 03:00:07.689343 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689877 kubelet[2427]: I0306 03:00:07.689379 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689877 kubelet[2427]: I0306 03:00:07.689407 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/47bf9fb3beb350154ee96edcc6a219ac-kubeconfig\") pod \"kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"47bf9fb3beb350154ee96edcc6a219ac\") " pod="kube-system/kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689877 kubelet[2427]: I0306 03:00:07.689436 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cd7e45805ec9f31d3ae5f036c4ff52e9-ca-certs\") pod \"kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"cd7e45805ec9f31d3ae5f036c4ff52e9\") " pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689877 kubelet[2427]: I0306 03:00:07.689460 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/cd7e45805ec9f31d3ae5f036c4ff52e9-k8s-certs\") pod \"kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"cd7e45805ec9f31d3ae5f036c4ff52e9\") " pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.689997 kubelet[2427]: I0306 03:00:07.689505 2427 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.797794 kubelet[2427]: I0306 03:00:07.797631 2427 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.798512 kubelet[2427]: E0306 03:00:07.798464 2427 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.128.0.87:6443/api/v1/nodes\": dial tcp 10.128.0.87:6443: connect: connection refused" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:07.885121 containerd[1532]: time="2026-03-06T03:00:07.884977694Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,Uid:cd7e45805ec9f31d3ae5f036c4ff52e9,Namespace:kube-system,Attempt:0,}" Mar 6 03:00:07.898202 containerd[1532]: time="2026-03-06T03:00:07.898138808Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,Uid:685af48754a2db68a85300d4b435c9f6,Namespace:kube-system,Attempt:0,}" Mar 6 03:00:07.901450 containerd[1532]: time="2026-03-06T03:00:07.901404520Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,Uid:47bf9fb3beb350154ee96edcc6a219ac,Namespace:kube-system,Attempt:0,}" Mar 6 03:00:07.992523 kubelet[2427]: E0306 03:00:07.992420 2427 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce?timeout=10s\": dial tcp 10.128.0.87:6443: connect: connection refused" interval="800ms" Mar 6 03:00:08.203827 kubelet[2427]: I0306 03:00:08.203486 2427 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:08.204418 kubelet[2427]: E0306 03:00:08.204306 2427 kubelet_node_status.go:106] "Unable to register node with API server" err="Post \"https://10.128.0.87:6443/api/v1/nodes\": dial tcp 10.128.0.87:6443: connect: connection refused" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:08.246200 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3724792460.mount: Deactivated successfully. 
Mar 6 03:00:08.256919 containerd[1532]: time="2026-03-06T03:00:08.256855015Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 03:00:08.261680 containerd[1532]: time="2026-03-06T03:00:08.261492477Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321348" Mar 6 03:00:08.263035 containerd[1532]: time="2026-03-06T03:00:08.262972099Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 03:00:08.264319 containerd[1532]: time="2026-03-06T03:00:08.264238048Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 03:00:08.266934 containerd[1532]: time="2026-03-06T03:00:08.266878886Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 03:00:08.268069 containerd[1532]: time="2026-03-06T03:00:08.267981487Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 6 03:00:08.269197 containerd[1532]: time="2026-03-06T03:00:08.269143693Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 6 03:00:08.270781 containerd[1532]: time="2026-03-06T03:00:08.270542624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 6 03:00:08.272797 
containerd[1532]: time="2026-03-06T03:00:08.271503752Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 383.893658ms" Mar 6 03:00:08.275843 containerd[1532]: time="2026-03-06T03:00:08.275699899Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 374.928857ms" Mar 6 03:00:08.307377 containerd[1532]: time="2026-03-06T03:00:08.306892760Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 403.286862ms" Mar 6 03:00:08.318697 containerd[1532]: time="2026-03-06T03:00:08.318628108Z" level=info msg="connecting to shim 58b1ebb96fd5239ca744b530a2dad3b781c6309c1418945a414e5e97f457900a" address="unix:///run/containerd/s/a31432aab6443fd9e511570d63fb688494491d223b76c65dc7df15e730a579a1" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:00:08.346863 containerd[1532]: time="2026-03-06T03:00:08.345921532Z" level=info msg="connecting to shim 2a61a5dd1c06d689fb6c061d4ab60f1727e1d0e09eee2d7389b086d2d15f55ac" address="unix:///run/containerd/s/7ce7081d7faa5e14034a7069477e8fbe46e70fa8feb0aeaadfe378295947990c" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:00:08.363923 containerd[1532]: time="2026-03-06T03:00:08.363686246Z" level=info msg="connecting to shim 
ac1052d97d755aafbc785b8554ee6d88a5871e3ddcea70c0a9502b19d0a10600" address="unix:///run/containerd/s/0151f8c130f7c46f85c15f0c30ae6c5ddd8e1e07be5d0159fe71d1f450e287fb" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:00:08.391216 systemd[1]: Started cri-containerd-58b1ebb96fd5239ca744b530a2dad3b781c6309c1418945a414e5e97f457900a.scope - libcontainer container 58b1ebb96fd5239ca744b530a2dad3b781c6309c1418945a414e5e97f457900a. Mar 6 03:00:08.427048 systemd[1]: Started cri-containerd-2a61a5dd1c06d689fb6c061d4ab60f1727e1d0e09eee2d7389b086d2d15f55ac.scope - libcontainer container 2a61a5dd1c06d689fb6c061d4ab60f1727e1d0e09eee2d7389b086d2d15f55ac. Mar 6 03:00:08.444320 systemd[1]: Started cri-containerd-ac1052d97d755aafbc785b8554ee6d88a5871e3ddcea70c0a9502b19d0a10600.scope - libcontainer container ac1052d97d755aafbc785b8554ee6d88a5871e3ddcea70c0a9502b19d0a10600. Mar 6 03:00:08.569191 containerd[1532]: time="2026-03-06T03:00:08.568840050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,Uid:cd7e45805ec9f31d3ae5f036c4ff52e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"58b1ebb96fd5239ca744b530a2dad3b781c6309c1418945a414e5e97f457900a\"" Mar 6 03:00:08.575909 kubelet[2427]: E0306 03:00:08.575839 2427 kubelet_pods.go:562] "Hostname for pod was too long, truncated it" podName="kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" hostnameMaxLen=63 truncatedHostname="kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7" Mar 6 03:00:08.577522 containerd[1532]: time="2026-03-06T03:00:08.577117063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,Uid:685af48754a2db68a85300d4b435c9f6,Namespace:kube-system,Attempt:0,} returns sandbox id \"2a61a5dd1c06d689fb6c061d4ab60f1727e1d0e09eee2d7389b086d2d15f55ac\"" Mar 6 03:00:08.581915 kubelet[2427]: E0306 03:00:08.581875 2427 
kubelet_pods.go:562] "Hostname for pod was too long, truncated it" podName="kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" hostnameMaxLen=63 truncatedHostname="kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df" Mar 6 03:00:08.583180 containerd[1532]: time="2026-03-06T03:00:08.583120342Z" level=info msg="CreateContainer within sandbox \"58b1ebb96fd5239ca744b530a2dad3b781c6309c1418945a414e5e97f457900a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 6 03:00:08.587161 containerd[1532]: time="2026-03-06T03:00:08.587117315Z" level=info msg="CreateContainer within sandbox \"2a61a5dd1c06d689fb6c061d4ab60f1727e1d0e09eee2d7389b086d2d15f55ac\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 6 03:00:08.591975 containerd[1532]: time="2026-03-06T03:00:08.591926489Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce,Uid:47bf9fb3beb350154ee96edcc6a219ac,Namespace:kube-system,Attempt:0,} returns sandbox id \"ac1052d97d755aafbc785b8554ee6d88a5871e3ddcea70c0a9502b19d0a10600\"" Mar 6 03:00:08.595272 containerd[1532]: time="2026-03-06T03:00:08.595179157Z" level=info msg="Container a5c85f938b028875d2ed2923e3620561326babe6098681357a26a9fd30720b67: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:08.595374 kubelet[2427]: E0306 03:00:08.595035 2427 kubelet_pods.go:562] "Hostname for pod was too long, truncated it" podName="kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" hostnameMaxLen=63 truncatedHostname="kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7" Mar 6 03:00:08.601383 containerd[1532]: time="2026-03-06T03:00:08.601265192Z" level=info msg="CreateContainer within sandbox \"ac1052d97d755aafbc785b8554ee6d88a5871e3ddcea70c0a9502b19d0a10600\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 6 03:00:08.602913 containerd[1532]: 
time="2026-03-06T03:00:08.602719571Z" level=info msg="Container 7f73c617cca1cffe508ad9a382b5d60656f9f156bc7a0cd5c9a6051be4bc4e72: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:08.609113 containerd[1532]: time="2026-03-06T03:00:08.609056261Z" level=info msg="CreateContainer within sandbox \"58b1ebb96fd5239ca744b530a2dad3b781c6309c1418945a414e5e97f457900a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"a5c85f938b028875d2ed2923e3620561326babe6098681357a26a9fd30720b67\"" Mar 6 03:00:08.610323 containerd[1532]: time="2026-03-06T03:00:08.610293748Z" level=info msg="StartContainer for \"a5c85f938b028875d2ed2923e3620561326babe6098681357a26a9fd30720b67\"" Mar 6 03:00:08.612678 containerd[1532]: time="2026-03-06T03:00:08.612640567Z" level=info msg="connecting to shim a5c85f938b028875d2ed2923e3620561326babe6098681357a26a9fd30720b67" address="unix:///run/containerd/s/a31432aab6443fd9e511570d63fb688494491d223b76c65dc7df15e730a579a1" protocol=ttrpc version=3 Mar 6 03:00:08.617842 containerd[1532]: time="2026-03-06T03:00:08.617756313Z" level=info msg="CreateContainer within sandbox \"2a61a5dd1c06d689fb6c061d4ab60f1727e1d0e09eee2d7389b086d2d15f55ac\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"7f73c617cca1cffe508ad9a382b5d60656f9f156bc7a0cd5c9a6051be4bc4e72\"" Mar 6 03:00:08.618763 containerd[1532]: time="2026-03-06T03:00:08.618718157Z" level=info msg="StartContainer for \"7f73c617cca1cffe508ad9a382b5d60656f9f156bc7a0cd5c9a6051be4bc4e72\"" Mar 6 03:00:08.620432 containerd[1532]: time="2026-03-06T03:00:08.620388134Z" level=info msg="connecting to shim 7f73c617cca1cffe508ad9a382b5d60656f9f156bc7a0cd5c9a6051be4bc4e72" address="unix:///run/containerd/s/7ce7081d7faa5e14034a7069477e8fbe46e70fa8feb0aeaadfe378295947990c" protocol=ttrpc version=3 Mar 6 03:00:08.622024 containerd[1532]: time="2026-03-06T03:00:08.621987907Z" level=info msg="Container 
700122dd352c49927d1261cfa3d0c32b7d8c680d5cd024e4f6811c875533755b: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:08.635556 containerd[1532]: time="2026-03-06T03:00:08.635501522Z" level=info msg="CreateContainer within sandbox \"ac1052d97d755aafbc785b8554ee6d88a5871e3ddcea70c0a9502b19d0a10600\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"700122dd352c49927d1261cfa3d0c32b7d8c680d5cd024e4f6811c875533755b\"" Mar 6 03:00:08.636457 containerd[1532]: time="2026-03-06T03:00:08.636419511Z" level=info msg="StartContainer for \"700122dd352c49927d1261cfa3d0c32b7d8c680d5cd024e4f6811c875533755b\"" Mar 6 03:00:08.639174 containerd[1532]: time="2026-03-06T03:00:08.639128529Z" level=info msg="connecting to shim 700122dd352c49927d1261cfa3d0c32b7d8c680d5cd024e4f6811c875533755b" address="unix:///run/containerd/s/0151f8c130f7c46f85c15f0c30ae6c5ddd8e1e07be5d0159fe71d1f450e287fb" protocol=ttrpc version=3 Mar 6 03:00:08.656378 systemd[1]: Started cri-containerd-a5c85f938b028875d2ed2923e3620561326babe6098681357a26a9fd30720b67.scope - libcontainer container a5c85f938b028875d2ed2923e3620561326babe6098681357a26a9fd30720b67. Mar 6 03:00:08.682085 systemd[1]: Started cri-containerd-7f73c617cca1cffe508ad9a382b5d60656f9f156bc7a0cd5c9a6051be4bc4e72.scope - libcontainer container 7f73c617cca1cffe508ad9a382b5d60656f9f156bc7a0cd5c9a6051be4bc4e72. Mar 6 03:00:08.695292 systemd[1]: Started cri-containerd-700122dd352c49927d1261cfa3d0c32b7d8c680d5cd024e4f6811c875533755b.scope - libcontainer container 700122dd352c49927d1261cfa3d0c32b7d8c680d5cd024e4f6811c875533755b. 
Mar 6 03:00:08.794499 kubelet[2427]: E0306 03:00:08.794397 2427 controller.go:201] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce?timeout=10s\": dial tcp 10.128.0.87:6443: connect: connection refused" interval="1.6s" Mar 6 03:00:08.830716 containerd[1532]: time="2026-03-06T03:00:08.829164628Z" level=info msg="StartContainer for \"a5c85f938b028875d2ed2923e3620561326babe6098681357a26a9fd30720b67\" returns successfully" Mar 6 03:00:08.833485 containerd[1532]: time="2026-03-06T03:00:08.833356633Z" level=info msg="StartContainer for \"7f73c617cca1cffe508ad9a382b5d60656f9f156bc7a0cd5c9a6051be4bc4e72\" returns successfully" Mar 6 03:00:08.851984 containerd[1532]: time="2026-03-06T03:00:08.851928726Z" level=info msg="StartContainer for \"700122dd352c49927d1261cfa3d0c32b7d8c680d5cd024e4f6811c875533755b\" returns successfully" Mar 6 03:00:09.010344 kubelet[2427]: I0306 03:00:09.010296 2427 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:09.466867 kubelet[2427]: E0306 03:00:09.466377 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:09.476296 kubelet[2427]: E0306 03:00:09.476251 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:09.495720 kubelet[2427]: E0306 03:00:09.495657 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:10.483806 kubelet[2427]: E0306 03:00:10.483619 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:10.486178 kubelet[2427]: E0306 03:00:10.485805 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:10.752061 kubelet[2427]: E0306 03:00:10.751593 2427 nodelease.go:50] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:10.864308 kubelet[2427]: I0306 03:00:10.864234 2427 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:10.864308 kubelet[2427]: E0306 03:00:10.864295 2427 kubelet_node_status.go:474] "Error updating node status, will retry" err="error getting node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\": node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:10.894906 kubelet[2427]: E0306 03:00:10.894843 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:10.995558 kubelet[2427]: E0306 03:00:10.995486 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:11.095871 kubelet[2427]: E0306 
03:00:11.095665 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:11.196529 kubelet[2427]: E0306 03:00:11.196467 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:11.297031 kubelet[2427]: E0306 03:00:11.296967 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:11.398256 kubelet[2427]: E0306 03:00:11.398098 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:11.490845 kubelet[2427]: E0306 03:00:11.490806 2427 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:11.499018 kubelet[2427]: E0306 03:00:11.498956 2427 kubelet_node_status.go:392] "Error getting the current node from lister" err="node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" not found" Mar 6 03:00:11.589807 kubelet[2427]: I0306 03:00:11.589457 2427 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:11.601344 kubelet[2427]: I0306 03:00:11.601270 2427 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]" Mar 6 03:00:11.601520 kubelet[2427]: I0306 03:00:11.601450 2427 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:11.607833 kubelet[2427]: I0306 03:00:11.607472 2427 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]" Mar 6 03:00:11.607833 kubelet[2427]: I0306 03:00:11.607616 2427 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:00:11.613970 kubelet[2427]: I0306 03:00:11.613668 2427 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]" Mar 6 03:00:12.355957 kubelet[2427]: I0306 03:00:12.355864 2427 apiserver.go:52] "Watching apiserver" Mar 6 03:00:12.389697 kubelet[2427]: I0306 03:00:12.389628 2427 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 6 03:00:12.931680 systemd[1]: Reload requested from client PID 2708 ('systemctl') (unit session-9.scope)... Mar 6 03:00:12.931702 systemd[1]: Reloading... Mar 6 03:00:13.102815 zram_generator::config[2755]: No configuration found. Mar 6 03:00:13.510494 systemd[1]: Reloading finished in 578 ms. Mar 6 03:00:13.550030 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 03:00:13.564509 systemd[1]: kubelet.service: Deactivated successfully. Mar 6 03:00:13.564982 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 6 03:00:13.565089 systemd[1]: kubelet.service: Consumed 980ms CPU time, 126.2M memory peak. Mar 6 03:00:13.568421 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 6 03:00:13.954041 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 6 03:00:13.967433 (kubelet)[2800]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 6 03:00:14.034934 kubelet[2800]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 6 03:00:14.052437 kubelet[2800]: I0306 03:00:14.051616 2800 server.go:525] "Kubelet version" kubeletVersion="v1.35.1"
Mar 6 03:00:14.052437 kubelet[2800]: I0306 03:00:14.051689 2800 server.go:527] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 6 03:00:14.052437 kubelet[2800]: I0306 03:00:14.051713 2800 watchdog_linux.go:95] "Systemd watchdog is not enabled"
Mar 6 03:00:14.052437 kubelet[2800]: I0306 03:00:14.051722 2800 watchdog_linux.go:138] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Mar 6 03:00:14.055089 kubelet[2800]: I0306 03:00:14.053080 2800 server.go:951] "Client rotation is on, will bootstrap in background"
Mar 6 03:00:14.055161 kubelet[2800]: I0306 03:00:14.055094 2800 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Mar 6 03:00:14.063850 kubelet[2800]: I0306 03:00:14.063714 2800 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 6 03:00:14.072738 kubelet[2800]: I0306 03:00:14.072536 2800 server.go:1418] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 6 03:00:14.077807 kubelet[2800]: I0306 03:00:14.077710 2800 server.go:775] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /"
Mar 6 03:00:14.078206 kubelet[2800]: I0306 03:00:14.078154 2800 container_manager_linux.go:272] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 6 03:00:14.080846 kubelet[2800]: I0306 03:00:14.078222 2800 container_manager_linux.go:277] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 6 03:00:14.080846 kubelet[2800]: I0306 03:00:14.079209 2800 topology_manager.go:143] "Creating topology manager with none policy"
Mar 6 03:00:14.080846 kubelet[2800]: I0306 03:00:14.079227 2800 container_manager_linux.go:308] "Creating device plugin manager"
Mar 6 03:00:14.080846 kubelet[2800]: I0306 03:00:14.079264 2800 container_manager_linux.go:317] "Creating Dynamic Resource Allocation (DRA) manager"
Mar 6 03:00:14.081182 kubelet[2800]: I0306 03:00:14.079545 2800 state_mem.go:41] "Initialized" logger="CPUManager state memory"
Mar 6 03:00:14.081182 kubelet[2800]: I0306 03:00:14.079752 2800 kubelet.go:482] "Attempting to sync node with API server"
Mar 6 03:00:14.081182 kubelet[2800]: I0306 03:00:14.079810 2800 kubelet.go:383] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 6 03:00:14.081182 kubelet[2800]: I0306 03:00:14.079841 2800 kubelet.go:394] "Adding apiserver pod source"
Mar 6 03:00:14.081182 kubelet[2800]: I0306 03:00:14.079858 2800 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 6 03:00:14.084394 kubelet[2800]: I0306 03:00:14.082295 2800 kuberuntime_manager.go:294] "Container runtime initialized" containerRuntime="containerd" version="v2.0.7" apiVersion="v1"
Mar 6 03:00:14.087834 kubelet[2800]: I0306 03:00:14.087804 2800 kubelet.go:943] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Mar 6 03:00:14.088131 kubelet[2800]: I0306 03:00:14.088003 2800 kubelet.go:970] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled"
Mar 6 03:00:14.125935 kubelet[2800]: I0306 03:00:14.125481 2800 server.go:1257] "Started kubelet"
Mar 6 03:00:14.128964 kubelet[2800]: I0306 03:00:14.128913 2800 server.go:182] "Starting to listen" address="0.0.0.0" port=10250
Mar 6 03:00:14.132343 kubelet[2800]: I0306 03:00:14.131445 2800 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 6 03:00:14.132343 kubelet[2800]: I0306 03:00:14.131635 2800 server_v1.go:49] "podresources" method="list" useActivePods=true
Mar 6 03:00:14.132343 kubelet[2800]: I0306 03:00:14.132074 2800 server.go:254] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 6 03:00:14.144264 kubelet[2800]: I0306 03:00:14.143121 2800 fs_resource_analyzer.go:69] "Starting FS ResourceAnalyzer"
Mar 6 03:00:14.152700 kubelet[2800]: I0306 03:00:14.151148 2800 server.go:317] "Adding debug handlers to kubelet server"
Mar 6 03:00:14.156604 kubelet[2800]: I0306 03:00:14.154614 2800 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 6 03:00:14.163123 kubelet[2800]: I0306 03:00:14.161213 2800 volume_manager.go:311] "Starting Kubelet Volume Manager"
Mar 6 03:00:14.165391 kubelet[2800]: I0306 03:00:14.165354 2800 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 6 03:00:14.165811 kubelet[2800]: I0306 03:00:14.165792 2800 reconciler.go:29] "Reconciler: start to sync state"
Mar 6 03:00:14.176208 kubelet[2800]: I0306 03:00:14.176178 2800 factory.go:223] Registration of the containerd container factory successfully
Mar 6 03:00:14.176624 kubelet[2800]: I0306 03:00:14.176597 2800 factory.go:223] Registration of the systemd container factory successfully
Mar 6 03:00:14.176901 kubelet[2800]: I0306 03:00:14.176871 2800 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 6 03:00:14.211290 kubelet[2800]: I0306 03:00:14.210941 2800 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4"
Mar 6 03:00:14.224368 kubelet[2800]: I0306 03:00:14.224331 2800 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6"
Mar 6 03:00:14.224626 kubelet[2800]: I0306 03:00:14.224610 2800 status_manager.go:249] "Starting to sync pod status with apiserver"
Mar 6 03:00:14.225739 kubelet[2800]: I0306 03:00:14.225620 2800 kubelet.go:2501] "Starting kubelet main sync loop"
Mar 6 03:00:14.226027 kubelet[2800]: E0306 03:00:14.225963 2800 kubelet.go:2525] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 6 03:00:14.296458 kubelet[2800]: I0306 03:00:14.296382 2800 cpu_manager.go:225] "Starting" policy="none"
Mar 6 03:00:14.296458 kubelet[2800]: I0306 03:00:14.296409 2800 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.296754 2800 state_mem.go:41] "Initialized" logger="CPUManager state checkpoint.CPUManager state memory"
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.296985 2800 state_mem.go:94] "Updated default CPUSet" logger="CPUManager state checkpoint.CPUManager state memory" cpuSet=""
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.297004 2800 state_mem.go:102] "Updated CPUSet assignments" logger="CPUManager state checkpoint.CPUManager state memory" assignments={}
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.297032 2800 policy_none.go:50] "Start"
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.297047 2800 memory_manager.go:187] "Starting memorymanager" policy="None"
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.297064 2800 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint"
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.297219 2800 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint"
Mar 6 03:00:14.298086 kubelet[2800]: I0306 03:00:14.297231 2800 policy_none.go:44] "Start"
Mar 6 03:00:14.306568 kubelet[2800]: E0306 03:00:14.306520 2800 manager.go:525] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Mar 6 03:00:14.309361 kubelet[2800]: I0306 03:00:14.309321 2800 eviction_manager.go:194] "Eviction manager: starting control loop"
Mar 6 03:00:14.309463 kubelet[2800]: I0306 03:00:14.309364 2800 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 6 03:00:14.311302 kubelet[2800]: I0306 03:00:14.311240 2800 plugin_manager.go:121] "Starting Kubelet Plugin Manager"
Mar 6 03:00:14.317391 kubelet[2800]: E0306 03:00:14.317348 2800 eviction_manager.go:272] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Mar 6 03:00:14.330553 kubelet[2800]: I0306 03:00:14.329525 2800 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.331393 kubelet[2800]: I0306 03:00:14.331355 2800 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.332785 kubelet[2800]: I0306 03:00:14.332060 2800 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.349174 kubelet[2800]: I0306 03:00:14.348612 2800 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]"
Mar 6 03:00:14.349174 kubelet[2800]: E0306 03:00:14.348742 2800 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" already exists" pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.350217 kubelet[2800]: I0306 03:00:14.349659 2800 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]"
Mar 6 03:00:14.350217 kubelet[2800]: E0306 03:00:14.349756 2800 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" already exists" pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.352008 kubelet[2800]: I0306 03:00:14.351875 2800 warnings.go:107] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters]"
Mar 6 03:00:14.352008 kubelet[2800]: E0306 03:00:14.351938 2800 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" already exists" pod="kube-system/kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367063 kubelet[2800]: I0306 03:00:14.367024 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-ca-certs\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367672 kubelet[2800]: I0306 03:00:14.367280 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-k8s-certs\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367672 kubelet[2800]: I0306 03:00:14.367324 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367672 kubelet[2800]: I0306 03:00:14.367356 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/47bf9fb3beb350154ee96edcc6a219ac-kubeconfig\") pod \"kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"47bf9fb3beb350154ee96edcc6a219ac\") " pod="kube-system/kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367672 kubelet[2800]: I0306 03:00:14.367385 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cd7e45805ec9f31d3ae5f036c4ff52e9-ca-certs\") pod \"kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"cd7e45805ec9f31d3ae5f036c4ff52e9\") " pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367932 kubelet[2800]: I0306 03:00:14.367415 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cd7e45805ec9f31d3ae5f036c4ff52e9-k8s-certs\") pod \"kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"cd7e45805ec9f31d3ae5f036c4ff52e9\") " pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367932 kubelet[2800]: I0306 03:00:14.367444 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cd7e45805ec9f31d3ae5f036c4ff52e9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"cd7e45805ec9f31d3ae5f036c4ff52e9\") " pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367932 kubelet[2800]: I0306 03:00:14.367473 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.367932 kubelet[2800]: I0306 03:00:14.367503 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/685af48754a2db68a85300d4b435c9f6-kubeconfig\") pod \"kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" (UID: \"685af48754a2db68a85300d4b435c9f6\") " pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.428361 kubelet[2800]: I0306 03:00:14.428320 2800 kubelet_node_status.go:74] "Attempting to register node" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.440284 kubelet[2800]: I0306 03:00:14.438922 2800 kubelet_node_status.go:123] "Node was previously registered" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:14.440284 kubelet[2800]: I0306 03:00:14.439029 2800 kubelet_node_status.go:77] "Successfully registered node" node="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:00:15.081130 kubelet[2800]: I0306 03:00:15.081030 2800 apiserver.go:52] "Watching apiserver"
Mar 6 03:00:15.165828 kubelet[2800]: I0306 03:00:15.165749 2800 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 6 03:00:15.359490 kubelet[2800]: I0306 03:00:15.359038 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" podStartSLOduration=4.358990357 podStartE2EDuration="4.358990357s" podCreationTimestamp="2026-03-06 03:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:00:15.345122558 +0000 UTC m=+1.370080670" watchObservedRunningTime="2026-03-06 03:00:15.358990357 +0000 UTC m=+1.383948470"
Mar 6 03:00:15.389356 kubelet[2800]: I0306 03:00:15.389253 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" podStartSLOduration=4.389229473 podStartE2EDuration="4.389229473s" podCreationTimestamp="2026-03-06 03:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:00:15.385288343 +0000 UTC m=+1.410246456" watchObservedRunningTime="2026-03-06 03:00:15.389229473 +0000 UTC m=+1.414187576"
Mar 6 03:00:15.389609 kubelet[2800]: I0306 03:00:15.389541 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" podStartSLOduration=4.3895247600000005 podStartE2EDuration="4.38952476s" podCreationTimestamp="2026-03-06 03:00:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:00:15.361993986 +0000 UTC m=+1.386952098" watchObservedRunningTime="2026-03-06 03:00:15.38952476 +0000 UTC m=+1.414482874"
Mar 6 03:00:18.235250 kubelet[2800]: I0306 03:00:18.235150 2800 kuberuntime_manager.go:2062] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 6 03:00:18.236490 containerd[1532]: time="2026-03-06T03:00:18.236398721Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 6 03:00:18.238643 kubelet[2800]: I0306 03:00:18.238596 2800 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 6 03:00:19.212826 systemd[1]: Created slice kubepods-besteffort-pod65ec1ba5_0117_4e42_b5b4_6fc9ec5199e0.slice - libcontainer container kubepods-besteffort-pod65ec1ba5_0117_4e42_b5b4_6fc9ec5199e0.slice.
Mar 6 03:00:19.310708 kubelet[2800]: I0306 03:00:19.310586 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0-xtables-lock\") pod \"kube-proxy-4xpbq\" (UID: \"65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0\") " pod="kube-system/kube-proxy-4xpbq"
Mar 6 03:00:19.312310 kubelet[2800]: I0306 03:00:19.310832 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0-lib-modules\") pod \"kube-proxy-4xpbq\" (UID: \"65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0\") " pod="kube-system/kube-proxy-4xpbq"
Mar 6 03:00:19.312310 kubelet[2800]: I0306 03:00:19.311024 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kmwl\" (UniqueName: \"kubernetes.io/projected/65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0-kube-api-access-4kmwl\") pod \"kube-proxy-4xpbq\" (UID: \"65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0\") " pod="kube-system/kube-proxy-4xpbq"
Mar 6 03:00:19.312310 kubelet[2800]: I0306 03:00:19.311374 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0-kube-proxy\") pod \"kube-proxy-4xpbq\" (UID: \"65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0\") " pod="kube-system/kube-proxy-4xpbq"
Mar 6 03:00:19.517237 systemd[1]: Created slice kubepods-besteffort-podae6e60af_7a5f_4bc0_8de2_b800d88e8f19.slice - libcontainer container kubepods-besteffort-podae6e60af_7a5f_4bc0_8de2_b800d88e8f19.slice.
Mar 6 03:00:19.529426 containerd[1532]: time="2026-03-06T03:00:19.529373422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4xpbq,Uid:65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0,Namespace:kube-system,Attempt:0,}"
Mar 6 03:00:19.561696 containerd[1532]: time="2026-03-06T03:00:19.561570964Z" level=info msg="connecting to shim 160a8eb0b45b7a925ccd100c7f392b84fd42b22b09a3813d6b5a6006821130ea" address="unix:///run/containerd/s/dccf30dfb9f149d9f2e3fac051fbecb1f1b9c18e51dc46d1c46f5a8e233cb6a8" namespace=k8s.io protocol=ttrpc version=3
Mar 6 03:00:19.605155 systemd[1]: Started cri-containerd-160a8eb0b45b7a925ccd100c7f392b84fd42b22b09a3813d6b5a6006821130ea.scope - libcontainer container 160a8eb0b45b7a925ccd100c7f392b84fd42b22b09a3813d6b5a6006821130ea.
Mar 6 03:00:19.614257 kubelet[2800]: I0306 03:00:19.614156 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ae6e60af-7a5f-4bc0-8de2-b800d88e8f19-var-lib-calico\") pod \"tigera-operator-6cf4cccc57-hmkfd\" (UID: \"ae6e60af-7a5f-4bc0-8de2-b800d88e8f19\") " pod="tigera-operator/tigera-operator-6cf4cccc57-hmkfd"
Mar 6 03:00:19.614449 kubelet[2800]: I0306 03:00:19.614314 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xl8h\" (UniqueName: \"kubernetes.io/projected/ae6e60af-7a5f-4bc0-8de2-b800d88e8f19-kube-api-access-4xl8h\") pod \"tigera-operator-6cf4cccc57-hmkfd\" (UID: \"ae6e60af-7a5f-4bc0-8de2-b800d88e8f19\") " pod="tigera-operator/tigera-operator-6cf4cccc57-hmkfd"
Mar 6 03:00:19.655677 containerd[1532]: time="2026-03-06T03:00:19.655607773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4xpbq,Uid:65ec1ba5-0117-4e42-b5b4-6fc9ec5199e0,Namespace:kube-system,Attempt:0,} returns sandbox id \"160a8eb0b45b7a925ccd100c7f392b84fd42b22b09a3813d6b5a6006821130ea\""
Mar 6 03:00:19.666904 containerd[1532]: time="2026-03-06T03:00:19.666850600Z" level=info msg="CreateContainer within sandbox \"160a8eb0b45b7a925ccd100c7f392b84fd42b22b09a3813d6b5a6006821130ea\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 6 03:00:19.684708 containerd[1532]: time="2026-03-06T03:00:19.684654441Z" level=info msg="Container ffc5989e6e1826b7d938031d1e611a697b63fd4a36f694ed416e752780204cc9: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:00:19.694897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4033404403.mount: Deactivated successfully.
Mar 6 03:00:19.704793 containerd[1532]: time="2026-03-06T03:00:19.703988161Z" level=info msg="CreateContainer within sandbox \"160a8eb0b45b7a925ccd100c7f392b84fd42b22b09a3813d6b5a6006821130ea\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ffc5989e6e1826b7d938031d1e611a697b63fd4a36f694ed416e752780204cc9\""
Mar 6 03:00:19.706801 containerd[1532]: time="2026-03-06T03:00:19.705828338Z" level=info msg="StartContainer for \"ffc5989e6e1826b7d938031d1e611a697b63fd4a36f694ed416e752780204cc9\""
Mar 6 03:00:19.712095 containerd[1532]: time="2026-03-06T03:00:19.712010961Z" level=info msg="connecting to shim ffc5989e6e1826b7d938031d1e611a697b63fd4a36f694ed416e752780204cc9" address="unix:///run/containerd/s/dccf30dfb9f149d9f2e3fac051fbecb1f1b9c18e51dc46d1c46f5a8e233cb6a8" protocol=ttrpc version=3
Mar 6 03:00:19.747098 systemd[1]: Started cri-containerd-ffc5989e6e1826b7d938031d1e611a697b63fd4a36f694ed416e752780204cc9.scope - libcontainer container ffc5989e6e1826b7d938031d1e611a697b63fd4a36f694ed416e752780204cc9.
Mar 6 03:00:19.829191 containerd[1532]: time="2026-03-06T03:00:19.829026328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-hmkfd,Uid:ae6e60af-7a5f-4bc0-8de2-b800d88e8f19,Namespace:tigera-operator,Attempt:0,}"
Mar 6 03:00:19.869717 containerd[1532]: time="2026-03-06T03:00:19.869044174Z" level=info msg="connecting to shim 53e2d49c03e7defe47dab5ed17bbfabce5ffc366aac36e29d969bd83301a0ffd" address="unix:///run/containerd/s/221e623ae73491d36a043f73cdb325777385284a5686f3412652192880aa32fc" namespace=k8s.io protocol=ttrpc version=3
Mar 6 03:00:19.880075 containerd[1532]: time="2026-03-06T03:00:19.880027254Z" level=info msg="StartContainer for \"ffc5989e6e1826b7d938031d1e611a697b63fd4a36f694ed416e752780204cc9\" returns successfully"
Mar 6 03:00:19.926350 systemd[1]: Started cri-containerd-53e2d49c03e7defe47dab5ed17bbfabce5ffc366aac36e29d969bd83301a0ffd.scope - libcontainer container 53e2d49c03e7defe47dab5ed17bbfabce5ffc366aac36e29d969bd83301a0ffd.
Mar 6 03:00:20.031109 containerd[1532]: time="2026-03-06T03:00:20.031027194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6cf4cccc57-hmkfd,Uid:ae6e60af-7a5f-4bc0-8de2-b800d88e8f19,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"53e2d49c03e7defe47dab5ed17bbfabce5ffc366aac36e29d969bd83301a0ffd\""
Mar 6 03:00:20.035351 containerd[1532]: time="2026-03-06T03:00:20.035222643Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\""
Mar 6 03:00:20.624813 kubelet[2800]: I0306 03:00:20.624027 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/kube-proxy-4xpbq" podStartSLOduration=1.624004594 podStartE2EDuration="1.624004594s" podCreationTimestamp="2026-03-06 03:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:00:20.320896758 +0000 UTC m=+6.345854871" watchObservedRunningTime="2026-03-06 03:00:20.624004594 +0000 UTC m=+6.648962705"
Mar 6 03:00:21.560841 update_engine[1524]: I20260306 03:00:21.560416 1524 update_attempter.cc:509] Updating boot flags...
Mar 6 03:00:22.129034 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2112238305.mount: Deactivated successfully.
Mar 6 03:00:24.317072 containerd[1532]: time="2026-03-06T03:00:24.316942398Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.40.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:00:24.319201 containerd[1532]: time="2026-03-06T03:00:24.318981846Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.40.7: active requests=0, bytes read=40846156"
Mar 6 03:00:24.320801 containerd[1532]: time="2026-03-06T03:00:24.320702805Z" level=info msg="ImageCreate event name:\"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:00:24.323549 containerd[1532]: time="2026-03-06T03:00:24.323194982Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:00:24.324736 containerd[1532]: time="2026-03-06T03:00:24.324235104Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.40.7\" with image id \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\", repo tag \"quay.io/tigera/operator:v1.40.7\", repo digest \"quay.io/tigera/operator@sha256:53260704fc6e638633b243729411222e01e1898647352a6e1a09cc046887973a\", size \"40842151\" in 4.288941935s"
Mar 6 03:00:24.324736 containerd[1532]: time="2026-03-06T03:00:24.324278417Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.40.7\" returns image reference \"sha256:de04da31b5feb10fd313c39b7ac72d47ce9b5b8eb06161142e2e2283059a52c2\""
Mar 6 03:00:24.333109 containerd[1532]: time="2026-03-06T03:00:24.333055690Z" level=info msg="CreateContainer within sandbox \"53e2d49c03e7defe47dab5ed17bbfabce5ffc366aac36e29d969bd83301a0ffd\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 6 03:00:24.347799 containerd[1532]: time="2026-03-06T03:00:24.344906176Z" level=info msg="Container 3d8e8883971a81e929875e59b7eb1cbd4669366570d44b8cae807b4c22248c4f: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:00:24.359699 containerd[1532]: time="2026-03-06T03:00:24.359631529Z" level=info msg="CreateContainer within sandbox \"53e2d49c03e7defe47dab5ed17bbfabce5ffc366aac36e29d969bd83301a0ffd\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3d8e8883971a81e929875e59b7eb1cbd4669366570d44b8cae807b4c22248c4f\""
Mar 6 03:00:24.361380 containerd[1532]: time="2026-03-06T03:00:24.360493202Z" level=info msg="StartContainer for \"3d8e8883971a81e929875e59b7eb1cbd4669366570d44b8cae807b4c22248c4f\""
Mar 6 03:00:24.362817 containerd[1532]: time="2026-03-06T03:00:24.362693107Z" level=info msg="connecting to shim 3d8e8883971a81e929875e59b7eb1cbd4669366570d44b8cae807b4c22248c4f" address="unix:///run/containerd/s/221e623ae73491d36a043f73cdb325777385284a5686f3412652192880aa32fc" protocol=ttrpc version=3
Mar 6 03:00:24.401017 systemd[1]: Started cri-containerd-3d8e8883971a81e929875e59b7eb1cbd4669366570d44b8cae807b4c22248c4f.scope - libcontainer container 3d8e8883971a81e929875e59b7eb1cbd4669366570d44b8cae807b4c22248c4f.
Mar 6 03:00:24.461223 containerd[1532]: time="2026-03-06T03:00:24.461079893Z" level=info msg="StartContainer for \"3d8e8883971a81e929875e59b7eb1cbd4669366570d44b8cae807b4c22248c4f\" returns successfully"
Mar 6 03:00:25.522652 kubelet[2800]: I0306 03:00:25.522173 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6cf4cccc57-hmkfd" podStartSLOduration=2.230572197 podStartE2EDuration="6.522149677s" podCreationTimestamp="2026-03-06 03:00:19 +0000 UTC" firstStartedPulling="2026-03-06 03:00:20.034298838 +0000 UTC m=+6.059256951" lastFinishedPulling="2026-03-06 03:00:24.325876325 +0000 UTC m=+10.350834431" observedRunningTime="2026-03-06 03:00:25.333420968 +0000 UTC m=+11.358379080" watchObservedRunningTime="2026-03-06 03:00:25.522149677 +0000 UTC m=+11.547107790"
Mar 6 03:00:31.882226 sudo[1866]: pam_unix(sudo:session): session closed for user root
Mar 6 03:00:31.919930 sshd[1865]: Connection closed by 20.161.92.111 port 50500
Mar 6 03:00:31.922118 sshd-session[1862]: pam_unix(sshd:session): session closed for user core
Mar 6 03:00:31.931919 systemd[1]: sshd@8-10.128.0.87:22-20.161.92.111:50500.service: Deactivated successfully.
Mar 6 03:00:31.938743 systemd[1]: session-9.scope: Deactivated successfully.
Mar 6 03:00:31.939383 systemd[1]: session-9.scope: Consumed 5.730s CPU time, 229.4M memory peak.
Mar 6 03:00:31.945784 systemd-logind[1505]: Session 9 logged out. Waiting for processes to exit.
Mar 6 03:00:31.949054 systemd-logind[1505]: Removed session 9.
Mar 6 03:00:33.708179 systemd[1]: Started sshd@9-10.128.0.87:22-80.94.95.115:26282.service - OpenSSH per-connection server daemon (80.94.95.115:26282).
Mar 6 03:00:36.894902 systemd[1]: Created slice kubepods-besteffort-pod2127933e_b432_4e17_8cc3_e2a10e385723.slice - libcontainer container kubepods-besteffort-pod2127933e_b432_4e17_8cc3_e2a10e385723.slice.
Mar 6 03:00:36.934792 kubelet[2800]: I0306 03:00:36.934602 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2127933e-b432-4e17-8cc3-e2a10e385723-tigera-ca-bundle\") pod \"calico-typha-67677bfd4b-q27ls\" (UID: \"2127933e-b432-4e17-8cc3-e2a10e385723\") " pod="calico-system/calico-typha-67677bfd4b-q27ls" Mar 6 03:00:36.934792 kubelet[2800]: I0306 03:00:36.934664 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmmz6\" (UniqueName: \"kubernetes.io/projected/2127933e-b432-4e17-8cc3-e2a10e385723-kube-api-access-xmmz6\") pod \"calico-typha-67677bfd4b-q27ls\" (UID: \"2127933e-b432-4e17-8cc3-e2a10e385723\") " pod="calico-system/calico-typha-67677bfd4b-q27ls" Mar 6 03:00:36.934792 kubelet[2800]: I0306 03:00:36.934700 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/2127933e-b432-4e17-8cc3-e2a10e385723-typha-certs\") pod \"calico-typha-67677bfd4b-q27ls\" (UID: \"2127933e-b432-4e17-8cc3-e2a10e385723\") " pod="calico-system/calico-typha-67677bfd4b-q27ls" Mar 6 03:00:37.214792 containerd[1532]: time="2026-03-06T03:00:37.213602541Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67677bfd4b-q27ls,Uid:2127933e-b432-4e17-8cc3-e2a10e385723,Namespace:calico-system,Attempt:0,}" Mar 6 03:00:37.228732 systemd[1]: Created slice kubepods-besteffort-pod8f38af03_3728_4d2d_b5a7_dd489d7289be.slice - libcontainer container kubepods-besteffort-pod8f38af03_3728_4d2d_b5a7_dd489d7289be.slice. 
Mar 6 03:00:37.238221 kubelet[2800]: I0306 03:00:37.238072 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-lib-modules\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.239925 kubelet[2800]: I0306 03:00:37.239473 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8f38af03-3728-4d2d-b5a7-dd489d7289be-node-certs\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.239925 kubelet[2800]: I0306 03:00:37.239545 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-policysync\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.239925 kubelet[2800]: I0306 03:00:37.239575 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f38af03-3728-4d2d-b5a7-dd489d7289be-tigera-ca-bundle\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.239925 kubelet[2800]: I0306 03:00:37.239604 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-xtables-lock\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.239925 kubelet[2800]: I0306 03:00:37.239639 2800 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-var-lib-calico\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.240398 kubelet[2800]: I0306 03:00:37.239671 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bpffs\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-bpffs\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.240398 kubelet[2800]: I0306 03:00:37.239705 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-flexvol-driver-host\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.241651 kubelet[2800]: I0306 03:00:37.240699 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttls\" (UniqueName: \"kubernetes.io/projected/8f38af03-3728-4d2d-b5a7-dd489d7289be-kube-api-access-2ttls\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.241651 kubelet[2800]: I0306 03:00:37.240756 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nodeproc\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-nodeproc\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.241651 kubelet[2800]: I0306 03:00:37.240922 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-cni-bin-dir\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.241651 kubelet[2800]: I0306 03:00:37.240968 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-cni-log-dir\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.241651 kubelet[2800]: I0306 03:00:37.241008 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-cni-net-dir\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.241963 kubelet[2800]: I0306 03:00:37.241033 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-var-run-calico\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.241963 kubelet[2800]: I0306 03:00:37.241060 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8f38af03-3728-4d2d-b5a7-dd489d7289be-sys-fs\") pod \"calico-node-bpq4z\" (UID: \"8f38af03-3728-4d2d-b5a7-dd489d7289be\") " pod="calico-system/calico-node-bpq4z" Mar 6 03:00:37.266413 containerd[1532]: time="2026-03-06T03:00:37.266240432Z" level=info msg="connecting to shim 04416149ecd9d0bc53b7fa7c719378ee19e88688ebde5192c9661496dfbf3305" 
address="unix:///run/containerd/s/35d9c9688d0cf83cd798dfbba5e7d7614cede13015523c993ecbfab294825ad0" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:00:37.326148 systemd[1]: Started cri-containerd-04416149ecd9d0bc53b7fa7c719378ee19e88688ebde5192c9661496dfbf3305.scope - libcontainer container 04416149ecd9d0bc53b7fa7c719378ee19e88688ebde5192c9661496dfbf3305. Mar 6 03:00:37.345071 kubelet[2800]: E0306 03:00:37.344997 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:37.348063 kubelet[2800]: E0306 03:00:37.347203 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.348063 kubelet[2800]: W0306 03:00:37.347256 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.348063 kubelet[2800]: E0306 03:00:37.347286 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.348063 kubelet[2800]: E0306 03:00:37.347970 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.348063 kubelet[2800]: W0306 03:00:37.347988 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.348063 kubelet[2800]: E0306 03:00:37.348010 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.351243 kubelet[2800]: E0306 03:00:37.351040 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.351243 kubelet[2800]: W0306 03:00:37.351083 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.351243 kubelet[2800]: E0306 03:00:37.351110 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.353675 kubelet[2800]: E0306 03:00:37.353029 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.353675 kubelet[2800]: W0306 03:00:37.353050 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.353675 kubelet[2800]: E0306 03:00:37.353071 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.358642 kubelet[2800]: E0306 03:00:37.356626 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.358642 kubelet[2800]: W0306 03:00:37.356651 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.358642 kubelet[2800]: E0306 03:00:37.356673 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.359291 kubelet[2800]: E0306 03:00:37.359268 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.359536 kubelet[2800]: W0306 03:00:37.359477 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.359905 kubelet[2800]: E0306 03:00:37.359872 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.364559 kubelet[2800]: E0306 03:00:37.363887 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.364559 kubelet[2800]: W0306 03:00:37.363909 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.364559 kubelet[2800]: E0306 03:00:37.363930 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.364559 kubelet[2800]: E0306 03:00:37.364253 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.364559 kubelet[2800]: W0306 03:00:37.364267 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.364559 kubelet[2800]: E0306 03:00:37.364290 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.367508 kubelet[2800]: E0306 03:00:37.367254 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.367508 kubelet[2800]: W0306 03:00:37.367276 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.367508 kubelet[2800]: E0306 03:00:37.367296 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.368460 kubelet[2800]: E0306 03:00:37.368382 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.368460 kubelet[2800]: W0306 03:00:37.368403 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.368460 kubelet[2800]: E0306 03:00:37.368423 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.370067 kubelet[2800]: E0306 03:00:37.369541 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.370067 kubelet[2800]: W0306 03:00:37.369561 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.370067 kubelet[2800]: E0306 03:00:37.369586 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.370597 kubelet[2800]: E0306 03:00:37.370472 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.370597 kubelet[2800]: W0306 03:00:37.370491 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.370597 kubelet[2800]: E0306 03:00:37.370510 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.372519 kubelet[2800]: E0306 03:00:37.372157 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.372519 kubelet[2800]: W0306 03:00:37.372173 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.372519 kubelet[2800]: E0306 03:00:37.372192 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.373572 kubelet[2800]: E0306 03:00:37.372932 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.373572 kubelet[2800]: W0306 03:00:37.372958 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.373572 kubelet[2800]: E0306 03:00:37.372977 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.374108 kubelet[2800]: E0306 03:00:37.374077 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.374108 kubelet[2800]: W0306 03:00:37.374107 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.374313 kubelet[2800]: E0306 03:00:37.374126 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.375915 kubelet[2800]: E0306 03:00:37.375888 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.375915 kubelet[2800]: W0306 03:00:37.375914 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.376262 kubelet[2800]: E0306 03:00:37.375933 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.378064 kubelet[2800]: E0306 03:00:37.377918 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.378064 kubelet[2800]: W0306 03:00:37.377945 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.378064 kubelet[2800]: E0306 03:00:37.377966 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.378855 kubelet[2800]: E0306 03:00:37.378274 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.378855 kubelet[2800]: W0306 03:00:37.378290 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.378855 kubelet[2800]: E0306 03:00:37.378306 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.378855 kubelet[2800]: E0306 03:00:37.378596 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.378855 kubelet[2800]: W0306 03:00:37.378609 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.378855 kubelet[2800]: E0306 03:00:37.378624 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.380135 kubelet[2800]: E0306 03:00:37.380000 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.380135 kubelet[2800]: W0306 03:00:37.380021 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.380135 kubelet[2800]: E0306 03:00:37.380043 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.380548 kubelet[2800]: E0306 03:00:37.380365 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.380548 kubelet[2800]: W0306 03:00:37.380392 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.380548 kubelet[2800]: E0306 03:00:37.380412 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.381512 kubelet[2800]: E0306 03:00:37.381486 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.381512 kubelet[2800]: W0306 03:00:37.381505 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.382376 kubelet[2800]: E0306 03:00:37.381524 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.382836 kubelet[2800]: E0306 03:00:37.382695 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.382836 kubelet[2800]: W0306 03:00:37.382725 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.382836 kubelet[2800]: E0306 03:00:37.382743 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.383640 kubelet[2800]: E0306 03:00:37.383559 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.383640 kubelet[2800]: W0306 03:00:37.383582 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.383640 kubelet[2800]: E0306 03:00:37.383599 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.384785 kubelet[2800]: E0306 03:00:37.384745 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.385380 kubelet[2800]: W0306 03:00:37.384980 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.385380 kubelet[2800]: E0306 03:00:37.385011 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.386595 kubelet[2800]: E0306 03:00:37.386409 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.386595 kubelet[2800]: W0306 03:00:37.386429 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.386595 kubelet[2800]: E0306 03:00:37.386447 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.391039 kubelet[2800]: E0306 03:00:37.391007 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.391192 kubelet[2800]: W0306 03:00:37.391047 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.391192 kubelet[2800]: E0306 03:00:37.391102 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.392398 kubelet[2800]: E0306 03:00:37.392299 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.392398 kubelet[2800]: W0306 03:00:37.392325 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.392398 kubelet[2800]: E0306 03:00:37.392345 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.393899 kubelet[2800]: E0306 03:00:37.393870 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.393899 kubelet[2800]: W0306 03:00:37.393898 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.394176 kubelet[2800]: E0306 03:00:37.393918 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.395657 kubelet[2800]: E0306 03:00:37.395630 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.395657 kubelet[2800]: W0306 03:00:37.395656 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.395844 kubelet[2800]: E0306 03:00:37.395683 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.396890 kubelet[2800]: E0306 03:00:37.396821 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.396890 kubelet[2800]: W0306 03:00:37.396842 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.396890 kubelet[2800]: E0306 03:00:37.396864 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.398137 kubelet[2800]: E0306 03:00:37.398105 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.398137 kubelet[2800]: W0306 03:00:37.398130 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.398339 kubelet[2800]: E0306 03:00:37.398149 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.399158 kubelet[2800]: E0306 03:00:37.398960 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.399158 kubelet[2800]: W0306 03:00:37.399098 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.399158 kubelet[2800]: E0306 03:00:37.399121 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.400493 kubelet[2800]: E0306 03:00:37.400457 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.400493 kubelet[2800]: W0306 03:00:37.400486 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.400635 kubelet[2800]: E0306 03:00:37.400505 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.401521 kubelet[2800]: E0306 03:00:37.401494 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.401521 kubelet[2800]: W0306 03:00:37.401518 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.401670 kubelet[2800]: E0306 03:00:37.401536 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 6 03:00:37.496092 kubelet[2800]: I0306 03:00:37.495909 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3669bc8e-a01a-47d2-a2d9-628a14d7e7f5-kubelet-dir\") pod \"csi-node-driver-2m65n\" (UID: \"3669bc8e-a01a-47d2-a2d9-628a14d7e7f5\") " pod="calico-system/csi-node-driver-2m65n"
Mar 6 03:00:37.498105 kubelet[2800]: I0306 03:00:37.497237 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3669bc8e-a01a-47d2-a2d9-628a14d7e7f5-socket-dir\") pod \"csi-node-driver-2m65n\" (UID: \"3669bc8e-a01a-47d2-a2d9-628a14d7e7f5\") " pod="calico-system/csi-node-driver-2m65n"
Mar 6 03:00:37.498795 kubelet[2800]: I0306 03:00:37.498740 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bfn4\" (UniqueName: \"kubernetes.io/projected/3669bc8e-a01a-47d2-a2d9-628a14d7e7f5-kube-api-access-9bfn4\") pod \"csi-node-driver-2m65n\" (UID: \"3669bc8e-a01a-47d2-a2d9-628a14d7e7f5\") " pod="calico-system/csi-node-driver-2m65n"
Mar 6 03:00:37.499851 kubelet[2800]: E0306 03:00:37.499531 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.501036 kubelet[2800]: E0306 03:00:37.500998 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.501036 kubelet[2800]: W0306 03:00:37.501034 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.501608 kubelet[2800]: E0306 03:00:37.501076 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.501608 kubelet[2800]: E0306 03:00:37.501469 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.501608 kubelet[2800]: W0306 03:00:37.501483 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.501608 kubelet[2800]: E0306 03:00:37.501501 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.501887 kubelet[2800]: E0306 03:00:37.501853 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.501887 kubelet[2800]: W0306 03:00:37.501867 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.501887 kubelet[2800]: E0306 03:00:37.501884 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.502036 kubelet[2800]: I0306 03:00:37.501947 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3669bc8e-a01a-47d2-a2d9-628a14d7e7f5-varrun\") pod \"csi-node-driver-2m65n\" (UID: \"3669bc8e-a01a-47d2-a2d9-628a14d7e7f5\") " pod="calico-system/csi-node-driver-2m65n" Mar 6 03:00:37.503239 kubelet[2800]: E0306 03:00:37.503143 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.503239 kubelet[2800]: W0306 03:00:37.503185 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.503239 kubelet[2800]: E0306 03:00:37.503217 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.503602 kubelet[2800]: I0306 03:00:37.503383 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3669bc8e-a01a-47d2-a2d9-628a14d7e7f5-registration-dir\") pod \"csi-node-driver-2m65n\" (UID: \"3669bc8e-a01a-47d2-a2d9-628a14d7e7f5\") " pod="calico-system/csi-node-driver-2m65n" Mar 6 03:00:37.505068 kubelet[2800]: E0306 03:00:37.505038 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.505218 kubelet[2800]: W0306 03:00:37.505064 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.505218 kubelet[2800]: E0306 03:00:37.505146 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.506277 kubelet[2800]: E0306 03:00:37.506250 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.506277 kubelet[2800]: W0306 03:00:37.506275 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.506446 kubelet[2800]: E0306 03:00:37.506295 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.507587 kubelet[2800]: E0306 03:00:37.507555 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.507587 kubelet[2800]: W0306 03:00:37.507583 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.507858 kubelet[2800]: E0306 03:00:37.507603 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.507967 kubelet[2800]: E0306 03:00:37.507946 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.508202 kubelet[2800]: W0306 03:00:37.507967 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.508202 kubelet[2800]: E0306 03:00:37.507984 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.509086 kubelet[2800]: E0306 03:00:37.509047 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.509086 kubelet[2800]: W0306 03:00:37.509084 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.509965 kubelet[2800]: E0306 03:00:37.509104 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.510040 kubelet[2800]: E0306 03:00:37.510014 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.510040 kubelet[2800]: W0306 03:00:37.510029 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.510155 kubelet[2800]: E0306 03:00:37.510048 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.547811 containerd[1532]: time="2026-03-06T03:00:37.547392211Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bpq4z,Uid:8f38af03-3728-4d2d-b5a7-dd489d7289be,Namespace:calico-system,Attempt:0,}" Mar 6 03:00:37.585435 containerd[1532]: time="2026-03-06T03:00:37.585264712Z" level=info msg="connecting to shim 7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3" address="unix:///run/containerd/s/d5a22d18ef363f2a2c2aa1791f6b7129ec86ecca9d5cc0382c7937a24e239493" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:00:37.605793 kubelet[2800]: E0306 03:00:37.605648 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.605793 kubelet[2800]: W0306 03:00:37.605677 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.605793 kubelet[2800]: E0306 03:00:37.605707 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.606507 kubelet[2800]: E0306 03:00:37.606391 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.606507 kubelet[2800]: W0306 03:00:37.606412 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.606507 kubelet[2800]: E0306 03:00:37.606431 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.606901 kubelet[2800]: E0306 03:00:37.606880 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.606901 kubelet[2800]: W0306 03:00:37.606902 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.607107 kubelet[2800]: E0306 03:00:37.606921 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.607369 kubelet[2800]: E0306 03:00:37.607343 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.607369 kubelet[2800]: W0306 03:00:37.607363 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.607646 kubelet[2800]: E0306 03:00:37.607500 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.608295 kubelet[2800]: E0306 03:00:37.608248 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.608295 kubelet[2800]: W0306 03:00:37.608272 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.608295 kubelet[2800]: E0306 03:00:37.608290 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.609037 kubelet[2800]: E0306 03:00:37.609006 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.609037 kubelet[2800]: W0306 03:00:37.609031 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.609374 kubelet[2800]: E0306 03:00:37.609049 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.609718 kubelet[2800]: E0306 03:00:37.609695 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.609718 kubelet[2800]: W0306 03:00:37.609714 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.609983 kubelet[2800]: E0306 03:00:37.609921 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.610703 kubelet[2800]: E0306 03:00:37.610612 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.610703 kubelet[2800]: W0306 03:00:37.610634 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.610703 kubelet[2800]: E0306 03:00:37.610654 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.611151 kubelet[2800]: E0306 03:00:37.611083 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.611151 kubelet[2800]: W0306 03:00:37.611106 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.611151 kubelet[2800]: E0306 03:00:37.611124 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.611655 kubelet[2800]: E0306 03:00:37.611615 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.611655 kubelet[2800]: W0306 03:00:37.611638 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.611655 kubelet[2800]: E0306 03:00:37.611655 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.612060 kubelet[2800]: E0306 03:00:37.612009 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.612060 kubelet[2800]: W0306 03:00:37.612023 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.612060 kubelet[2800]: E0306 03:00:37.612039 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.612410 kubelet[2800]: E0306 03:00:37.612378 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.612491 kubelet[2800]: W0306 03:00:37.612413 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.612491 kubelet[2800]: E0306 03:00:37.612431 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.612885 kubelet[2800]: E0306 03:00:37.612856 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.612885 kubelet[2800]: W0306 03:00:37.612873 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.612996 kubelet[2800]: E0306 03:00:37.612891 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.613427 kubelet[2800]: E0306 03:00:37.613363 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.613427 kubelet[2800]: W0306 03:00:37.613386 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.613427 kubelet[2800]: E0306 03:00:37.613404 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.613934 kubelet[2800]: E0306 03:00:37.613908 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.613934 kubelet[2800]: W0306 03:00:37.613931 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.613934 kubelet[2800]: E0306 03:00:37.613948 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.614402 kubelet[2800]: E0306 03:00:37.614285 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.614402 kubelet[2800]: W0306 03:00:37.614302 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.614402 kubelet[2800]: E0306 03:00:37.614318 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.614893 kubelet[2800]: E0306 03:00:37.614853 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.615004 kubelet[2800]: W0306 03:00:37.614874 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.615004 kubelet[2800]: E0306 03:00:37.614928 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.615503 kubelet[2800]: E0306 03:00:37.615478 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.615503 kubelet[2800]: W0306 03:00:37.615496 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.615652 kubelet[2800]: E0306 03:00:37.615513 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.616083 kubelet[2800]: E0306 03:00:37.616050 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.616083 kubelet[2800]: W0306 03:00:37.616081 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.616213 kubelet[2800]: E0306 03:00:37.616099 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.616633 kubelet[2800]: E0306 03:00:37.616610 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.616633 kubelet[2800]: W0306 03:00:37.616632 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.616820 kubelet[2800]: E0306 03:00:37.616649 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.617102 kubelet[2800]: E0306 03:00:37.617058 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.617102 kubelet[2800]: W0306 03:00:37.617090 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.617220 kubelet[2800]: E0306 03:00:37.617108 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.617475 kubelet[2800]: E0306 03:00:37.617455 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.617475 kubelet[2800]: W0306 03:00:37.617473 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.617612 kubelet[2800]: E0306 03:00:37.617490 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.617945 kubelet[2800]: E0306 03:00:37.617921 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.617945 kubelet[2800]: W0306 03:00:37.617944 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.618066 kubelet[2800]: E0306 03:00:37.617961 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.618389 kubelet[2800]: E0306 03:00:37.618368 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.618389 kubelet[2800]: W0306 03:00:37.618387 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.618514 kubelet[2800]: E0306 03:00:37.618403 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.618784 kubelet[2800]: E0306 03:00:37.618746 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.618856 kubelet[2800]: W0306 03:00:37.618779 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.618856 kubelet[2800]: E0306 03:00:37.618807 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:37.645044 kubelet[2800]: E0306 03:00:37.644952 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:37.645044 kubelet[2800]: W0306 03:00:37.644987 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:37.645044 kubelet[2800]: E0306 03:00:37.645016 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:37.654282 containerd[1532]: time="2026-03-06T03:00:37.654058710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67677bfd4b-q27ls,Uid:2127933e-b432-4e17-8cc3-e2a10e385723,Namespace:calico-system,Attempt:0,} returns sandbox id \"04416149ecd9d0bc53b7fa7c719378ee19e88688ebde5192c9661496dfbf3305\"" Mar 6 03:00:37.661968 containerd[1532]: time="2026-03-06T03:00:37.661450058Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\"" Mar 6 03:00:37.675046 systemd[1]: Started cri-containerd-7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3.scope - libcontainer container 7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3. Mar 6 03:00:37.722304 containerd[1532]: time="2026-03-06T03:00:37.722077954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bpq4z,Uid:8f38af03-3728-4d2d-b5a7-dd489d7289be,Namespace:calico-system,Attempt:0,} returns sandbox id \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\"" Mar 6 03:00:38.712174 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1221202084.mount: Deactivated successfully. 
Mar 6 03:00:39.227199 kubelet[2800]: E0306 03:00:39.227139 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:39.981406 containerd[1532]: time="2026-03-06T03:00:39.981330868Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:39.983366 containerd[1532]: time="2026-03-06T03:00:39.983108306Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.31.4: active requests=0, bytes read=36107596" Mar 6 03:00:39.985086 containerd[1532]: time="2026-03-06T03:00:39.985036116Z" level=info msg="ImageCreate event name:\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:39.988970 containerd[1532]: time="2026-03-06T03:00:39.988862396Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:39.989879 containerd[1532]: time="2026-03-06T03:00:39.989762175Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.31.4\" with image id \"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d9396cfcd63dfcf72a65903042e473bb0bafc0cceb56bd71cd84078498a87130\", size \"36107450\" in 2.328058387s" Mar 6 03:00:39.989879 containerd[1532]: time="2026-03-06T03:00:39.989829313Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.31.4\" returns image reference 
\"sha256:46766605472b59b9c16342b2cc74da11f598baa9ba6d1e8b07b3f8ab4f29c55b\"" Mar 6 03:00:39.994797 containerd[1532]: time="2026-03-06T03:00:39.994658464Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\"" Mar 6 03:00:40.021276 containerd[1532]: time="2026-03-06T03:00:40.020609041Z" level=info msg="CreateContainer within sandbox \"04416149ecd9d0bc53b7fa7c719378ee19e88688ebde5192c9661496dfbf3305\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 6 03:00:40.037609 containerd[1532]: time="2026-03-06T03:00:40.037550378Z" level=info msg="Container db2cb372e8e03ccda95fbccb90e2a1f1f9cb53999dc3c71a30171ef4c6ff4cb2: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:40.051522 containerd[1532]: time="2026-03-06T03:00:40.051379698Z" level=info msg="CreateContainer within sandbox \"04416149ecd9d0bc53b7fa7c719378ee19e88688ebde5192c9661496dfbf3305\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"db2cb372e8e03ccda95fbccb90e2a1f1f9cb53999dc3c71a30171ef4c6ff4cb2\"" Mar 6 03:00:40.052365 containerd[1532]: time="2026-03-06T03:00:40.052297193Z" level=info msg="StartContainer for \"db2cb372e8e03ccda95fbccb90e2a1f1f9cb53999dc3c71a30171ef4c6ff4cb2\"" Mar 6 03:00:40.055754 containerd[1532]: time="2026-03-06T03:00:40.055612369Z" level=info msg="connecting to shim db2cb372e8e03ccda95fbccb90e2a1f1f9cb53999dc3c71a30171ef4c6ff4cb2" address="unix:///run/containerd/s/35d9c9688d0cf83cd798dfbba5e7d7614cede13015523c993ecbfab294825ad0" protocol=ttrpc version=3 Mar 6 03:00:40.096316 systemd[1]: Started cri-containerd-db2cb372e8e03ccda95fbccb90e2a1f1f9cb53999dc3c71a30171ef4c6ff4cb2.scope - libcontainer container db2cb372e8e03ccda95fbccb90e2a1f1f9cb53999dc3c71a30171ef4c6ff4cb2. 
Mar 6 03:00:40.183789 containerd[1532]: time="2026-03-06T03:00:40.183695144Z" level=info msg="StartContainer for \"db2cb372e8e03ccda95fbccb90e2a1f1f9cb53999dc3c71a30171ef4c6ff4cb2\" returns successfully" Mar 6 03:00:40.500728 kubelet[2800]: E0306 03:00:40.500674 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.500728 kubelet[2800]: W0306 03:00:40.500718 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.501479 kubelet[2800]: E0306 03:00:40.500762 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.502251 kubelet[2800]: E0306 03:00:40.501929 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.502251 kubelet[2800]: W0306 03:00:40.501953 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.502251 kubelet[2800]: E0306 03:00:40.501988 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.502465 kubelet[2800]: E0306 03:00:40.502303 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.502465 kubelet[2800]: W0306 03:00:40.502318 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.502465 kubelet[2800]: E0306 03:00:40.502344 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.504185 kubelet[2800]: E0306 03:00:40.504150 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.504185 kubelet[2800]: W0306 03:00:40.504179 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.504389 kubelet[2800]: E0306 03:00:40.504210 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.505706 kubelet[2800]: E0306 03:00:40.505517 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.505706 kubelet[2800]: W0306 03:00:40.505541 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.505706 kubelet[2800]: E0306 03:00:40.505563 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.506986 kubelet[2800]: E0306 03:00:40.505973 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.506986 kubelet[2800]: W0306 03:00:40.505987 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.506986 kubelet[2800]: E0306 03:00:40.506005 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.506986 kubelet[2800]: E0306 03:00:40.506898 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.506986 kubelet[2800]: W0306 03:00:40.506914 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.506986 kubelet[2800]: E0306 03:00:40.506939 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.507601 kubelet[2800]: E0306 03:00:40.507551 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.508007 kubelet[2800]: W0306 03:00:40.507976 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.508124 kubelet[2800]: E0306 03:00:40.508012 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.508729 kubelet[2800]: E0306 03:00:40.508703 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.508832 kubelet[2800]: W0306 03:00:40.508747 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.508915 kubelet[2800]: E0306 03:00:40.508886 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.509615 kubelet[2800]: E0306 03:00:40.509459 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.509615 kubelet[2800]: W0306 03:00:40.509547 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.509615 kubelet[2800]: E0306 03:00:40.509597 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.511531 kubelet[2800]: E0306 03:00:40.511474 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.511702 kubelet[2800]: W0306 03:00:40.511531 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.511702 kubelet[2800]: E0306 03:00:40.511564 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.513276 kubelet[2800]: E0306 03:00:40.513249 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.513276 kubelet[2800]: W0306 03:00:40.513274 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.513985 kubelet[2800]: E0306 03:00:40.513303 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.513985 kubelet[2800]: E0306 03:00:40.513905 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.513985 kubelet[2800]: W0306 03:00:40.513934 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.513985 kubelet[2800]: E0306 03:00:40.513954 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.514609 kubelet[2800]: E0306 03:00:40.514584 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.514609 kubelet[2800]: W0306 03:00:40.514608 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.514963 kubelet[2800]: E0306 03:00:40.514627 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.516265 kubelet[2800]: E0306 03:00:40.516226 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.516265 kubelet[2800]: W0306 03:00:40.516247 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.516265 kubelet[2800]: E0306 03:00:40.516266 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.534679 kubelet[2800]: E0306 03:00:40.534615 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.534679 kubelet[2800]: W0306 03:00:40.534651 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.534679 kubelet[2800]: E0306 03:00:40.534683 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.535814 kubelet[2800]: E0306 03:00:40.535138 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.535814 kubelet[2800]: W0306 03:00:40.535176 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.535814 kubelet[2800]: E0306 03:00:40.535195 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.536343 kubelet[2800]: E0306 03:00:40.536295 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.536343 kubelet[2800]: W0306 03:00:40.536313 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.536343 kubelet[2800]: E0306 03:00:40.536335 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.537891 kubelet[2800]: E0306 03:00:40.537857 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.537891 kubelet[2800]: W0306 03:00:40.537882 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.538108 kubelet[2800]: E0306 03:00:40.537901 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.539049 kubelet[2800]: E0306 03:00:40.539015 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.539049 kubelet[2800]: W0306 03:00:40.539037 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.539230 kubelet[2800]: E0306 03:00:40.539056 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.539723 kubelet[2800]: E0306 03:00:40.539680 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.539723 kubelet[2800]: W0306 03:00:40.539707 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.539723 kubelet[2800]: E0306 03:00:40.539727 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.540253 kubelet[2800]: E0306 03:00:40.540212 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.540253 kubelet[2800]: W0306 03:00:40.540239 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.540549 kubelet[2800]: E0306 03:00:40.540258 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.540942 kubelet[2800]: E0306 03:00:40.540918 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.540942 kubelet[2800]: W0306 03:00:40.540941 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.541092 kubelet[2800]: E0306 03:00:40.540958 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.541855 kubelet[2800]: E0306 03:00:40.541814 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.541855 kubelet[2800]: W0306 03:00:40.541839 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.541855 kubelet[2800]: E0306 03:00:40.541858 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.542543 kubelet[2800]: E0306 03:00:40.542504 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.542543 kubelet[2800]: W0306 03:00:40.542527 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.542543 kubelet[2800]: E0306 03:00:40.542546 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.543756 kubelet[2800]: E0306 03:00:40.543723 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.543756 kubelet[2800]: W0306 03:00:40.543746 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.544953 kubelet[2800]: E0306 03:00:40.544803 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.546067 kubelet[2800]: E0306 03:00:40.546042 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.546067 kubelet[2800]: W0306 03:00:40.546066 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.546213 kubelet[2800]: E0306 03:00:40.546087 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.548013 kubelet[2800]: E0306 03:00:40.547986 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.548013 kubelet[2800]: W0306 03:00:40.548010 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.548736 kubelet[2800]: E0306 03:00:40.548033 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.549889 kubelet[2800]: E0306 03:00:40.549857 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.549889 kubelet[2800]: W0306 03:00:40.549887 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.550026 kubelet[2800]: E0306 03:00:40.549912 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.550351 kubelet[2800]: E0306 03:00:40.550270 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.550351 kubelet[2800]: W0306 03:00:40.550306 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.550351 kubelet[2800]: E0306 03:00:40.550326 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.550796 kubelet[2800]: E0306 03:00:40.550759 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.550870 kubelet[2800]: W0306 03:00:40.550809 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.550870 kubelet[2800]: E0306 03:00:40.550828 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.551197 kubelet[2800]: E0306 03:00:40.551173 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.551197 kubelet[2800]: W0306 03:00:40.551195 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.551307 kubelet[2800]: E0306 03:00:40.551211 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:40.551880 kubelet[2800]: E0306 03:00:40.551847 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:40.551880 kubelet[2800]: W0306 03:00:40.551872 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:40.551998 kubelet[2800]: E0306 03:00:40.551892 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:40.652975 sshd[3214]: Connection closed by authenticating user root 80.94.95.115 port 26282 [preauth] Mar 6 03:00:40.660563 systemd[1]: sshd@9-10.128.0.87:22-80.94.95.115:26282.service: Deactivated successfully. Mar 6 03:00:41.226819 kubelet[2800]: E0306 03:00:41.226592 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:41.403185 kubelet[2800]: I0306 03:00:41.403125 2800 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 6 03:00:41.423566 kubelet[2800]: E0306 03:00:41.423517 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.423566 kubelet[2800]: W0306 03:00:41.423550 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.423883 kubelet[2800]: E0306 03:00:41.423583 2800 plugins.go:697] "Error dynamically 
probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.423944 kubelet[2800]: E0306 03:00:41.423934 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.424001 kubelet[2800]: W0306 03:00:41.423948 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.424001 kubelet[2800]: E0306 03:00:41.423966 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.424298 kubelet[2800]: E0306 03:00:41.424267 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.424298 kubelet[2800]: W0306 03:00:41.424284 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.424524 kubelet[2800]: E0306 03:00:41.424300 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.424681 kubelet[2800]: E0306 03:00:41.424655 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.424681 kubelet[2800]: W0306 03:00:41.424670 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.424891 kubelet[2800]: E0306 03:00:41.424687 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.425035 kubelet[2800]: E0306 03:00:41.424994 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.425035 kubelet[2800]: W0306 03:00:41.425012 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.425035 kubelet[2800]: E0306 03:00:41.425029 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.425336 kubelet[2800]: E0306 03:00:41.425306 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.425336 kubelet[2800]: W0306 03:00:41.425326 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.425477 kubelet[2800]: E0306 03:00:41.425342 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.425702 kubelet[2800]: E0306 03:00:41.425685 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.425835 kubelet[2800]: W0306 03:00:41.425703 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.425835 kubelet[2800]: E0306 03:00:41.425719 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.426086 kubelet[2800]: E0306 03:00:41.426047 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.426086 kubelet[2800]: W0306 03:00:41.426065 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.426086 kubelet[2800]: E0306 03:00:41.426081 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.426426 kubelet[2800]: E0306 03:00:41.426410 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.426426 kubelet[2800]: W0306 03:00:41.426427 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.426633 kubelet[2800]: E0306 03:00:41.426443 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.426754 kubelet[2800]: E0306 03:00:41.426723 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.426754 kubelet[2800]: W0306 03:00:41.426736 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.426987 kubelet[2800]: E0306 03:00:41.426751 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.427117 kubelet[2800]: E0306 03:00:41.427047 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.427117 kubelet[2800]: W0306 03:00:41.427060 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.427117 kubelet[2800]: E0306 03:00:41.427075 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.427362 kubelet[2800]: E0306 03:00:41.427343 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.427525 kubelet[2800]: W0306 03:00:41.427371 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.427525 kubelet[2800]: E0306 03:00:41.427387 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.427742 kubelet[2800]: E0306 03:00:41.427668 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.427742 kubelet[2800]: W0306 03:00:41.427683 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.427742 kubelet[2800]: E0306 03:00:41.427699 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.428052 kubelet[2800]: E0306 03:00:41.427989 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.428052 kubelet[2800]: W0306 03:00:41.428003 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.428052 kubelet[2800]: E0306 03:00:41.428019 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.428304 kubelet[2800]: E0306 03:00:41.428274 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.428304 kubelet[2800]: W0306 03:00:41.428293 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.428444 kubelet[2800]: E0306 03:00:41.428308 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.444636 kubelet[2800]: E0306 03:00:41.444590 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.444636 kubelet[2800]: W0306 03:00:41.444624 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.444636 kubelet[2800]: E0306 03:00:41.444652 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.445169 kubelet[2800]: E0306 03:00:41.445124 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.445169 kubelet[2800]: W0306 03:00:41.445167 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.445496 kubelet[2800]: E0306 03:00:41.445188 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.445677 kubelet[2800]: E0306 03:00:41.445643 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.445677 kubelet[2800]: W0306 03:00:41.445659 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.445842 kubelet[2800]: E0306 03:00:41.445678 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.446122 kubelet[2800]: E0306 03:00:41.446100 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.446122 kubelet[2800]: W0306 03:00:41.446121 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.446813 kubelet[2800]: E0306 03:00:41.446139 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.446813 kubelet[2800]: E0306 03:00:41.446597 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.446813 kubelet[2800]: W0306 03:00:41.446617 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.446813 kubelet[2800]: E0306 03:00:41.446636 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.447245 kubelet[2800]: E0306 03:00:41.446981 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.447245 kubelet[2800]: W0306 03:00:41.446995 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.447245 kubelet[2800]: E0306 03:00:41.447011 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.447830 kubelet[2800]: E0306 03:00:41.447315 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.447830 kubelet[2800]: W0306 03:00:41.447330 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.447830 kubelet[2800]: E0306 03:00:41.447360 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.447830 kubelet[2800]: E0306 03:00:41.447656 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.447830 kubelet[2800]: W0306 03:00:41.447670 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.447830 kubelet[2800]: E0306 03:00:41.447685 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.448517 kubelet[2800]: E0306 03:00:41.448037 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.448517 kubelet[2800]: W0306 03:00:41.448050 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.448517 kubelet[2800]: E0306 03:00:41.448065 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.449082 kubelet[2800]: E0306 03:00:41.449051 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.449082 kubelet[2800]: W0306 03:00:41.449078 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.449491 kubelet[2800]: E0306 03:00:41.449098 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.449674 kubelet[2800]: E0306 03:00:41.449511 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.449674 kubelet[2800]: W0306 03:00:41.449531 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.449674 kubelet[2800]: E0306 03:00:41.449583 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.449954 kubelet[2800]: E0306 03:00:41.449943 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.450031 kubelet[2800]: W0306 03:00:41.449958 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.450031 kubelet[2800]: E0306 03:00:41.449975 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.450402 kubelet[2800]: E0306 03:00:41.450351 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.450402 kubelet[2800]: W0306 03:00:41.450396 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.450675 kubelet[2800]: E0306 03:00:41.450416 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.450803 kubelet[2800]: E0306 03:00:41.450759 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.450803 kubelet[2800]: W0306 03:00:41.450796 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.450935 kubelet[2800]: E0306 03:00:41.450814 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.451504 kubelet[2800]: E0306 03:00:41.451483 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.451628 kubelet[2800]: W0306 03:00:41.451590 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.451628 kubelet[2800]: E0306 03:00:41.451621 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.451986 kubelet[2800]: E0306 03:00:41.451951 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.451986 kubelet[2800]: W0306 03:00:41.451971 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.451986 kubelet[2800]: E0306 03:00:41.451988 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.452323 kubelet[2800]: E0306 03:00:41.452304 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.452323 kubelet[2800]: W0306 03:00:41.452322 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.452437 kubelet[2800]: E0306 03:00:41.452339 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 6 03:00:41.452954 kubelet[2800]: E0306 03:00:41.452934 2800 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 6 03:00:41.452954 kubelet[2800]: W0306 03:00:41.452951 2800 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 6 03:00:41.453081 kubelet[2800]: E0306 03:00:41.452967 2800 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 6 03:00:41.670708 containerd[1532]: time="2026-03-06T03:00:41.670633016Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:41.672322 containerd[1532]: time="2026-03-06T03:00:41.672237570Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4: active requests=0, bytes read=4630250" Mar 6 03:00:41.674832 containerd[1532]: time="2026-03-06T03:00:41.674720533Z" level=info msg="ImageCreate event name:\"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:41.680015 containerd[1532]: time="2026-03-06T03:00:41.679946130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:41.681457 containerd[1532]: time="2026-03-06T03:00:41.681205777Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" with image id \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:5fa3492ac4dfef9cc34fe70a51289118e1f715a89133ea730eef81ad789dadbc\", size \"6186255\" in 1.686490724s" Mar 6 03:00:41.681457 containerd[1532]: time="2026-03-06T03:00:41.681258575Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.31.4\" returns image reference \"sha256:a6ea0cf732d820506ae9f1d7e7433a14009026b894fbbb8f346b9a5f5335c47e\"" Mar 6 03:00:41.690530 containerd[1532]: time="2026-03-06T03:00:41.690454655Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 6 03:00:41.710801 containerd[1532]: time="2026-03-06T03:00:41.707878217Z" level=info msg="Container e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:41.725236 containerd[1532]: time="2026-03-06T03:00:41.725173160Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786\"" Mar 6 03:00:41.726091 containerd[1532]: time="2026-03-06T03:00:41.725943140Z" level=info msg="StartContainer for \"e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786\"" Mar 6 03:00:41.730013 containerd[1532]: time="2026-03-06T03:00:41.729956047Z" level=info msg="connecting to shim e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786" address="unix:///run/containerd/s/d5a22d18ef363f2a2c2aa1791f6b7129ec86ecca9d5cc0382c7937a24e239493" protocol=ttrpc version=3 Mar 6 03:00:41.767076 systemd[1]: Started cri-containerd-e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786.scope - libcontainer container e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786. Mar 6 03:00:41.879439 containerd[1532]: time="2026-03-06T03:00:41.879380677Z" level=info msg="StartContainer for \"e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786\" returns successfully" Mar 6 03:00:41.901787 systemd[1]: cri-containerd-e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786.scope: Deactivated successfully. 
Mar 6 03:00:41.906737 containerd[1532]: time="2026-03-06T03:00:41.906667531Z" level=info msg="received container exit event container_id:\"e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786\" id:\"e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786\" pid:3587 exited_at:{seconds:1772766041 nanos:905482104}" Mar 6 03:00:41.954601 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e5669ca33a84c7e6e68b0c75b56f88dc6461d35af4b7d6e60f1451ebbb0a5786-rootfs.mount: Deactivated successfully. Mar 6 03:00:42.440572 kubelet[2800]: I0306 03:00:42.436203 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-typha-67677bfd4b-q27ls" podStartSLOduration=4.103315807 podStartE2EDuration="6.43618496s" podCreationTimestamp="2026-03-06 03:00:36 +0000 UTC" firstStartedPulling="2026-03-06 03:00:37.659121832 +0000 UTC m=+23.684079917" lastFinishedPulling="2026-03-06 03:00:39.991990968 +0000 UTC m=+26.016949070" observedRunningTime="2026-03-06 03:00:40.475603433 +0000 UTC m=+26.500561543" watchObservedRunningTime="2026-03-06 03:00:42.43618496 +0000 UTC m=+28.461143073" Mar 6 03:00:43.226519 kubelet[2800]: E0306 03:00:43.226428 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:43.418330 containerd[1532]: time="2026-03-06T03:00:43.417993876Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\"" Mar 6 03:00:45.227797 kubelet[2800]: E0306 03:00:45.226939 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:47.227800 kubelet[2800]: E0306 03:00:47.226490 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:49.226980 kubelet[2800]: E0306 03:00:49.226922 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:50.737485 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount632405783.mount: Deactivated successfully. Mar 6 03:00:50.775421 containerd[1532]: time="2026-03-06T03:00:50.775353050Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:50.777612 containerd[1532]: time="2026-03-06T03:00:50.777576196Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.31.4: active requests=0, bytes read=159838564" Mar 6 03:00:50.778465 containerd[1532]: time="2026-03-06T03:00:50.778408061Z" level=info msg="ImageCreate event name:\"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:50.781073 containerd[1532]: time="2026-03-06T03:00:50.781007569Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:50.781995 containerd[1532]: time="2026-03-06T03:00:50.781850485Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.31.4\" with image id \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\", repo tag \"ghcr.io/flatcar/calico/node:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:22b9d32dc7480c96272121d5682d53424c6e58653c60fa869b61a1758a11d77f\", size \"159838426\" in 7.363804647s" Mar 6 03:00:50.781995 containerd[1532]: time="2026-03-06T03:00:50.781896148Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.31.4\" returns image reference \"sha256:e6536b93706eda782f82ebadcac3559cb61801d09f982cc0533a134e6a8e1acf\"" Mar 6 03:00:50.790020 containerd[1532]: time="2026-03-06T03:00:50.789974475Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for container &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,}" Mar 6 03:00:50.803259 containerd[1532]: time="2026-03-06T03:00:50.802989377Z" level=info msg="Container 6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:50.817541 containerd[1532]: time="2026-03-06T03:00:50.817472793Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for &ContainerMetadata{Name:ebpf-bootstrap,Attempt:0,} returns container id \"6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a\"" Mar 6 03:00:50.818801 containerd[1532]: time="2026-03-06T03:00:50.818428946Z" level=info msg="StartContainer for \"6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a\"" Mar 6 03:00:50.821224 containerd[1532]: time="2026-03-06T03:00:50.821180225Z" level=info msg="connecting to shim 6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a" address="unix:///run/containerd/s/d5a22d18ef363f2a2c2aa1791f6b7129ec86ecca9d5cc0382c7937a24e239493" protocol=ttrpc version=3 Mar 6 03:00:50.853033 systemd[1]: Started 
cri-containerd-6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a.scope - libcontainer container 6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a. Mar 6 03:00:50.960465 containerd[1532]: time="2026-03-06T03:00:50.960409544Z" level=info msg="StartContainer for \"6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a\" returns successfully" Mar 6 03:00:51.023279 systemd[1]: cri-containerd-6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a.scope: Deactivated successfully. Mar 6 03:00:51.026329 containerd[1532]: time="2026-03-06T03:00:51.026264738Z" level=info msg="received container exit event container_id:\"6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a\" id:\"6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a\" pid:3646 exited_at:{seconds:1772766051 nanos:25887357}" Mar 6 03:00:51.065067 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6a69942cd6b63b8c94fe4e036e65ffe55c8bc8bd351065e5d4e75e755bbab07a-rootfs.mount: Deactivated successfully. 
Mar 6 03:00:51.226447 kubelet[2800]: E0306 03:00:51.226360 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:53.226800 kubelet[2800]: E0306 03:00:53.226697 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:53.468973 containerd[1532]: time="2026-03-06T03:00:53.468800919Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\"" Mar 6 03:00:55.226972 kubelet[2800]: E0306 03:00:55.226845 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:57.226281 kubelet[2800]: E0306 03:00:57.226211 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5" Mar 6 03:00:57.680009 containerd[1532]: time="2026-03-06T03:00:57.679941465Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:57.681577 containerd[1532]: time="2026-03-06T03:00:57.681344922Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.31.4: active requests=0, bytes read=70611671" Mar 6 03:00:57.682623 containerd[1532]: time="2026-03-06T03:00:57.682583467Z" level=info msg="ImageCreate event name:\"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:57.687216 containerd[1532]: time="2026-03-06T03:00:57.687076820Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.31.4\" with image id \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\", repo tag \"ghcr.io/flatcar/calico/cni:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\", size \"72167716\" in 4.218225414s" Mar 6 03:00:57.687216 containerd[1532]: time="2026-03-06T03:00:57.687118317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.31.4\" returns image reference \"sha256:c433a27dd94ce9242338eece49f11629412dd42552fed314746fcf16ea958b2b\"" Mar 6 03:00:57.687692 containerd[1532]: time="2026-03-06T03:00:57.687661931Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:f1c5d9a6df01061c5faec4c4b59fb9ba69f8f5164b51e01ea8daa8e373111a04\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:00:57.694523 containerd[1532]: time="2026-03-06T03:00:57.694473612Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 6 03:00:57.706802 containerd[1532]: time="2026-03-06T03:00:57.705982492Z" level=info msg="Container 6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:00:57.720977 containerd[1532]: time="2026-03-06T03:00:57.720914678Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for 
&ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5\"" Mar 6 03:00:57.721956 containerd[1532]: time="2026-03-06T03:00:57.721733013Z" level=info msg="StartContainer for \"6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5\"" Mar 6 03:00:57.724466 containerd[1532]: time="2026-03-06T03:00:57.724426786Z" level=info msg="connecting to shim 6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5" address="unix:///run/containerd/s/d5a22d18ef363f2a2c2aa1791f6b7129ec86ecca9d5cc0382c7937a24e239493" protocol=ttrpc version=3 Mar 6 03:00:57.757042 systemd[1]: Started cri-containerd-6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5.scope - libcontainer container 6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5. Mar 6 03:00:57.862611 containerd[1532]: time="2026-03-06T03:00:57.862561493Z" level=info msg="StartContainer for \"6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5\" returns successfully" Mar 6 03:00:58.891367 containerd[1532]: time="2026-03-06T03:00:58.891272884Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 6 03:00:58.895024 systemd[1]: cri-containerd-6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5.scope: Deactivated successfully. 
Mar 6 03:00:58.896814 containerd[1532]: time="2026-03-06T03:00:58.895008748Z" level=info msg="received container exit event container_id:\"6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5\" id:\"6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5\" pid:3708 exited_at:{seconds:1772766058 nanos:894728864}" Mar 6 03:00:58.895754 systemd[1]: cri-containerd-6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5.scope: Consumed 690ms CPU time, 195.3M memory peak, 177M written to disk. Mar 6 03:00:58.919236 kubelet[2800]: I0306 03:00:58.919127 2800 kubelet_node_status.go:427] "Fast updating node status as it just became ready" Mar 6 03:00:58.935890 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6b644bcb615ed33ef267453ea655dfe403f9181ad1e14c6b1f3d5a7085abc1e5-rootfs.mount: Deactivated successfully. Mar 6 03:00:59.039995 systemd[1]: Created slice kubepods-burstable-pod0228e152_72ed_4332_a06f_c4b8ac58bc00.slice - libcontainer container kubepods-burstable-pod0228e152_72ed_4332_a06f_c4b8ac58bc00.slice. 
Mar 6 03:00:59.079737 kubelet[2800]: I0306 03:00:59.079679 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0228e152-72ed-4332-a06f-c4b8ac58bc00-config-volume\") pod \"coredns-7d764666f9-bvfqk\" (UID: \"0228e152-72ed-4332-a06f-c4b8ac58bc00\") " pod="kube-system/coredns-7d764666f9-bvfqk" Mar 6 03:00:59.079737 kubelet[2800]: I0306 03:00:59.079747 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnngl\" (UniqueName: \"kubernetes.io/projected/0228e152-72ed-4332-a06f-c4b8ac58bc00-kube-api-access-qnngl\") pod \"coredns-7d764666f9-bvfqk\" (UID: \"0228e152-72ed-4332-a06f-c4b8ac58bc00\") " pod="kube-system/coredns-7d764666f9-bvfqk" Mar 6 03:00:59.290335 systemd[1]: Created slice kubepods-burstable-pod7742d6cc_2fcb_4303_99a2_f9d5a0e44f95.slice - libcontainer container kubepods-burstable-pod7742d6cc_2fcb_4303_99a2_f9d5a0e44f95.slice. 
Mar 6 03:00:59.381721 kubelet[2800]: I0306 03:00:59.381635 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntjv\" (UniqueName: \"kubernetes.io/projected/7742d6cc-2fcb-4303-99a2-f9d5a0e44f95-kube-api-access-fntjv\") pod \"coredns-7d764666f9-9kf8c\" (UID: \"7742d6cc-2fcb-4303-99a2-f9d5a0e44f95\") " pod="kube-system/coredns-7d764666f9-9kf8c" Mar 6 03:00:59.381721 kubelet[2800]: I0306 03:00:59.381711 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7742d6cc-2fcb-4303-99a2-f9d5a0e44f95-config-volume\") pod \"coredns-7d764666f9-9kf8c\" (UID: \"7742d6cc-2fcb-4303-99a2-f9d5a0e44f95\") " pod="kube-system/coredns-7d764666f9-9kf8c" Mar 6 03:00:59.503054 containerd[1532]: time="2026-03-06T03:00:59.502738972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-bvfqk,Uid:0228e152-72ed-4332-a06f-c4b8ac58bc00,Namespace:kube-system,Attempt:0,}" Mar 6 03:00:59.565067 systemd[1]: Created slice kubepods-besteffort-pod548ef683_b05f_40c7_8c04_ab888c23b7fb.slice - libcontainer container kubepods-besteffort-pod548ef683_b05f_40c7_8c04_ab888c23b7fb.slice. 
Mar 6 03:00:59.586804 kubelet[2800]: I0306 03:00:59.584048 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-ca-bundle\") pod \"whisker-65d8796896-4zmxc\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " pod="calico-system/whisker-65d8796896-4zmxc"
Mar 6 03:00:59.586804 kubelet[2800]: I0306 03:00:59.586013 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2658b953-d894-4e8f-ae5f-28690d98a0a9-config\") pod \"goldmane-9f7667bb8-x4r4t\" (UID: \"2658b953-d894-4e8f-ae5f-28690d98a0a9\") " pod="calico-system/goldmane-9f7667bb8-x4r4t"
Mar 6 03:00:59.587813 kubelet[2800]: I0306 03:00:59.587324 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cf647f0e-6583-4488-a6c3-323a650da7e7-calico-apiserver-certs\") pod \"calico-apiserver-6857b68657-cl7tg\" (UID: \"cf647f0e-6583-4488-a6c3-323a650da7e7\") " pod="calico-system/calico-apiserver-6857b68657-cl7tg"
Mar 6 03:00:59.589121 kubelet[2800]: I0306 03:00:59.587817 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsqw4\" (UniqueName: \"kubernetes.io/projected/cf647f0e-6583-4488-a6c3-323a650da7e7-kube-api-access-jsqw4\") pod \"calico-apiserver-6857b68657-cl7tg\" (UID: \"cf647f0e-6583-4488-a6c3-323a650da7e7\") " pod="calico-system/calico-apiserver-6857b68657-cl7tg"
Mar 6 03:00:59.588722 systemd[1]: Created slice kubepods-besteffort-podac46cc05_0e3c_439d_bc6a_7388a9b0ece7.slice - libcontainer container kubepods-besteffort-podac46cc05_0e3c_439d_bc6a_7388a9b0ece7.slice.
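The `Created slice` entries show how kubelet's systemd cgroup driver derives slice names: the pod's QoS class picks the parent slice, and dashes in the pod UID are escaped to underscores because `-` is systemd's unit-name hierarchy separator. A small sketch of that mapping (my own helper, not kubelet code; it only covers the burstable/besteffort forms seen in this log, since guaranteed pods sit directly under `kubepods`):

```python
def pod_slice_name(uid: str, qos: str = "besteffort") -> str:
    """Sketch: systemd slice name for a pod cgroup, as seen in this log.
    Dashes in the UID become underscores (systemd unit-name escaping)."""
    return f"kubepods-{qos}-pod{uid.replace('-', '_')}.slice"

# UID taken from the calico-apiserver entry above.
print(pod_slice_name("ac46cc05-0e3c-439d-bc6a-7388a9b0ece7"))
# kubepods-besteffort-podac46cc05_0e3c_439d_bc6a_7388a9b0ece7.slice
```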
Mar 6 03:00:59.590996 kubelet[2800]: I0306 03:00:59.587873 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2658b953-d894-4e8f-ae5f-28690d98a0a9-goldmane-key-pair\") pod \"goldmane-9f7667bb8-x4r4t\" (UID: \"2658b953-d894-4e8f-ae5f-28690d98a0a9\") " pod="calico-system/goldmane-9f7667bb8-x4r4t"
Mar 6 03:00:59.590996 kubelet[2800]: I0306 03:00:59.589593 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb9gk\" (UniqueName: \"kubernetes.io/projected/548ef683-b05f-40c7-8c04-ab888c23b7fb-kube-api-access-gb9gk\") pod \"calico-kube-controllers-5c9c5b7bf5-r8t6w\" (UID: \"548ef683-b05f-40c7-8c04-ab888c23b7fb\") " pod="calico-system/calico-kube-controllers-5c9c5b7bf5-r8t6w"
Mar 6 03:00:59.590996 kubelet[2800]: I0306 03:00:59.589642 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztpdk\" (UniqueName: \"kubernetes.io/projected/2658b953-d894-4e8f-ae5f-28690d98a0a9-kube-api-access-ztpdk\") pod \"goldmane-9f7667bb8-x4r4t\" (UID: \"2658b953-d894-4e8f-ae5f-28690d98a0a9\") " pod="calico-system/goldmane-9f7667bb8-x4r4t"
Mar 6 03:00:59.590996 kubelet[2800]: I0306 03:00:59.589677 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-nginx-config\") pod \"whisker-65d8796896-4zmxc\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " pod="calico-system/whisker-65d8796896-4zmxc"
Mar 6 03:00:59.590996 kubelet[2800]: I0306 03:00:59.589711 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-backend-key-pair\") pod \"whisker-65d8796896-4zmxc\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " pod="calico-system/whisker-65d8796896-4zmxc"
Mar 6 03:00:59.591927 kubelet[2800]: I0306 03:00:59.589741 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s67zb\" (UniqueName: \"kubernetes.io/projected/11f62943-290a-4f7b-a132-33bce0f8a5ea-kube-api-access-s67zb\") pod \"whisker-65d8796896-4zmxc\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " pod="calico-system/whisker-65d8796896-4zmxc"
Mar 6 03:00:59.594055 kubelet[2800]: I0306 03:00:59.594013 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2658b953-d894-4e8f-ae5f-28690d98a0a9-goldmane-ca-bundle\") pod \"goldmane-9f7667bb8-x4r4t\" (UID: \"2658b953-d894-4e8f-ae5f-28690d98a0a9\") " pod="calico-system/goldmane-9f7667bb8-x4r4t"
Mar 6 03:00:59.594270 kubelet[2800]: I0306 03:00:59.594215 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/548ef683-b05f-40c7-8c04-ab888c23b7fb-tigera-ca-bundle\") pod \"calico-kube-controllers-5c9c5b7bf5-r8t6w\" (UID: \"548ef683-b05f-40c7-8c04-ab888c23b7fb\") " pod="calico-system/calico-kube-controllers-5c9c5b7bf5-r8t6w"
Mar 6 03:00:59.595496 kubelet[2800]: I0306 03:00:59.594376 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fmcg\" (UniqueName: \"kubernetes.io/projected/ac46cc05-0e3c-439d-bc6a-7388a9b0ece7-kube-api-access-8fmcg\") pod \"calico-apiserver-6857b68657-htzf9\" (UID: \"ac46cc05-0e3c-439d-bc6a-7388a9b0ece7\") " pod="calico-system/calico-apiserver-6857b68657-htzf9"
Mar 6 03:00:59.595496 kubelet[2800]: I0306 03:00:59.594455 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ac46cc05-0e3c-439d-bc6a-7388a9b0ece7-calico-apiserver-certs\") pod \"calico-apiserver-6857b68657-htzf9\" (UID: \"ac46cc05-0e3c-439d-bc6a-7388a9b0ece7\") " pod="calico-system/calico-apiserver-6857b68657-htzf9"
Mar 6 03:00:59.605536 containerd[1532]: time="2026-03-06T03:00:59.605451344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-9kf8c,Uid:7742d6cc-2fcb-4303-99a2-f9d5a0e44f95,Namespace:kube-system,Attempt:0,}"
Mar 6 03:00:59.607710 systemd[1]: Created slice kubepods-besteffort-pod3669bc8e_a01a_47d2_a2d9_628a14d7e7f5.slice - libcontainer container kubepods-besteffort-pod3669bc8e_a01a_47d2_a2d9_628a14d7e7f5.slice.
Mar 6 03:00:59.628045 containerd[1532]: time="2026-03-06T03:00:59.625288573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2m65n,Uid:3669bc8e-a01a-47d2-a2d9-628a14d7e7f5,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:59.627428 systemd[1]: Created slice kubepods-besteffort-podcf647f0e_6583_4488_a6c3_323a650da7e7.slice - libcontainer container kubepods-besteffort-podcf647f0e_6583_4488_a6c3_323a650da7e7.slice.
Mar 6 03:00:59.657108 systemd[1]: Created slice kubepods-besteffort-pod11f62943_290a_4f7b_a132_33bce0f8a5ea.slice - libcontainer container kubepods-besteffort-pod11f62943_290a_4f7b_a132_33bce0f8a5ea.slice.
Mar 6 03:00:59.670828 systemd[1]: Created slice kubepods-besteffort-pod2658b953_d894_4e8f_ae5f_28690d98a0a9.slice - libcontainer container kubepods-besteffort-pod2658b953_d894_4e8f_ae5f_28690d98a0a9.slice.
Mar 6 03:00:59.805059 containerd[1532]: time="2026-03-06T03:00:59.804991630Z" level=error msg="Failed to destroy network for sandbox \"a60b46ff2285677dbf4d1b5b60d436029484428ca3100af1728221ce8c4b0a9a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.809378 containerd[1532]: time="2026-03-06T03:00:59.807874621Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-bvfqk,Uid:0228e152-72ed-4332-a06f-c4b8ac58bc00,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60b46ff2285677dbf4d1b5b60d436029484428ca3100af1728221ce8c4b0a9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.810151 kubelet[2800]: E0306 03:00:59.809853 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60b46ff2285677dbf4d1b5b60d436029484428ca3100af1728221ce8c4b0a9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.810475 kubelet[2800]: E0306 03:00:59.810365 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60b46ff2285677dbf4d1b5b60d436029484428ca3100af1728221ce8c4b0a9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-bvfqk"
Mar 6 03:00:59.810475 kubelet[2800]: E0306 03:00:59.810431 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a60b46ff2285677dbf4d1b5b60d436029484428ca3100af1728221ce8c4b0a9a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-bvfqk"
Mar 6 03:00:59.810928 kubelet[2800]: E0306 03:00:59.810848 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-bvfqk_kube-system(0228e152-72ed-4332-a06f-c4b8ac58bc00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-bvfqk_kube-system(0228e152-72ed-4332-a06f-c4b8ac58bc00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a60b46ff2285677dbf4d1b5b60d436029484428ca3100af1728221ce8c4b0a9a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-bvfqk" podUID="0228e152-72ed-4332-a06f-c4b8ac58bc00"
Mar 6 03:00:59.863362 containerd[1532]: time="2026-03-06T03:00:59.862002137Z" level=error msg="Failed to destroy network for sandbox \"dd24b86e152cb78c5b69efe52b954b25ccabb52eb76b34cd7e090f9c608e1186\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.865697 containerd[1532]: time="2026-03-06T03:00:59.865500161Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-9kf8c,Uid:7742d6cc-2fcb-4303-99a2-f9d5a0e44f95,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd24b86e152cb78c5b69efe52b954b25ccabb52eb76b34cd7e090f9c608e1186\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.866300 kubelet[2800]: E0306 03:00:59.866237 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd24b86e152cb78c5b69efe52b954b25ccabb52eb76b34cd7e090f9c608e1186\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.866831 kubelet[2800]: E0306 03:00:59.866312 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd24b86e152cb78c5b69efe52b954b25ccabb52eb76b34cd7e090f9c608e1186\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-9kf8c"
Mar 6 03:00:59.866831 kubelet[2800]: E0306 03:00:59.866341 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd24b86e152cb78c5b69efe52b954b25ccabb52eb76b34cd7e090f9c608e1186\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7d764666f9-9kf8c"
Mar 6 03:00:59.866831 kubelet[2800]: E0306 03:00:59.866424 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7d764666f9-9kf8c_kube-system(7742d6cc-2fcb-4303-99a2-f9d5a0e44f95)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7d764666f9-9kf8c_kube-system(7742d6cc-2fcb-4303-99a2-f9d5a0e44f95)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd24b86e152cb78c5b69efe52b954b25ccabb52eb76b34cd7e090f9c608e1186\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7d764666f9-9kf8c" podUID="7742d6cc-2fcb-4303-99a2-f9d5a0e44f95"
Mar 6 03:00:59.870622 containerd[1532]: time="2026-03-06T03:00:59.870561485Z" level=error msg="Failed to destroy network for sandbox \"cc273599aa2875012f207676bc255362cde3eb6f20ab4f3ca295ebe4c793f57c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.872184 containerd[1532]: time="2026-03-06T03:00:59.872130041Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2m65n,Uid:3669bc8e-a01a-47d2-a2d9-628a14d7e7f5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc273599aa2875012f207676bc255362cde3eb6f20ab4f3ca295ebe4c793f57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.872664 kubelet[2800]: E0306 03:00:59.872550 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc273599aa2875012f207676bc255362cde3eb6f20ab4f3ca295ebe4c793f57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:00:59.872946 kubelet[2800]: E0306 03:00:59.872908 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc273599aa2875012f207676bc255362cde3eb6f20ab4f3ca295ebe4c793f57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2m65n"
Mar 6 03:00:59.873606 kubelet[2800]: E0306 03:00:59.872951 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cc273599aa2875012f207676bc255362cde3eb6f20ab4f3ca295ebe4c793f57c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2m65n"
Mar 6 03:00:59.873606 kubelet[2800]: E0306 03:00:59.873180 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2m65n_calico-system(3669bc8e-a01a-47d2-a2d9-628a14d7e7f5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2m65n_calico-system(3669bc8e-a01a-47d2-a2d9-628a14d7e7f5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cc273599aa2875012f207676bc255362cde3eb6f20ab4f3ca295ebe4c793f57c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2m65n" podUID="3669bc8e-a01a-47d2-a2d9-628a14d7e7f5"
Mar 6 03:00:59.884339 containerd[1532]: time="2026-03-06T03:00:59.884277399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c9c5b7bf5-r8t6w,Uid:548ef683-b05f-40c7-8c04-ab888c23b7fb,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:59.905793 containerd[1532]: time="2026-03-06T03:00:59.905708729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-htzf9,Uid:ac46cc05-0e3c-439d-bc6a-7388a9b0ece7,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:59.965810 containerd[1532]: time="2026-03-06T03:00:59.964811565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-cl7tg,Uid:cf647f0e-6583-4488-a6c3-323a650da7e7,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:59.973091 systemd[1]: run-netns-cni\x2de89fb748\x2dd261\x2d2df7\x2d4b0d\x2d42c9532e5478.mount: Deactivated successfully.
Mar 6 03:00:59.973262 systemd[1]: run-netns-cni\x2d661d5901\x2d0310\x2d27e2\x2d33e1\x2da64efbf37b1f.mount: Deactivated successfully.
Mar 6 03:00:59.990404 containerd[1532]: time="2026-03-06T03:00:59.990305167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-x4r4t,Uid:2658b953-d894-4e8f-ae5f-28690d98a0a9,Namespace:calico-system,Attempt:0,}"
Mar 6 03:00:59.994989 containerd[1532]: time="2026-03-06T03:00:59.992925769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65d8796896-4zmxc,Uid:11f62943-290a-4f7b-a132-33bce0f8a5ea,Namespace:calico-system,Attempt:0,}"
Mar 6 03:01:00.147288 containerd[1532]: time="2026-03-06T03:01:00.147121426Z" level=error msg="Failed to destroy network for sandbox \"866fd63d79d7bf8781e78b29d9286f5694ded47783d9ba3ec5a22689d0bbc697\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.154961 containerd[1532]: time="2026-03-06T03:01:00.154909767Z" level=error msg="Failed to destroy network for sandbox \"ec8cc49e4165baec9c213f2c8fa9ecddbee33d5bcaa94592656e31c32b0e524c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.155480 systemd[1]: run-netns-cni\x2d7337895b\x2d8ff6\x2d4b8a\x2d0994\x2d43787e681777.mount: Deactivated successfully.
Mar 6 03:01:00.156403 containerd[1532]: time="2026-03-06T03:01:00.156349232Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c9c5b7bf5-r8t6w,Uid:548ef683-b05f-40c7-8c04-ab888c23b7fb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"866fd63d79d7bf8781e78b29d9286f5694ded47783d9ba3ec5a22689d0bbc697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.157725 kubelet[2800]: E0306 03:01:00.157673 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"866fd63d79d7bf8781e78b29d9286f5694ded47783d9ba3ec5a22689d0bbc697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.159254 kubelet[2800]: E0306 03:01:00.157753 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"866fd63d79d7bf8781e78b29d9286f5694ded47783d9ba3ec5a22689d0bbc697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c9c5b7bf5-r8t6w"
Mar 6 03:01:00.159254 kubelet[2800]: E0306 03:01:00.157808 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"866fd63d79d7bf8781e78b29d9286f5694ded47783d9ba3ec5a22689d0bbc697\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5c9c5b7bf5-r8t6w"
Mar 6 03:01:00.159254 kubelet[2800]: E0306 03:01:00.158996 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5c9c5b7bf5-r8t6w_calico-system(548ef683-b05f-40c7-8c04-ab888c23b7fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5c9c5b7bf5-r8t6w_calico-system(548ef683-b05f-40c7-8c04-ab888c23b7fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"866fd63d79d7bf8781e78b29d9286f5694ded47783d9ba3ec5a22689d0bbc697\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5c9c5b7bf5-r8t6w" podUID="548ef683-b05f-40c7-8c04-ab888c23b7fb"
Mar 6 03:01:00.162006 kubelet[2800]: E0306 03:01:00.159915 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec8cc49e4165baec9c213f2c8fa9ecddbee33d5bcaa94592656e31c32b0e524c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.162006 kubelet[2800]: E0306 03:01:00.159967 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec8cc49e4165baec9c213f2c8fa9ecddbee33d5bcaa94592656e31c32b0e524c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6857b68657-htzf9"
Mar 6 03:01:00.162006 kubelet[2800]: E0306 03:01:00.159995 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec8cc49e4165baec9c213f2c8fa9ecddbee33d5bcaa94592656e31c32b0e524c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6857b68657-htzf9"
Mar 6 03:01:00.162215 containerd[1532]: time="2026-03-06T03:01:00.159289818Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-htzf9,Uid:ac46cc05-0e3c-439d-bc6a-7388a9b0ece7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ec8cc49e4165baec9c213f2c8fa9ecddbee33d5bcaa94592656e31c32b0e524c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.162345 kubelet[2800]: E0306 03:01:00.160064 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6857b68657-htzf9_calico-system(ac46cc05-0e3c-439d-bc6a-7388a9b0ece7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6857b68657-htzf9_calico-system(ac46cc05-0e3c-439d-bc6a-7388a9b0ece7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ec8cc49e4165baec9c213f2c8fa9ecddbee33d5bcaa94592656e31c32b0e524c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-6857b68657-htzf9" podUID="ac46cc05-0e3c-439d-bc6a-7388a9b0ece7"
Mar 6 03:01:00.188938 containerd[1532]: time="2026-03-06T03:01:00.188731485Z" level=error msg="Failed to destroy network for sandbox \"4857e8c07404633513506c5fde838f24a8fbbe823758ebee70916a0f774359fc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.191080 containerd[1532]: time="2026-03-06T03:01:00.190999688Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65d8796896-4zmxc,Uid:11f62943-290a-4f7b-a132-33bce0f8a5ea,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4857e8c07404633513506c5fde838f24a8fbbe823758ebee70916a0f774359fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.191501 kubelet[2800]: E0306 03:01:00.191447 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4857e8c07404633513506c5fde838f24a8fbbe823758ebee70916a0f774359fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.193663 kubelet[2800]: E0306 03:01:00.191824 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4857e8c07404633513506c5fde838f24a8fbbe823758ebee70916a0f774359fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65d8796896-4zmxc"
Mar 6 03:01:00.193663 kubelet[2800]: E0306 03:01:00.192838 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4857e8c07404633513506c5fde838f24a8fbbe823758ebee70916a0f774359fc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65d8796896-4zmxc"
Mar 6 03:01:00.193663 kubelet[2800]: E0306 03:01:00.192985 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-65d8796896-4zmxc_calico-system(11f62943-290a-4f7b-a132-33bce0f8a5ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-65d8796896-4zmxc_calico-system(11f62943-290a-4f7b-a132-33bce0f8a5ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4857e8c07404633513506c5fde838f24a8fbbe823758ebee70916a0f774359fc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-65d8796896-4zmxc" podUID="11f62943-290a-4f7b-a132-33bce0f8a5ea"
Mar 6 03:01:00.236112 containerd[1532]: time="2026-03-06T03:01:00.236057118Z" level=error msg="Failed to destroy network for sandbox \"73cb7a487fa961b2ce6016e504d494d1f7137a953f58766c43bf8e58adf94c7a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.237521 containerd[1532]: time="2026-03-06T03:01:00.237479027Z" level=error msg="Failed to destroy network for sandbox \"c865462b0907283185f716c9e713f1de28bd8ba618b864bd0dc7ac1fd6c9e9cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.237909 containerd[1532]: time="2026-03-06T03:01:00.237864168Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-x4r4t,Uid:2658b953-d894-4e8f-ae5f-28690d98a0a9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"73cb7a487fa961b2ce6016e504d494d1f7137a953f58766c43bf8e58adf94c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.238245 kubelet[2800]: E0306 03:01:00.238174 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73cb7a487fa961b2ce6016e504d494d1f7137a953f58766c43bf8e58adf94c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.238530 kubelet[2800]: E0306 03:01:00.238240 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73cb7a487fa961b2ce6016e504d494d1f7137a953f58766c43bf8e58adf94c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-x4r4t"
Mar 6 03:01:00.238530 kubelet[2800]: E0306 03:01:00.238268 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"73cb7a487fa961b2ce6016e504d494d1f7137a953f58766c43bf8e58adf94c7a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-9f7667bb8-x4r4t"
Mar 6 03:01:00.238530 kubelet[2800]: E0306 03:01:00.238345 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-9f7667bb8-x4r4t_calico-system(2658b953-d894-4e8f-ae5f-28690d98a0a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-9f7667bb8-x4r4t_calico-system(2658b953-d894-4e8f-ae5f-28690d98a0a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"73cb7a487fa961b2ce6016e504d494d1f7137a953f58766c43bf8e58adf94c7a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-9f7667bb8-x4r4t" podUID="2658b953-d894-4e8f-ae5f-28690d98a0a9"
Mar 6 03:01:00.239739 containerd[1532]: time="2026-03-06T03:01:00.238973535Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-cl7tg,Uid:cf647f0e-6583-4488-a6c3-323a650da7e7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c865462b0907283185f716c9e713f1de28bd8ba618b864bd0dc7ac1fd6c9e9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.240113 kubelet[2800]: E0306 03:01:00.239818 2800 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c865462b0907283185f716c9e713f1de28bd8ba618b864bd0dc7ac1fd6c9e9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 6 03:01:00.240113 kubelet[2800]: E0306 03:01:00.239993 2800 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c865462b0907283185f716c9e713f1de28bd8ba618b864bd0dc7ac1fd6c9e9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6857b68657-cl7tg"
Mar 6 03:01:00.240113 kubelet[2800]: E0306 03:01:00.240026 2800 kuberuntime_manager.go:1558] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c865462b0907283185f716c9e713f1de28bd8ba618b864bd0dc7ac1fd6c9e9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-apiserver-6857b68657-cl7tg"
Mar 6 03:01:00.240687 kubelet[2800]: E0306 03:01:00.240345 2800 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6857b68657-cl7tg_calico-system(cf647f0e-6583-4488-a6c3-323a650da7e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6857b68657-cl7tg_calico-system(cf647f0e-6583-4488-a6c3-323a650da7e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c865462b0907283185f716c9e713f1de28bd8ba618b864bd0dc7ac1fd6c9e9cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-apiserver-6857b68657-cl7tg" podUID="cf647f0e-6583-4488-a6c3-323a650da7e7"
Mar 6 03:01:00.532130 containerd[1532]: time="2026-03-06T03:01:00.532070314Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 6 03:01:00.546492 containerd[1532]: time="2026-03-06T03:01:00.544971399Z" level=info msg="Container ddcf62315f4351363b03d805640462a8680cc7df440608f84b6721ddfe18d580: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:00.560361 containerd[1532]: time="2026-03-06T03:01:00.560296747Z" level=info msg="CreateContainer within sandbox \"7152190cb7c4250d22ebd0d93a9591b8c654d7498a9378c90ee3d347a84243c3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ddcf62315f4351363b03d805640462a8680cc7df440608f84b6721ddfe18d580\""
Mar 6 03:01:00.561393 containerd[1532]: time="2026-03-06T03:01:00.561339793Z" level=info msg="StartContainer for \"ddcf62315f4351363b03d805640462a8680cc7df440608f84b6721ddfe18d580\""
Mar 6 03:01:00.563682 containerd[1532]: time="2026-03-06T03:01:00.563631652Z" level=info msg="connecting to shim ddcf62315f4351363b03d805640462a8680cc7df440608f84b6721ddfe18d580" address="unix:///run/containerd/s/d5a22d18ef363f2a2c2aa1791f6b7129ec86ecca9d5cc0382c7937a24e239493" protocol=ttrpc version=3
Mar 6 03:01:00.597153 systemd[1]: Started cri-containerd-ddcf62315f4351363b03d805640462a8680cc7df440608f84b6721ddfe18d580.scope - libcontainer container ddcf62315f4351363b03d805640462a8680cc7df440608f84b6721ddfe18d580.
Mar 6 03:01:00.715830 containerd[1532]: time="2026-03-06T03:01:00.714271643Z" level=info msg="StartContainer for \"ddcf62315f4351363b03d805640462a8680cc7df440608f84b6721ddfe18d580\" returns successfully"
Mar 6 03:01:00.951537 systemd[1]: run-netns-cni\x2dccb5761e\x2de150\x2d54ec\x2d0111\x2d4261e29c99f9.mount: Deactivated successfully.
Mar 6 03:01:00.951694 systemd[1]: run-netns-cni\x2d937c1b4b\x2ddb86\x2df720\x2d24db\x2d16a66839e826.mount: Deactivated successfully.
Mar 6 03:01:00.951806 systemd[1]: run-netns-cni\x2d5a0ba813\x2d28d5\x2d8052\x2dea43\x2d12210e27255e.mount: Deactivated successfully.
Mar 6 03:01:00.951901 systemd[1]: run-netns-cni\x2d9c7588d4\x2d42c6\x2d5b5d\x2d655b\x2d00bfe6850462.mount: Deactivated successfully.
Mar 6 03:01:01.005799 kubelet[2800]: I0306 03:01:01.005146 2800 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-nginx-config\" (UniqueName: \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-nginx-config\") pod \"11f62943-290a-4f7b-a132-33bce0f8a5ea\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " Mar 6 03:01:01.005799 kubelet[2800]: I0306 03:01:01.005215 2800 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/secret/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-backend-key-pair\") pod \"11f62943-290a-4f7b-a132-33bce0f8a5ea\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " Mar 6 03:01:01.005799 kubelet[2800]: I0306 03:01:01.005258 2800 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-ca-bundle\") pod \"11f62943-290a-4f7b-a132-33bce0f8a5ea\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " Mar 6 03:01:01.005799 kubelet[2800]: I0306 03:01:01.005313 2800 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kubernetes.io/projected/11f62943-290a-4f7b-a132-33bce0f8a5ea-kube-api-access-s67zb\" (UniqueName: \"kubernetes.io/projected/11f62943-290a-4f7b-a132-33bce0f8a5ea-kube-api-access-s67zb\") pod \"11f62943-290a-4f7b-a132-33bce0f8a5ea\" (UID: \"11f62943-290a-4f7b-a132-33bce0f8a5ea\") " Mar 6 03:01:01.007834 kubelet[2800]: I0306 03:01:01.007797 2800 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-nginx-config" pod "11f62943-290a-4f7b-a132-33bce0f8a5ea" (UID: "11f62943-290a-4f7b-a132-33bce0f8a5ea"). 
InnerVolumeSpecName "nginx-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 03:01:01.008410 kubelet[2800]: I0306 03:01:01.008334 2800 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-ca-bundle" pod "11f62943-290a-4f7b-a132-33bce0f8a5ea" (UID: "11f62943-290a-4f7b-a132-33bce0f8a5ea"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Mar 6 03:01:01.016193 kubelet[2800]: I0306 03:01:01.016148 2800 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-backend-key-pair" pod "11f62943-290a-4f7b-a132-33bce0f8a5ea" (UID: "11f62943-290a-4f7b-a132-33bce0f8a5ea"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Mar 6 03:01:01.018987 kubelet[2800]: I0306 03:01:01.018929 2800 operation_generator.go:779] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11f62943-290a-4f7b-a132-33bce0f8a5ea-kube-api-access-s67zb" pod "11f62943-290a-4f7b-a132-33bce0f8a5ea" (UID: "11f62943-290a-4f7b-a132-33bce0f8a5ea"). InnerVolumeSpecName "kube-api-access-s67zb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Mar 6 03:01:01.019970 systemd[1]: var-lib-kubelet-pods-11f62943\x2d290a\x2d4f7b\x2da132\x2d33bce0f8a5ea-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ds67zb.mount: Deactivated successfully. Mar 6 03:01:01.020147 systemd[1]: var-lib-kubelet-pods-11f62943\x2d290a\x2d4f7b\x2da132\x2d33bce0f8a5ea-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Mar 6 03:01:01.106191 kubelet[2800]: I0306 03:01:01.106035 2800 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s67zb\" (UniqueName: \"kubernetes.io/projected/11f62943-290a-4f7b-a132-33bce0f8a5ea-kube-api-access-s67zb\") on node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" DevicePath \"\"" Mar 6 03:01:01.106191 kubelet[2800]: I0306 03:01:01.106083 2800 reconciler_common.go:299] "Volume detached for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-nginx-config\") on node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" DevicePath \"\"" Mar 6 03:01:01.106191 kubelet[2800]: I0306 03:01:01.106105 2800 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-backend-key-pair\") on node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" DevicePath \"\"" Mar 6 03:01:01.106191 kubelet[2800]: I0306 03:01:01.106121 2800 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11f62943-290a-4f7b-a132-33bce0f8a5ea-whisker-ca-bundle\") on node \"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce\" DevicePath \"\"" Mar 6 03:01:01.534465 systemd[1]: Removed slice kubepods-besteffort-pod11f62943_290a_4f7b_a132_33bce0f8a5ea.slice - libcontainer container kubepods-besteffort-pod11f62943_290a_4f7b_a132_33bce0f8a5ea.slice. 
Mar 6 03:01:01.567895 kubelet[2800]: I0306 03:01:01.564992 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-node-bpq4z" podStartSLOduration=1.786823922 podStartE2EDuration="24.564960182s" podCreationTimestamp="2026-03-06 03:00:37 +0000 UTC" firstStartedPulling="2026-03-06 03:00:37.730156123 +0000 UTC m=+23.755114226" lastFinishedPulling="2026-03-06 03:01:00.508292377 +0000 UTC m=+46.533250486" observedRunningTime="2026-03-06 03:01:01.563119002 +0000 UTC m=+47.588077113" watchObservedRunningTime="2026-03-06 03:01:01.564960182 +0000 UTC m=+47.589918294" Mar 6 03:01:01.678220 systemd[1]: Created slice kubepods-besteffort-pod0d6b2b1e_2946_4519_a470_0b9fbcf949bd.slice - libcontainer container kubepods-besteffort-pod0d6b2b1e_2946_4519_a470_0b9fbcf949bd.slice. Mar 6 03:01:01.713405 kubelet[2800]: I0306 03:01:01.713211 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kslxc\" (UniqueName: \"kubernetes.io/projected/0d6b2b1e-2946-4519-a470-0b9fbcf949bd-kube-api-access-kslxc\") pod \"whisker-9986b44bf-kkx8p\" (UID: \"0d6b2b1e-2946-4519-a470-0b9fbcf949bd\") " pod="calico-system/whisker-9986b44bf-kkx8p" Mar 6 03:01:01.713663 kubelet[2800]: I0306 03:01:01.713473 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-config\" (UniqueName: \"kubernetes.io/configmap/0d6b2b1e-2946-4519-a470-0b9fbcf949bd-nginx-config\") pod \"whisker-9986b44bf-kkx8p\" (UID: \"0d6b2b1e-2946-4519-a470-0b9fbcf949bd\") " pod="calico-system/whisker-9986b44bf-kkx8p" Mar 6 03:01:01.715334 kubelet[2800]: I0306 03:01:01.713711 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0d6b2b1e-2946-4519-a470-0b9fbcf949bd-whisker-backend-key-pair\") pod \"whisker-9986b44bf-kkx8p\" (UID: \"0d6b2b1e-2946-4519-a470-0b9fbcf949bd\") " 
pod="calico-system/whisker-9986b44bf-kkx8p" Mar 6 03:01:01.715334 kubelet[2800]: I0306 03:01:01.713796 2800 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d6b2b1e-2946-4519-a470-0b9fbcf949bd-whisker-ca-bundle\") pod \"whisker-9986b44bf-kkx8p\" (UID: \"0d6b2b1e-2946-4519-a470-0b9fbcf949bd\") " pod="calico-system/whisker-9986b44bf-kkx8p" Mar 6 03:01:01.987957 containerd[1532]: time="2026-03-06T03:01:01.987880018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-9986b44bf-kkx8p,Uid:0d6b2b1e-2946-4519-a470-0b9fbcf949bd,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:02.146096 systemd-networkd[1442]: cali966777e666c: Link UP Mar 6 03:01:02.148501 systemd-networkd[1442]: cali966777e666c: Gained carrier Mar 6 03:01:02.184388 containerd[1532]: 2026-03-06 03:01:02.025 [ERROR][4046] cni-plugin/utils.go 116: File does not exist, skipping the error since RequireMTUFile is false error=open /var/lib/calico/mtu: no such file or directory filename="/var/lib/calico/mtu" Mar 6 03:01:02.184388 containerd[1532]: 2026-03-06 03:01:02.040 [INFO][4046] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0 whisker-9986b44bf- calico-system 0d6b2b1e-2946-4519-a470-0b9fbcf949bd 943 0 2026-03-06 03:01:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:9986b44bf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce whisker-9986b44bf-kkx8p eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali966777e666c [] [] }} ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" 
Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-" Mar 6 03:01:02.184388 containerd[1532]: 2026-03-06 03:01:02.041 [INFO][4046] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" Mar 6 03:01:02.184388 containerd[1532]: 2026-03-06 03:01:02.076 [INFO][4057] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" HandleID="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 03:01:02.085 [INFO][4057] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" HandleID="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fd860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"whisker-9986b44bf-kkx8p", "timestamp":"2026-03-06 03:01:02.076625585 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0001882c0)} Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 
03:01:02.085 [INFO][4057] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 03:01:02.085 [INFO][4057] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 03:01:02.086 [INFO][4057] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce' Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 03:01:02.089 [INFO][4057] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 03:01:02.096 [INFO][4057] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 03:01:02.102 [INFO][4057] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.184749 containerd[1532]: 2026-03-06 03:01:02.104 [INFO][4057] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.107 [INFO][4057] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.107 [INFO][4057] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.109 [INFO][4057] ipam/ipam.go 1806: Creating new handle: 
k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80 Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.113 [INFO][4057] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.121 [INFO][4057] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.65/26] block=192.168.38.64/26 handle="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.122 [INFO][4057] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.65/26] handle="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.122 [INFO][4057] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 03:01:02.188136 containerd[1532]: 2026-03-06 03:01:02.122 [INFO][4057] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.65/26] IPv6=[] ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" HandleID="k8s-pod-network.9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" Mar 6 03:01:02.188537 containerd[1532]: 2026-03-06 03:01:02.126 [INFO][4046] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0", GenerateName:"whisker-9986b44bf-", Namespace:"calico-system", SelfLink:"", UID:"0d6b2b1e-2946-4519-a470-0b9fbcf949bd", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 1, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"9986b44bf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"whisker-9986b44bf-kkx8p", Endpoint:"eth0", ServiceAccountName:"whisker", 
IPNetworks:[]string{"192.168.38.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali966777e666c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:02.188663 containerd[1532]: 2026-03-06 03:01:02.126 [INFO][4046] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.38.65/32] ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" Mar 6 03:01:02.188663 containerd[1532]: 2026-03-06 03:01:02.127 [INFO][4046] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali966777e666c ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" Mar 6 03:01:02.188663 containerd[1532]: 2026-03-06 03:01:02.150 [INFO][4046] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" Mar 6 03:01:02.189037 containerd[1532]: 2026-03-06 03:01:02.152 [INFO][4046] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0", GenerateName:"whisker-9986b44bf-", Namespace:"calico-system", SelfLink:"", UID:"0d6b2b1e-2946-4519-a470-0b9fbcf949bd", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 1, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"9986b44bf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80", Pod:"whisker-9986b44bf-kkx8p", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.38.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali966777e666c", MAC:"52:51:93:cf:04:39", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:02.189168 containerd[1532]: 2026-03-06 03:01:02.178 [INFO][4046] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" Namespace="calico-system" Pod="whisker-9986b44bf-kkx8p" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-whisker--9986b44bf--kkx8p-eth0" Mar 6 03:01:02.247816 kubelet[2800]: I0306 
03:01:02.247256 2800 kubelet_volumes.go:161] "Cleaned up orphaned pod volumes dir" podUID="11f62943-290a-4f7b-a132-33bce0f8a5ea" path="/var/lib/kubelet/pods/11f62943-290a-4f7b-a132-33bce0f8a5ea/volumes" Mar 6 03:01:02.271671 containerd[1532]: time="2026-03-06T03:01:02.270957275Z" level=info msg="connecting to shim 9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80" address="unix:///run/containerd/s/2d2efe183605593a83c7acc11ed8cd594ac0df9abf959083eeff16d5be4afb87" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:02.346680 systemd[1]: Started cri-containerd-9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80.scope - libcontainer container 9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80. Mar 6 03:01:02.521286 containerd[1532]: time="2026-03-06T03:01:02.521137661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-9986b44bf-kkx8p,Uid:0d6b2b1e-2946-4519-a470-0b9fbcf949bd,Namespace:calico-system,Attempt:0,} returns sandbox id \"9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80\"" Mar 6 03:01:02.531392 containerd[1532]: time="2026-03-06T03:01:02.531335288Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\"" Mar 6 03:01:03.466282 systemd-networkd[1442]: cali966777e666c: Gained IPv6LL Mar 6 03:01:03.642463 containerd[1532]: time="2026-03-06T03:01:03.642390390Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:03.644050 containerd[1532]: time="2026-03-06T03:01:03.643991837Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.31.4: active requests=0, bytes read=6039889" Mar 6 03:01:03.645447 containerd[1532]: time="2026-03-06T03:01:03.645378172Z" level=info msg="ImageCreate event name:\"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:03.648628 containerd[1532]: 
time="2026-03-06T03:01:03.648558386Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:03.649862 containerd[1532]: time="2026-03-06T03:01:03.649655161Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.31.4\" with image id \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:9690cd395efad501f2e0c40ce4969d87b736ae2e5ed454644e7b0fd8f756bfbc\", size \"7595926\" in 1.117982103s" Mar 6 03:01:03.649862 containerd[1532]: time="2026-03-06T03:01:03.649701932Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.31.4\" returns image reference \"sha256:c02b0051502f3aa7f0815d838ea93b53dfb6bd13f185d229260e08200daf7cf7\"" Mar 6 03:01:03.656966 containerd[1532]: time="2026-03-06T03:01:03.656841920Z" level=info msg="CreateContainer within sandbox \"9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Mar 6 03:01:03.670185 containerd[1532]: time="2026-03-06T03:01:03.667799906Z" level=info msg="Container 834be267801f0983234e949bbf7ef0437dd9a8af67cf6217a3e6805e1f97ea14: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:03.687620 containerd[1532]: time="2026-03-06T03:01:03.687561768Z" level=info msg="CreateContainer within sandbox \"9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"834be267801f0983234e949bbf7ef0437dd9a8af67cf6217a3e6805e1f97ea14\"" Mar 6 03:01:03.689253 containerd[1532]: time="2026-03-06T03:01:03.689202711Z" level=info msg="StartContainer for \"834be267801f0983234e949bbf7ef0437dd9a8af67cf6217a3e6805e1f97ea14\"" Mar 6 03:01:03.691167 containerd[1532]: 
time="2026-03-06T03:01:03.691004630Z" level=info msg="connecting to shim 834be267801f0983234e949bbf7ef0437dd9a8af67cf6217a3e6805e1f97ea14" address="unix:///run/containerd/s/2d2efe183605593a83c7acc11ed8cd594ac0df9abf959083eeff16d5be4afb87" protocol=ttrpc version=3 Mar 6 03:01:03.726151 systemd[1]: Started cri-containerd-834be267801f0983234e949bbf7ef0437dd9a8af67cf6217a3e6805e1f97ea14.scope - libcontainer container 834be267801f0983234e949bbf7ef0437dd9a8af67cf6217a3e6805e1f97ea14. Mar 6 03:01:03.829575 containerd[1532]: time="2026-03-06T03:01:03.829522377Z" level=info msg="StartContainer for \"834be267801f0983234e949bbf7ef0437dd9a8af67cf6217a3e6805e1f97ea14\" returns successfully" Mar 6 03:01:03.832932 containerd[1532]: time="2026-03-06T03:01:03.832887806Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\"" Mar 6 03:01:04.665337 kubelet[2800]: I0306 03:01:04.664918 2800 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness" Mar 6 03:01:05.591962 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2277029061.mount: Deactivated successfully. 
Mar 6 03:01:05.621821 containerd[1532]: time="2026-03-06T03:01:05.621606998Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:05.625343 containerd[1532]: time="2026-03-06T03:01:05.625289760Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.31.4: active requests=0, bytes read=17609475" Mar 6 03:01:05.629796 containerd[1532]: time="2026-03-06T03:01:05.628444200Z" level=info msg="ImageCreate event name:\"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:05.635375 containerd[1532]: time="2026-03-06T03:01:05.635200096Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:05.639190 containerd[1532]: time="2026-03-06T03:01:05.639118641Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" with image id \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:d252061aa298c4b17cf092517b5126af97cf95e0f56b21281b95a5f8702f15fc\", size \"17609305\" in 1.806172274s" Mar 6 03:01:05.639415 containerd[1532]: time="2026-03-06T03:01:05.639387280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.31.4\" returns image reference \"sha256:0749e3da0398e8402eb119f09acf145e5dd9759adb6eb3802ad6dc1b9bbedf1c\"" Mar 6 03:01:05.654127 containerd[1532]: time="2026-03-06T03:01:05.654077286Z" level=info msg="CreateContainer within sandbox \"9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Mar 6 03:01:05.686982 
containerd[1532]: time="2026-03-06T03:01:05.686930353Z" level=info msg="Container 7c785a7f46f2d9f10457e5d366caefc1705409420566696dc2ee47f61a158d35: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:05.701869 containerd[1532]: time="2026-03-06T03:01:05.701590449Z" level=info msg="CreateContainer within sandbox \"9c90d8081893ded5c29a326b2f485fbf06e3702fd3db3d955eb27a8cfb90bf80\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"7c785a7f46f2d9f10457e5d366caefc1705409420566696dc2ee47f61a158d35\"" Mar 6 03:01:05.704334 containerd[1532]: time="2026-03-06T03:01:05.704292916Z" level=info msg="StartContainer for \"7c785a7f46f2d9f10457e5d366caefc1705409420566696dc2ee47f61a158d35\"" Mar 6 03:01:05.709041 containerd[1532]: time="2026-03-06T03:01:05.708991681Z" level=info msg="connecting to shim 7c785a7f46f2d9f10457e5d366caefc1705409420566696dc2ee47f61a158d35" address="unix:///run/containerd/s/2d2efe183605593a83c7acc11ed8cd594ac0df9abf959083eeff16d5be4afb87" protocol=ttrpc version=3 Mar 6 03:01:05.755367 systemd[1]: Started cri-containerd-7c785a7f46f2d9f10457e5d366caefc1705409420566696dc2ee47f61a158d35.scope - libcontainer container 7c785a7f46f2d9f10457e5d366caefc1705409420566696dc2ee47f61a158d35. 
Mar 6 03:01:05.878299 containerd[1532]: time="2026-03-06T03:01:05.877874323Z" level=info msg="StartContainer for \"7c785a7f46f2d9f10457e5d366caefc1705409420566696dc2ee47f61a158d35\" returns successfully"
Mar 6 03:01:06.349158 ntpd[1656]: Listen normally on 6 cali966777e666c [fe80::ecee:eeff:feee:eeee%4]:123
Mar 6 03:01:06.350790 ntpd[1656]: 6 Mar 03:01:06 ntpd[1656]: Listen normally on 6 cali966777e666c [fe80::ecee:eeff:feee:eeee%4]:123
Mar 6 03:01:06.578385 kubelet[2800]: I0306 03:01:06.577888 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/whisker-9986b44bf-kkx8p" podStartSLOduration=2.465025479 podStartE2EDuration="5.577862368s" podCreationTimestamp="2026-03-06 03:01:01 +0000 UTC" firstStartedPulling="2026-03-06 03:01:02.529078642 +0000 UTC m=+48.554036733" lastFinishedPulling="2026-03-06 03:01:05.641915537 +0000 UTC m=+51.666873622" observedRunningTime="2026-03-06 03:01:06.575990288 +0000 UTC m=+52.600948405" watchObservedRunningTime="2026-03-06 03:01:06.577862368 +0000 UTC m=+52.602820482"
Mar 6 03:01:06.618521 systemd-networkd[1442]: vxlan.calico: Link UP
Mar 6 03:01:06.618534 systemd-networkd[1442]: vxlan.calico: Gained carrier
Mar 6 03:01:07.882795 systemd-networkd[1442]: vxlan.calico: Gained IPv6LL
Mar 6 03:01:10.349160 ntpd[1656]: Listen normally on 7 vxlan.calico 192.168.38.64:123
Mar 6 03:01:10.349656 ntpd[1656]: 6 Mar 03:01:10 ntpd[1656]: Listen normally on 7 vxlan.calico 192.168.38.64:123
Mar 6 03:01:10.349656 ntpd[1656]: 6 Mar 03:01:10 ntpd[1656]: Listen normally on 8 vxlan.calico [fe80::645b:efff:fef8:2322%5]:123
Mar 6 03:01:10.349249 ntpd[1656]: Listen normally on 8 vxlan.calico [fe80::645b:efff:fef8:2322%5]:123
Mar 6 03:01:11.231615 containerd[1532]: time="2026-03-06T03:01:11.231456614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-9kf8c,Uid:7742d6cc-2fcb-4303-99a2-f9d5a0e44f95,Namespace:kube-system,Attempt:0,}"
Mar 6 03:01:11.234027 containerd[1532]: time="2026-03-06T03:01:11.233915498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-cl7tg,Uid:cf647f0e-6583-4488-a6c3-323a650da7e7,Namespace:calico-system,Attempt:0,}"
Mar 6 03:01:11.238303 containerd[1532]: time="2026-03-06T03:01:11.237256829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2m65n,Uid:3669bc8e-a01a-47d2-a2d9-628a14d7e7f5,Namespace:calico-system,Attempt:0,}"
Mar 6 03:01:11.244357 containerd[1532]: time="2026-03-06T03:01:11.244286308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-x4r4t,Uid:2658b953-d894-4e8f-ae5f-28690d98a0a9,Namespace:calico-system,Attempt:0,}"
Mar 6 03:01:11.594697 systemd-networkd[1442]: calia615b41eeea: Link UP
Mar 6 03:01:11.601206 systemd-networkd[1442]: calia615b41eeea: Gained carrier
Mar 6 03:01:11.638024 containerd[1532]: 2026-03-06 03:01:11.371 [INFO][4493] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0 calico-apiserver-6857b68657- calico-system cf647f0e-6583-4488-a6c3-323a650da7e7 892 0 2026-03-06 03:00:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6857b68657 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce calico-apiserver-6857b68657-cl7tg eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] calia615b41eeea [] [] }} ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-"
Mar 6 03:01:11.638024 containerd[1532]: 2026-03-06 03:01:11.372 [INFO][4493] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0"
Mar 6 03:01:11.638024 containerd[1532]: 2026-03-06 03:01:11.480 [INFO][4534] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" HandleID="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0"
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.503 [INFO][4534] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" HandleID="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003806d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"calico-apiserver-6857b68657-cl7tg", "timestamp":"2026-03-06 03:01:11.480901952 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000188dc0)}
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.504 [INFO][4534] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.504 [INFO][4534] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.504 [INFO][4534] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce'
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.508 [INFO][4534] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.517 [INFO][4534] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.524 [INFO][4534] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.638369 containerd[1532]: 2026-03-06 03:01:11.527 [INFO][4534] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.532 [INFO][4534] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.532 [INFO][4534] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.538 [INFO][4534] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.545 [INFO][4534] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.566 [INFO][4534] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.66/26] block=192.168.38.64/26 handle="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.567 [INFO][4534] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.66/26] handle="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.567 [INFO][4534] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 6 03:01:11.639372 containerd[1532]: 2026-03-06 03:01:11.567 [INFO][4534] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.66/26] IPv6=[] ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" HandleID="k8s-pod-network.2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0"
Mar 6 03:01:11.639725 containerd[1532]: 2026-03-06 03:01:11.572 [INFO][4493] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0", GenerateName:"calico-apiserver-6857b68657-", Namespace:"calico-system", SelfLink:"", UID:"cf647f0e-6583-4488-a6c3-323a650da7e7", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6857b68657", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"calico-apiserver-6857b68657-cl7tg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calia615b41eeea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 03:01:11.640557 containerd[1532]: 2026-03-06 03:01:11.572 [INFO][4493] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.38.66/32] ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0"
Mar 6 03:01:11.640557 containerd[1532]: 2026-03-06 03:01:11.572 [INFO][4493] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia615b41eeea ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0"
Mar 6 03:01:11.640557 containerd[1532]: 2026-03-06 03:01:11.607 [INFO][4493] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0"
Mar 6 03:01:11.640716 containerd[1532]: 2026-03-06 03:01:11.609 [INFO][4493] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0", GenerateName:"calico-apiserver-6857b68657-", Namespace:"calico-system", SelfLink:"", UID:"cf647f0e-6583-4488-a6c3-323a650da7e7", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 35, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6857b68657", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e", Pod:"calico-apiserver-6857b68657-cl7tg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"calia615b41eeea", MAC:"aa:00:a5:7b:95:48", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 03:01:11.641359 containerd[1532]: 2026-03-06 03:01:11.634 [INFO][4493] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" Namespace="calico-system" Pod="calico-apiserver-6857b68657-cl7tg" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--cl7tg-eth0"
Mar 6 03:01:11.705445 containerd[1532]: time="2026-03-06T03:01:11.705297720Z" level=info msg="connecting to shim 2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e" address="unix:///run/containerd/s/e4840d2a3c109d41539238ea7d3009c9437cf9e6bf2e00757e5e851c4785b1e5" namespace=k8s.io protocol=ttrpc version=3
Mar 6 03:01:11.727510 systemd-networkd[1442]: calif569210a8f7: Link UP
Mar 6 03:01:11.732622 systemd-networkd[1442]: calif569210a8f7: Gained carrier
Mar 6 03:01:11.790554 systemd[1]: Started cri-containerd-2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e.scope - libcontainer container 2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e.
Mar 6 03:01:11.796698 containerd[1532]: 2026-03-06 03:01:11.424 [INFO][4501] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0 csi-node-driver- calico-system 3669bc8e-a01a-47d2-a2d9-628a14d7e7f5 745 0 2026-03-06 03:00:37 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:589b8b8d94 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce csi-node-driver-2m65n eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif569210a8f7 [] [] }} ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-"
Mar 6 03:01:11.796698 containerd[1532]: 2026-03-06 03:01:11.425 [INFO][4501] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0"
Mar 6 03:01:11.796698 containerd[1532]: 2026-03-06 03:01:11.553 [INFO][4541] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" HandleID="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0"
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.585 [INFO][4541] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" HandleID="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fdd0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"csi-node-driver-2m65n", "timestamp":"2026-03-06 03:01:11.553200465 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000188000)}
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.585 [INFO][4541] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.585 [INFO][4541] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.586 [INFO][4541] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce'
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.612 [INFO][4541] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.640 [INFO][4541] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.656 [INFO][4541] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.797101 containerd[1532]: 2026-03-06 03:01:11.659 [INFO][4541] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.664 [INFO][4541] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.664 [INFO][4541] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.667 [INFO][4541] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.674 [INFO][4541] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.691 [INFO][4541] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.67/26] block=192.168.38.64/26 handle="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.691 [INFO][4541] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.67/26] handle="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.691 [INFO][4541] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 6 03:01:11.798382 containerd[1532]: 2026-03-06 03:01:11.691 [INFO][4541] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.67/26] IPv6=[] ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" HandleID="k8s-pod-network.a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0"
Mar 6 03:01:11.798748 containerd[1532]: 2026-03-06 03:01:11.706 [INFO][4501] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3669bc8e-a01a-47d2-a2d9-628a14d7e7f5", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 37, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"csi-node-driver-2m65n", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.38.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif569210a8f7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 03:01:11.798879 containerd[1532]: 2026-03-06 03:01:11.706 [INFO][4501] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.38.67/32] ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0"
Mar 6 03:01:11.798879 containerd[1532]: 2026-03-06 03:01:11.707 [INFO][4501] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif569210a8f7 ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0"
Mar 6 03:01:11.798879 containerd[1532]: 2026-03-06 03:01:11.732 [INFO][4501] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0"
Mar 6 03:01:11.800115 containerd[1532]: 2026-03-06 03:01:11.743 [INFO][4501] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3669bc8e-a01a-47d2-a2d9-628a14d7e7f5", ResourceVersion:"745", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 37, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"589b8b8d94", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b", Pod:"csi-node-driver-2m65n", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.38.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif569210a8f7", MAC:"4a:f5:8f:7a:a9:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 03:01:11.800247 containerd[1532]: 2026-03-06 03:01:11.785 [INFO][4501] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" Namespace="calico-system" Pod="csi-node-driver-2m65n" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-csi--node--driver--2m65n-eth0"
Mar 6 03:01:11.858306 systemd-networkd[1442]: caliac10b8e6b4b: Link UP
Mar 6 03:01:11.860977 systemd-networkd[1442]: caliac10b8e6b4b: Gained carrier
Mar 6 03:01:11.903327 containerd[1532]: time="2026-03-06T03:01:11.903264362Z" level=info msg="connecting to shim a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b" address="unix:///run/containerd/s/fb6f57e758a2ff1c92c82db1c6e3104ca6970435bd4ebdf364571aebecf23683" namespace=k8s.io protocol=ttrpc version=3
Mar 6 03:01:11.926078 containerd[1532]: 2026-03-06 03:01:11.449 [INFO][4486] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0 coredns-7d764666f9- kube-system 7742d6cc-2fcb-4303-99a2-f9d5a0e44f95 887 0 2026-03-06 03:00:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce coredns-7d764666f9-9kf8c eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliac10b8e6b4b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-"
Mar 6 03:01:11.926078 containerd[1532]: 2026-03-06 03:01:11.451 [INFO][4486] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0"
Mar 6 03:01:11.926078 containerd[1532]: 2026-03-06 03:01:11.580 [INFO][4552] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" HandleID="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0"
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.610 [INFO][4552] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" HandleID="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f600), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"coredns-7d764666f9-9kf8c", "timestamp":"2026-03-06 03:01:11.580574542 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc00060eb00)}
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.612 [INFO][4552] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.692 [INFO][4552] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.692 [INFO][4552] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce'
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.713 [INFO][4552] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.737 [INFO][4552] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.763 [INFO][4552] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.926662 containerd[1532]: 2026-03-06 03:01:11.768 [INFO][4552] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.775 [INFO][4552] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.775 [INFO][4552] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.794 [INFO][4552] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.807 [INFO][4552] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.821 [INFO][4552] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.68/26] block=192.168.38.64/26 handle="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.821 [INFO][4552] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.68/26] handle="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.821 [INFO][4552] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 6 03:01:11.927496 containerd[1532]: 2026-03-06 03:01:11.821 [INFO][4552] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.68/26] IPv6=[] ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" HandleID="k8s-pod-network.7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0"
Mar 6 03:01:11.928821 containerd[1532]: 2026-03-06 03:01:11.825 [INFO][4486] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7742d6cc-2fcb-4303-99a2-f9d5a0e44f95", ResourceVersion:"887", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 19, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"coredns-7d764666f9-9kf8c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliac10b8e6b4b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 03:01:11.928821 containerd[1532]: 2026-03-06 03:01:11.826 [INFO][4486] cni-plugin/k8s.go 419: Calico CNI using
IPs: [192.168.38.68/32] ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0" Mar 6 03:01:11.928821 containerd[1532]: 2026-03-06 03:01:11.826 [INFO][4486] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliac10b8e6b4b ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0" Mar 6 03:01:11.928821 containerd[1532]: 2026-03-06 03:01:11.873 [INFO][4486] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0" Mar 6 03:01:11.929667 containerd[1532]: 2026-03-06 03:01:11.875 [INFO][4486] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"7742d6cc-2fcb-4303-99a2-f9d5a0e44f95", ResourceVersion:"887", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 19, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7", Pod:"coredns-7d764666f9-9kf8c", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliac10b8e6b4b", MAC:"5a:f3:2f:bc:b4:69", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:11.929667 containerd[1532]: 2026-03-06 03:01:11.913 [INFO][4486] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" Namespace="kube-system" 
Pod="coredns-7d764666f9-9kf8c" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--9kf8c-eth0" Mar 6 03:01:11.940104 systemd-networkd[1442]: cali4e96f9320ed: Link UP Mar 6 03:01:11.941604 systemd-networkd[1442]: cali4e96f9320ed: Gained carrier Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.423 [INFO][4507] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0 goldmane-9f7667bb8- calico-system 2658b953-d894-4e8f-ae5f-28690d98a0a9 891 0 2026-03-06 03:00:36 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:9f7667bb8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce goldmane-9f7667bb8-x4r4t eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali4e96f9320ed [] [] }} ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.424 [INFO][4507] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.618 [INFO][4543] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" 
HandleID="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.641 [INFO][4543] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" HandleID="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000125d40), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"goldmane-9f7667bb8-x4r4t", "timestamp":"2026-03-06 03:01:11.618471685 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000302000)} Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.641 [INFO][4543] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.822 [INFO][4543] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.822 [INFO][4543] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce' Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.826 [INFO][4543] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.853 [INFO][4543] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.868 [INFO][4543] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.874 [INFO][4543] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.880 [INFO][4543] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.881 [INFO][4543] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.888 [INFO][4543] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.903 [INFO][4543] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 
handle="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.928 [INFO][4543] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.69/26] block=192.168.38.64/26 handle="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.928 [INFO][4543] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.69/26] handle="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.928 [INFO][4543] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 03:01:12.003810 containerd[1532]: 2026-03-06 03:01:11.928 [INFO][4543] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.69/26] IPv6=[] ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" HandleID="k8s-pod-network.6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" Mar 6 03:01:12.004988 containerd[1532]: 2026-03-06 03:01:11.936 [INFO][4507] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0", GenerateName:"goldmane-9f7667bb8-", 
Namespace:"calico-system", SelfLink:"", UID:"2658b953-d894-4e8f-ae5f-28690d98a0a9", ResourceVersion:"891", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"goldmane-9f7667bb8-x4r4t", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.38.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4e96f9320ed", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:12.004988 containerd[1532]: 2026-03-06 03:01:11.936 [INFO][4507] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.38.69/32] ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" Mar 6 03:01:12.004988 containerd[1532]: 2026-03-06 03:01:11.936 [INFO][4507] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4e96f9320ed ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" 
WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" Mar 6 03:01:12.004988 containerd[1532]: 2026-03-06 03:01:11.943 [INFO][4507] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" Mar 6 03:01:12.004988 containerd[1532]: 2026-03-06 03:01:11.945 [INFO][4507] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0", GenerateName:"goldmane-9f7667bb8-", Namespace:"calico-system", SelfLink:"", UID:"2658b953-d894-4e8f-ae5f-28690d98a0a9", ResourceVersion:"891", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"9f7667bb8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", 
ContainerID:"6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e", Pod:"goldmane-9f7667bb8-x4r4t", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.38.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4e96f9320ed", MAC:"b6:25:25:ba:ad:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:12.004988 containerd[1532]: 2026-03-06 03:01:11.979 [INFO][4507] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" Namespace="calico-system" Pod="goldmane-9f7667bb8-x4r4t" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-goldmane--9f7667bb8--x4r4t-eth0" Mar 6 03:01:12.029851 containerd[1532]: time="2026-03-06T03:01:12.028006175Z" level=info msg="connecting to shim 7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7" address="unix:///run/containerd/s/a0436bb602778faf9c9563c014a658c840220ec9ca636449293467ff806088d6" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:12.041053 systemd[1]: Started cri-containerd-a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b.scope - libcontainer container a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b. Mar 6 03:01:12.098422 containerd[1532]: time="2026-03-06T03:01:12.097425800Z" level=info msg="connecting to shim 6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e" address="unix:///run/containerd/s/6398bba15ba2a5e9f4c3aacdbdf201ad7642cde2ff85f08f93672f41dfcc542d" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:12.160408 systemd[1]: Started cri-containerd-7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7.scope - libcontainer container 7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7. 
Mar 6 03:01:12.171339 containerd[1532]: time="2026-03-06T03:01:12.171261267Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-cl7tg,Uid:cf647f0e-6583-4488-a6c3-323a650da7e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e\"" Mar 6 03:01:12.176132 containerd[1532]: time="2026-03-06T03:01:12.176083061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\"" Mar 6 03:01:12.217400 containerd[1532]: time="2026-03-06T03:01:12.217348279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2m65n,Uid:3669bc8e-a01a-47d2-a2d9-628a14d7e7f5,Namespace:calico-system,Attempt:0,} returns sandbox id \"a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b\"" Mar 6 03:01:12.218326 systemd[1]: Started cri-containerd-6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e.scope - libcontainer container 6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e. 
Mar 6 03:01:12.235311 containerd[1532]: time="2026-03-06T03:01:12.234169158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c9c5b7bf5-r8t6w,Uid:548ef683-b05f-40c7-8c04-ab888c23b7fb,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:12.240039 containerd[1532]: time="2026-03-06T03:01:12.239993222Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-bvfqk,Uid:0228e152-72ed-4332-a06f-c4b8ac58bc00,Namespace:kube-system,Attempt:0,}" Mar 6 03:01:12.396739 containerd[1532]: time="2026-03-06T03:01:12.396662082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-9kf8c,Uid:7742d6cc-2fcb-4303-99a2-f9d5a0e44f95,Namespace:kube-system,Attempt:0,} returns sandbox id \"7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7\"" Mar 6 03:01:12.414557 containerd[1532]: time="2026-03-06T03:01:12.413655061Z" level=info msg="CreateContainer within sandbox \"7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 6 03:01:12.472737 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1701099589.mount: Deactivated successfully. 
Mar 6 03:01:12.490607 containerd[1532]: time="2026-03-06T03:01:12.490555886Z" level=info msg="Container 167b9d717f8eb21bf9795c9720967e1216fb3193018d961fa2aeffdf35f216c5: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:12.505380 containerd[1532]: time="2026-03-06T03:01:12.505325072Z" level=info msg="CreateContainer within sandbox \"7f37ed68e75bbcb340bfcdcec365024526eb73c50b39926bbf7e59de1a54c6c7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"167b9d717f8eb21bf9795c9720967e1216fb3193018d961fa2aeffdf35f216c5\"" Mar 6 03:01:12.507350 containerd[1532]: time="2026-03-06T03:01:12.507307069Z" level=info msg="StartContainer for \"167b9d717f8eb21bf9795c9720967e1216fb3193018d961fa2aeffdf35f216c5\"" Mar 6 03:01:12.510064 containerd[1532]: time="2026-03-06T03:01:12.510009163Z" level=info msg="connecting to shim 167b9d717f8eb21bf9795c9720967e1216fb3193018d961fa2aeffdf35f216c5" address="unix:///run/containerd/s/a0436bb602778faf9c9563c014a658c840220ec9ca636449293467ff806088d6" protocol=ttrpc version=3 Mar 6 03:01:12.548541 systemd[1]: Started cri-containerd-167b9d717f8eb21bf9795c9720967e1216fb3193018d961fa2aeffdf35f216c5.scope - libcontainer container 167b9d717f8eb21bf9795c9720967e1216fb3193018d961fa2aeffdf35f216c5. 
Mar 6 03:01:12.685501 containerd[1532]: time="2026-03-06T03:01:12.684489823Z" level=info msg="StartContainer for \"167b9d717f8eb21bf9795c9720967e1216fb3193018d961fa2aeffdf35f216c5\" returns successfully" Mar 6 03:01:12.723928 containerd[1532]: time="2026-03-06T03:01:12.723621977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-9f7667bb8-x4r4t,Uid:2658b953-d894-4e8f-ae5f-28690d98a0a9,Namespace:calico-system,Attempt:0,} returns sandbox id \"6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e\"" Mar 6 03:01:12.831874 systemd-networkd[1442]: calia6a6272503e: Link UP Mar 6 03:01:12.833757 systemd-networkd[1442]: calia6a6272503e: Gained carrier Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.472 [INFO][4780] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0 coredns-7d764666f9- kube-system 0228e152-72ed-4332-a06f-c4b8ac58bc00 885 0 2026-03-06 03:00:19 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7d764666f9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce coredns-7d764666f9-bvfqk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia6a6272503e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Namespace="kube-system" Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.474 [INFO][4780] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" 
Namespace="kube-system" Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.745 [INFO][4819] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" HandleID="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.758 [INFO][4819] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" HandleID="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c8b90), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"coredns-7d764666f9-bvfqk", "timestamp":"2026-03-06 03:01:12.745847912 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc0003b66e0)} Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.759 [INFO][4819] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.759 [INFO][4819] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. 
Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.759 [INFO][4819] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce' Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.764 [INFO][4819] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.788 [INFO][4819] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.796 [INFO][4819] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.799 [INFO][4819] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.803 [INFO][4819] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.803 [INFO][4819] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.806 [INFO][4819] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707 Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.813 [INFO][4819] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 
handle="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.822 [INFO][4819] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.70/26] block=192.168.38.64/26 handle="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.823 [INFO][4819] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.70/26] handle="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.823 [INFO][4819] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. Mar 6 03:01:12.857643 containerd[1532]: 2026-03-06 03:01:12.823 [INFO][4819] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.70/26] IPv6=[] ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" HandleID="k8s-pod-network.c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" Mar 6 03:01:12.860403 containerd[1532]: 2026-03-06 03:01:12.826 [INFO][4780] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Namespace="kube-system" Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0", GenerateName:"coredns-7d764666f9-", 
Namespace:"kube-system", SelfLink:"", UID:"0228e152-72ed-4332-a06f-c4b8ac58bc00", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"coredns-7d764666f9-bvfqk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia6a6272503e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:12.860403 containerd[1532]: 2026-03-06 03:01:12.826 [INFO][4780] cni-plugin/k8s.go 419: Calico CNI using 
IPs: [192.168.38.70/32] ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Namespace="kube-system" Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" Mar 6 03:01:12.860403 containerd[1532]: 2026-03-06 03:01:12.826 [INFO][4780] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia6a6272503e ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Namespace="kube-system" Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" Mar 6 03:01:12.860403 containerd[1532]: 2026-03-06 03:01:12.835 [INFO][4780] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Namespace="kube-system" Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" Mar 6 03:01:12.860728 containerd[1532]: 2026-03-06 03:01:12.836 [INFO][4780] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Namespace="kube-system" Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0", GenerateName:"coredns-7d764666f9-", Namespace:"kube-system", SelfLink:"", UID:"0228e152-72ed-4332-a06f-c4b8ac58bc00", ResourceVersion:"885", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 19, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7d764666f9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707", Pod:"coredns-7d764666f9-bvfqk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.38.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia6a6272503e", MAC:"4a:86:a9:8f:89:28", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:12.860728 containerd[1532]: 2026-03-06 03:01:12.849 [INFO][4780] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" Namespace="kube-system" 
Pod="coredns-7d764666f9-bvfqk" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-coredns--7d764666f9--bvfqk-eth0" Mar 6 03:01:12.906807 containerd[1532]: time="2026-03-06T03:01:12.906491358Z" level=info msg="connecting to shim c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707" address="unix:///run/containerd/s/2b4a005a790e7f3aaa62ade688179f59bbfdaf5f385a3a83ffec1a6b2156c197" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:12.979075 systemd-networkd[1442]: calie949ad93022: Link UP Mar 6 03:01:12.979487 systemd-networkd[1442]: calie949ad93022: Gained carrier Mar 6 03:01:12.980290 systemd[1]: Started cri-containerd-c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707.scope - libcontainer container c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707. Mar 6 03:01:13.002072 systemd-networkd[1442]: cali4e96f9320ed: Gained IPv6LL Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.517 [INFO][4783] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0 calico-kube-controllers-5c9c5b7bf5- calico-system 548ef683-b05f-40c7-8c04-ab888c23b7fb 889 0 2026-03-06 03:00:37 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5c9c5b7bf5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce calico-kube-controllers-5c9c5b7bf5-r8t6w eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calie949ad93022 [] [] }} ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" 
WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.517 [INFO][4783] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.733 [INFO][4842] ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" HandleID="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.765 [INFO][4842] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" HandleID="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00068a120), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"calico-kube-controllers-5c9c5b7bf5-r8t6w", "timestamp":"2026-03-06 03:01:12.733305521 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", 
Namespace:(*v1.Namespace)(0xc0004682c0)} Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.765 [INFO][4842] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock. Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.823 [INFO][4842] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock. Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.823 [INFO][4842] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce' Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.867 [INFO][4842] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.885 [INFO][4842] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.898 [INFO][4842] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.906 [INFO][4842] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.913 [INFO][4842] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.913 [INFO][4842] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.918 
[INFO][4842] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.928 [INFO][4842] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.950 [INFO][4842] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.71/26] block=192.168.38.64/26 handle="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.950 [INFO][4842] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.71/26] handle="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce" Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.951 [INFO][4842] ipam/ipam_plugin.go 459: Released host-wide IPAM lock. 
Mar 6 03:01:13.017605 containerd[1532]: 2026-03-06 03:01:12.951 [INFO][4842] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.71/26] IPv6=[] ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" HandleID="k8s-pod-network.3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" Mar 6 03:01:13.018869 containerd[1532]: 2026-03-06 03:01:12.963 [INFO][4783] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0", GenerateName:"calico-kube-controllers-5c9c5b7bf5-", Namespace:"calico-system", SelfLink:"", UID:"548ef683-b05f-40c7-8c04-ab888c23b7fb", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c9c5b7bf5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", 
Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"calico-kube-controllers-5c9c5b7bf5-r8t6w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.38.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie949ad93022", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:13.018869 containerd[1532]: 2026-03-06 03:01:12.963 [INFO][4783] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.38.71/32] ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" Mar 6 03:01:13.018869 containerd[1532]: 2026-03-06 03:01:12.963 [INFO][4783] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie949ad93022 ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" Mar 6 03:01:13.018869 containerd[1532]: 2026-03-06 03:01:12.983 [INFO][4783] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" Mar 6 03:01:13.018869 containerd[1532]: 2026-03-06 03:01:12.989 [INFO][4783] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0", GenerateName:"calico-kube-controllers-5c9c5b7bf5-", Namespace:"calico-system", SelfLink:"", UID:"548ef683-b05f-40c7-8c04-ab888c23b7fb", ResourceVersion:"889", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5c9c5b7bf5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff", Pod:"calico-kube-controllers-5c9c5b7bf5-r8t6w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.38.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calie949ad93022", MAC:"4a:79:72:f2:b3:80", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Mar 6 03:01:13.018869 
containerd[1532]: 2026-03-06 03:01:13.012 [INFO][4783] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" Namespace="calico-system" Pod="calico-kube-controllers-5c9c5b7bf5-r8t6w" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--kube--controllers--5c9c5b7bf5--r8t6w-eth0" Mar 6 03:01:13.072015 containerd[1532]: time="2026-03-06T03:01:13.071950736Z" level=info msg="connecting to shim 3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff" address="unix:///run/containerd/s/036b096d08c3617231256ff49e58774ba56e12ff11d8cfd88a9b0719e9080b0f" namespace=k8s.io protocol=ttrpc version=3 Mar 6 03:01:13.145150 systemd[1]: Started cri-containerd-3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff.scope - libcontainer container 3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff. Mar 6 03:01:13.162256 containerd[1532]: time="2026-03-06T03:01:13.162157651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7d764666f9-bvfqk,Uid:0228e152-72ed-4332-a06f-c4b8ac58bc00,Namespace:kube-system,Attempt:0,} returns sandbox id \"c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707\"" Mar 6 03:01:13.175348 containerd[1532]: time="2026-03-06T03:01:13.175286221Z" level=info msg="CreateContainer within sandbox \"c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 6 03:01:13.192398 containerd[1532]: time="2026-03-06T03:01:13.192294556Z" level=info msg="Container a139efc50be7fad8f7dbdc4d8ba485af32a4785a57d013af1ad19cc9f8c38556: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:13.205175 containerd[1532]: time="2026-03-06T03:01:13.205119939Z" level=info msg="CreateContainer within sandbox \"c7b0776daa6dfef9dfa9355aa092522edaf4f93e42de6d78eff80eb3356d3707\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"a139efc50be7fad8f7dbdc4d8ba485af32a4785a57d013af1ad19cc9f8c38556\"" Mar 6 03:01:13.209124 containerd[1532]: time="2026-03-06T03:01:13.209062013Z" level=info msg="StartContainer for \"a139efc50be7fad8f7dbdc4d8ba485af32a4785a57d013af1ad19cc9f8c38556\"" Mar 6 03:01:13.213843 containerd[1532]: time="2026-03-06T03:01:13.212950255Z" level=info msg="connecting to shim a139efc50be7fad8f7dbdc4d8ba485af32a4785a57d013af1ad19cc9f8c38556" address="unix:///run/containerd/s/2b4a005a790e7f3aaa62ade688179f59bbfdaf5f385a3a83ffec1a6b2156c197" protocol=ttrpc version=3 Mar 6 03:01:13.294028 systemd[1]: Started cri-containerd-a139efc50be7fad8f7dbdc4d8ba485af32a4785a57d013af1ad19cc9f8c38556.scope - libcontainer container a139efc50be7fad8f7dbdc4d8ba485af32a4785a57d013af1ad19cc9f8c38556. Mar 6 03:01:13.322162 systemd-networkd[1442]: calia615b41eeea: Gained IPv6LL Mar 6 03:01:13.387411 systemd-networkd[1442]: calif569210a8f7: Gained IPv6LL Mar 6 03:01:13.394078 containerd[1532]: time="2026-03-06T03:01:13.394005645Z" level=info msg="StartContainer for \"a139efc50be7fad8f7dbdc4d8ba485af32a4785a57d013af1ad19cc9f8c38556\" returns successfully" Mar 6 03:01:13.406609 containerd[1532]: time="2026-03-06T03:01:13.406559027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5c9c5b7bf5-r8t6w,Uid:548ef683-b05f-40c7-8c04-ab888c23b7fb,Namespace:calico-system,Attempt:0,} returns sandbox id \"3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff\"" Mar 6 03:01:13.642147 systemd-networkd[1442]: caliac10b8e6b4b: Gained IPv6LL Mar 6 03:01:13.726593 kubelet[2800]: I0306 03:01:13.726395 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-9kf8c" podStartSLOduration=54.726371873 podStartE2EDuration="54.726371873s" podCreationTimestamp="2026-03-06 03:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 
03:01:13.686327173 +0000 UTC m=+59.711285284" watchObservedRunningTime="2026-03-06 03:01:13.726371873 +0000 UTC m=+59.751329984" Mar 6 03:01:13.767751 kubelet[2800]: I0306 03:01:13.765662 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="kube-system/coredns-7d764666f9-bvfqk" podStartSLOduration=54.765640733 podStartE2EDuration="54.765640733s" podCreationTimestamp="2026-03-06 03:00:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:01:13.725019975 +0000 UTC m=+59.749978087" watchObservedRunningTime="2026-03-06 03:01:13.765640733 +0000 UTC m=+59.790598844" Mar 6 03:01:14.538007 systemd-networkd[1442]: calia6a6272503e: Gained IPv6LL Mar 6 03:01:14.794404 systemd-networkd[1442]: calie949ad93022: Gained IPv6LL Mar 6 03:01:15.231170 containerd[1532]: time="2026-03-06T03:01:15.231080898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-htzf9,Uid:ac46cc05-0e3c-439d-bc6a-7388a9b0ece7,Namespace:calico-system,Attempt:0,}" Mar 6 03:01:15.418811 containerd[1532]: time="2026-03-06T03:01:15.418128883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:15.421202 containerd[1532]: time="2026-03-06T03:01:15.421157983Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.31.4: active requests=0, bytes read=48415780" Mar 6 03:01:15.423990 containerd[1532]: time="2026-03-06T03:01:15.423944304Z" level=info msg="ImageCreate event name:\"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:15.430161 containerd[1532]: time="2026-03-06T03:01:15.429991669Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 6 03:01:15.433704 containerd[1532]: time="2026-03-06T03:01:15.433640753Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" with image id \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:d212af1da3dd52a633bc9e36653a7d901d95a570f8d51d1968a837dcf6879730\", size \"49971841\" in 3.257289663s" Mar 6 03:01:15.435854 containerd[1532]: time="2026-03-06T03:01:15.435815516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.31.4\" returns image reference \"sha256:f7ff80340b9b4973ceda29859065985831588b2898f2b4009f742b5789010898\"" Mar 6 03:01:15.442367 containerd[1532]: time="2026-03-06T03:01:15.439621380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\"" Mar 6 03:01:15.451413 containerd[1532]: time="2026-03-06T03:01:15.451343726Z" level=info msg="CreateContainer within sandbox \"2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 6 03:01:15.471665 containerd[1532]: time="2026-03-06T03:01:15.469946813Z" level=info msg="Container 44c05c718cfd6e870822e35606320d4b083e58c4dd53732b5c7f51cf5ae44a60: CDI devices from CRI Config.CDIDevices: []" Mar 6 03:01:15.482090 systemd-networkd[1442]: cali62a3648e7a3: Link UP Mar 6 03:01:15.484235 systemd-networkd[1442]: cali62a3648e7a3: Gained carrier Mar 6 03:01:15.498708 containerd[1532]: time="2026-03-06T03:01:15.498193094Z" level=info msg="CreateContainer within sandbox \"2582398c96f69c52c7972817c429ba0063adf30292fcc26cba9be2892babf63e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"44c05c718cfd6e870822e35606320d4b083e58c4dd53732b5c7f51cf5ae44a60\"" Mar 6 03:01:15.502816 containerd[1532]: time="2026-03-06T03:01:15.502025031Z" level=info msg="StartContainer for 
\"44c05c718cfd6e870822e35606320d4b083e58c4dd53732b5c7f51cf5ae44a60\"" Mar 6 03:01:15.515130 containerd[1532]: time="2026-03-06T03:01:15.515023374Z" level=info msg="connecting to shim 44c05c718cfd6e870822e35606320d4b083e58c4dd53732b5c7f51cf5ae44a60" address="unix:///run/containerd/s/e4840d2a3c109d41539238ea7d3009c9437cf9e6bf2e00757e5e851c4785b1e5" protocol=ttrpc version=3 Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.327 [INFO][5073] cni-plugin/plugin.go 342: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0 calico-apiserver-6857b68657- calico-system ac46cc05-0e3c-439d-bc6a-7388a9b0ece7 890 0 2026-03-06 03:00:35 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6857b68657 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce calico-apiserver-6857b68657-htzf9 eth0 calico-apiserver [] [] [kns.calico-system ksa.calico-system.calico-apiserver] cali62a3648e7a3 [] [] }} ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-" Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.328 [INFO][5073] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0" Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.395 [INFO][5084] 
ipam/ipam_plugin.go 235: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" HandleID="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.410 [INFO][5084] ipam/ipam_plugin.go 301: Auto assigning IP ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" HandleID="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00038b360), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", "pod":"calico-apiserver-6857b68657-htzf9", "timestamp":"2026-03-06 03:01:15.395271162 +0000 UTC"}, Hostname:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload", Namespace:(*v1.Namespace)(0xc000112580)}
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.411 [INFO][5084] ipam/ipam_plugin.go 438: About to acquire host-wide IPAM lock.
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.411 [INFO][5084] ipam/ipam_plugin.go 453: Acquired host-wide IPAM lock.
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.411 [INFO][5084] ipam/ipam.go 112: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce'
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.414 [INFO][5084] ipam/ipam.go 707: Looking up existing affinities for host handle="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.419 [INFO][5084] ipam/ipam.go 409: Looking up existing affinities for host host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.427 [INFO][5084] ipam/ipam.go 526: Trying affinity for 192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.430 [INFO][5084] ipam/ipam.go 160: Attempting to load block cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.438 [INFO][5084] ipam/ipam.go 237: Affinity is confirmed and block has been loaded cidr=192.168.38.64/26 host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.438 [INFO][5084] ipam/ipam.go 1245: Attempting to assign 1 addresses from block block=192.168.38.64/26 handle="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.445 [INFO][5084] ipam/ipam.go 1806: Creating new handle: k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.454 [INFO][5084] ipam/ipam.go 1272: Writing block in order to claim IPs block=192.168.38.64/26 handle="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.464 [INFO][5084] ipam/ipam.go 1288: Successfully claimed IPs: [192.168.38.72/26] block=192.168.38.64/26 handle="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.465 [INFO][5084] ipam/ipam.go 895: Auto-assigned 1 out of 1 IPv4s: [192.168.38.72/26] handle="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" host="ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce"
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.465 [INFO][5084] ipam/ipam_plugin.go 459: Released host-wide IPAM lock.
Mar 6 03:01:15.518971 containerd[1532]: 2026-03-06 03:01:15.465 [INFO][5084] ipam/ipam_plugin.go 325: Calico CNI IPAM assigned addresses IPv4=[192.168.38.72/26] IPv6=[] ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" HandleID="k8s-pod-network.aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Workload="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0"
Mar 6 03:01:15.520525 containerd[1532]: 2026-03-06 03:01:15.475 [INFO][5073] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0", GenerateName:"calico-apiserver-6857b68657-", Namespace:"calico-system", SelfLink:"", UID:"ac46cc05-0e3c-439d-bc6a-7388a9b0ece7", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6857b68657", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"", Pod:"calico-apiserver-6857b68657-htzf9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali62a3648e7a3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 03:01:15.520525 containerd[1532]: 2026-03-06 03:01:15.476 [INFO][5073] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.38.72/32] ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0"
Mar 6 03:01:15.520525 containerd[1532]: 2026-03-06 03:01:15.477 [INFO][5073] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali62a3648e7a3 ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0"
Mar 6 03:01:15.520525 containerd[1532]: 2026-03-06 03:01:15.481 [INFO][5073] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0"
Mar 6 03:01:15.520525 containerd[1532]: 2026-03-06 03:01:15.486 [INFO][5073] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0", GenerateName:"calico-apiserver-6857b68657-", Namespace:"calico-system", SelfLink:"", UID:"ac46cc05-0e3c-439d-bc6a-7388a9b0ece7", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2026, time.March, 6, 3, 0, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6857b68657", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-2-3-nightly-20260305-2100-f99df941967df7bcb5ce", ContainerID:"aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1", Pod:"calico-apiserver-6857b68657-htzf9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.38.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-apiserver"}, InterfaceName:"cali62a3648e7a3", MAC:"9e:db:85:fa:99:da", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Mar 6 03:01:15.520525 containerd[1532]: 2026-03-06 03:01:15.513 [INFO][5073] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" Namespace="calico-system" Pod="calico-apiserver-6857b68657-htzf9" WorkloadEndpoint="ci--4459--2--3--nightly--20260305--2100--f99df941967df7bcb5ce-k8s-calico--apiserver--6857b68657--htzf9-eth0"
Mar 6 03:01:15.608703 systemd[1]: Started cri-containerd-44c05c718cfd6e870822e35606320d4b083e58c4dd53732b5c7f51cf5ae44a60.scope - libcontainer container 44c05c718cfd6e870822e35606320d4b083e58c4dd53732b5c7f51cf5ae44a60.
Mar 6 03:01:15.629244 containerd[1532]: time="2026-03-06T03:01:15.629171514Z" level=info msg="connecting to shim aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1" address="unix:///run/containerd/s/450f3e9469566ee1fd9685aa641bafec9e89cbcc746a5a352610eb0cfce6f09b" namespace=k8s.io protocol=ttrpc version=3
Mar 6 03:01:15.701275 systemd[1]: Started cri-containerd-aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1.scope - libcontainer container aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1.
Mar 6 03:01:15.817404 containerd[1532]: time="2026-03-06T03:01:15.817172071Z" level=info msg="StartContainer for \"44c05c718cfd6e870822e35606320d4b083e58c4dd53732b5c7f51cf5ae44a60\" returns successfully"
Mar 6 03:01:15.852585 containerd[1532]: time="2026-03-06T03:01:15.852506603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6857b68657-htzf9,Uid:ac46cc05-0e3c-439d-bc6a-7388a9b0ece7,Namespace:calico-system,Attempt:0,} returns sandbox id \"aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1\""
Mar 6 03:01:15.861592 containerd[1532]: time="2026-03-06T03:01:15.861023138Z" level=info msg="CreateContainer within sandbox \"aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 6 03:01:15.872308 containerd[1532]: time="2026-03-06T03:01:15.872255445Z" level=info msg="Container cdfcf4cb96e7942434be101c80d3d9bbe3ef7a6e0e4fa771454ecff7d61b093f: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:15.891486 containerd[1532]: time="2026-03-06T03:01:15.890225168Z" level=info msg="CreateContainer within sandbox \"aa05d3fdc7178953fa9ef5354f72412e238d6322e48702967b4c9bf3138ec7c1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"cdfcf4cb96e7942434be101c80d3d9bbe3ef7a6e0e4fa771454ecff7d61b093f\""
Mar 6 03:01:15.891682 containerd[1532]: time="2026-03-06T03:01:15.891582333Z" level=info msg="StartContainer for \"cdfcf4cb96e7942434be101c80d3d9bbe3ef7a6e0e4fa771454ecff7d61b093f\""
Mar 6 03:01:15.896805 containerd[1532]: time="2026-03-06T03:01:15.896447276Z" level=info msg="connecting to shim cdfcf4cb96e7942434be101c80d3d9bbe3ef7a6e0e4fa771454ecff7d61b093f" address="unix:///run/containerd/s/450f3e9469566ee1fd9685aa641bafec9e89cbcc746a5a352610eb0cfce6f09b" protocol=ttrpc version=3
Mar 6 03:01:15.935070 systemd[1]: Started cri-containerd-cdfcf4cb96e7942434be101c80d3d9bbe3ef7a6e0e4fa771454ecff7d61b093f.scope - libcontainer container cdfcf4cb96e7942434be101c80d3d9bbe3ef7a6e0e4fa771454ecff7d61b093f.
Mar 6 03:01:16.039799 containerd[1532]: time="2026-03-06T03:01:16.039711008Z" level=info msg="StartContainer for \"cdfcf4cb96e7942434be101c80d3d9bbe3ef7a6e0e4fa771454ecff7d61b093f\" returns successfully"
Mar 6 03:01:16.587229 systemd-networkd[1442]: cali62a3648e7a3: Gained IPv6LL
Mar 6 03:01:16.768103 kubelet[2800]: I0306 03:01:16.766755 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-6857b68657-cl7tg" podStartSLOduration=38.501681425 podStartE2EDuration="41.766731008s" podCreationTimestamp="2026-03-06 03:00:35 +0000 UTC" firstStartedPulling="2026-03-06 03:01:12.17421093 +0000 UTC m=+58.199169023" lastFinishedPulling="2026-03-06 03:01:15.439260497 +0000 UTC m=+61.464218606" observedRunningTime="2026-03-06 03:01:16.738207975 +0000 UTC m=+62.763166089" watchObservedRunningTime="2026-03-06 03:01:16.766731008 +0000 UTC m=+62.791689120"
Mar 6 03:01:16.769359 kubelet[2800]: I0306 03:01:16.768617 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-apiserver-6857b68657-htzf9" podStartSLOduration=41.768600645 podStartE2EDuration="41.768600645s" podCreationTimestamp="2026-03-06 03:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-06 03:01:16.768317262 +0000 UTC m=+62.793275368" watchObservedRunningTime="2026-03-06 03:01:16.768600645 +0000 UTC m=+62.793558747"
Mar 6 03:01:16.961792 containerd[1532]: time="2026-03-06T03:01:16.961228575Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:16.964245 containerd[1532]: time="2026-03-06T03:01:16.964203422Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.31.4: active requests=0, bytes read=8792502"
Mar 6 03:01:16.965529 containerd[1532]: time="2026-03-06T03:01:16.965470573Z" level=info msg="ImageCreate event name:\"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:16.972010 containerd[1532]: time="2026-03-06T03:01:16.971936253Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:16.974504 containerd[1532]: time="2026-03-06T03:01:16.974257957Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.31.4\" with image id \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\", repo tag \"ghcr.io/flatcar/calico/csi:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:ab57dd6f8423ef7b3ff382bf4ca5ace6063bdca77d441d852c75ec58847dd280\", size \"10348547\" in 1.534592517s"
Mar 6 03:01:16.974504 containerd[1532]: time="2026-03-06T03:01:16.974306697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.31.4\" returns image reference \"sha256:4c8cd7d0b10a4df64a5bd90e9845e9d1edbe0e37c2ebfc171bb28698e07abf72\""
Mar 6 03:01:16.976451 containerd[1532]: time="2026-03-06T03:01:16.976394464Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\""
Mar 6 03:01:16.983306 containerd[1532]: time="2026-03-06T03:01:16.983266336Z" level=info msg="CreateContainer within sandbox \"a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 6 03:01:17.003007 containerd[1532]: time="2026-03-06T03:01:17.002944982Z" level=info msg="Container faf898cd8deaf415c28547a4f9f9eed2a8e47062a49fda85ee809f36ba1f732a: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:17.024327 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1825374142.mount: Deactivated successfully.
Mar 6 03:01:17.031667 containerd[1532]: time="2026-03-06T03:01:17.030646974Z" level=info msg="CreateContainer within sandbox \"a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"faf898cd8deaf415c28547a4f9f9eed2a8e47062a49fda85ee809f36ba1f732a\""
Mar 6 03:01:17.033055 containerd[1532]: time="2026-03-06T03:01:17.033022250Z" level=info msg="StartContainer for \"faf898cd8deaf415c28547a4f9f9eed2a8e47062a49fda85ee809f36ba1f732a\""
Mar 6 03:01:17.039228 containerd[1532]: time="2026-03-06T03:01:17.039180709Z" level=info msg="connecting to shim faf898cd8deaf415c28547a4f9f9eed2a8e47062a49fda85ee809f36ba1f732a" address="unix:///run/containerd/s/fb6f57e758a2ff1c92c82db1c6e3104ca6970435bd4ebdf364571aebecf23683" protocol=ttrpc version=3
Mar 6 03:01:17.094336 systemd[1]: Started cri-containerd-faf898cd8deaf415c28547a4f9f9eed2a8e47062a49fda85ee809f36ba1f732a.scope - libcontainer container faf898cd8deaf415c28547a4f9f9eed2a8e47062a49fda85ee809f36ba1f732a.
Mar 6 03:01:17.281126 containerd[1532]: time="2026-03-06T03:01:17.280039494Z" level=info msg="StartContainer for \"faf898cd8deaf415c28547a4f9f9eed2a8e47062a49fda85ee809f36ba1f732a\" returns successfully"
Mar 6 03:01:17.730808 kubelet[2800]: I0306 03:01:17.730051 2800 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 6 03:01:17.730808 kubelet[2800]: I0306 03:01:17.730379 2800 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 6 03:01:19.349541 ntpd[1656]: Listen normally on 9 calia615b41eeea [fe80::ecee:eeff:feee:eeee%8]:123
Mar 6 03:01:19.350668 ntpd[1656]: 6 Mar 03:01:19 ntpd[1656]: Listen normally on 9 calia615b41eeea [fe80::ecee:eeff:feee:eeee%8]:123
Mar 6 03:01:19.351112 ntpd[1656]: Listen normally on 10 calif569210a8f7 [fe80::ecee:eeff:feee:eeee%9]:123
Mar 6 03:01:19.351582 ntpd[1656]: 6 Mar 03:01:19 ntpd[1656]: Listen normally on 10 calif569210a8f7 [fe80::ecee:eeff:feee:eeee%9]:123
Mar 6 03:01:19.351582 ntpd[1656]: 6 Mar 03:01:19 ntpd[1656]: Listen normally on 11 caliac10b8e6b4b [fe80::ecee:eeff:feee:eeee%10]:123
Mar 6 03:01:19.351582 ntpd[1656]: 6 Mar 03:01:19 ntpd[1656]: Listen normally on 12 cali4e96f9320ed [fe80::ecee:eeff:feee:eeee%11]:123
Mar 6 03:01:19.351582 ntpd[1656]: 6 Mar 03:01:19 ntpd[1656]: Listen normally on 13 calia6a6272503e [fe80::ecee:eeff:feee:eeee%12]:123
Mar 6 03:01:19.351582 ntpd[1656]: 6 Mar 03:01:19 ntpd[1656]: Listen normally on 14 calie949ad93022 [fe80::ecee:eeff:feee:eeee%13]:123
Mar 6 03:01:19.351582 ntpd[1656]: 6 Mar 03:01:19 ntpd[1656]: Listen normally on 15 cali62a3648e7a3 [fe80::ecee:eeff:feee:eeee%14]:123
Mar 6 03:01:19.351347 ntpd[1656]: Listen normally on 11 caliac10b8e6b4b [fe80::ecee:eeff:feee:eeee%10]:123
Mar 6 03:01:19.351396 ntpd[1656]: Listen normally on 12 cali4e96f9320ed [fe80::ecee:eeff:feee:eeee%11]:123
Mar 6 03:01:19.351435 ntpd[1656]: Listen normally on 13 calia6a6272503e [fe80::ecee:eeff:feee:eeee%12]:123
Mar 6 03:01:19.351467 ntpd[1656]: Listen normally on 14 calie949ad93022 [fe80::ecee:eeff:feee:eeee%13]:123
Mar 6 03:01:19.351500 ntpd[1656]: Listen normally on 15 cali62a3648e7a3 [fe80::ecee:eeff:feee:eeee%14]:123
Mar 6 03:01:19.612921 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount368393737.mount: Deactivated successfully.
Mar 6 03:01:20.528732 containerd[1532]: time="2026-03-06T03:01:20.528665411Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:20.531261 containerd[1532]: time="2026-03-06T03:01:20.531197625Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.31.4: active requests=0, bytes read=55623386"
Mar 6 03:01:20.532824 containerd[1532]: time="2026-03-06T03:01:20.532640725Z" level=info msg="ImageCreate event name:\"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:20.536070 containerd[1532]: time="2026-03-06T03:01:20.535997434Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:20.537262 containerd[1532]: time="2026-03-06T03:01:20.537078687Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" with image id \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:44395ca5ebfe88f21ed51acfbec5fc0f31d2762966e2007a0a2eb9b30e35fc4d\", size \"55623232\" in 3.560465043s"
Mar 6 03:01:20.537262 containerd[1532]: time="2026-03-06T03:01:20.537125852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.31.4\" returns image reference \"sha256:714983e5e920bbe810fab04d9f06bd16ef4e552b0d2deffd7ab2b2c4a001acbb\""
Mar 6 03:01:20.541971 containerd[1532]: time="2026-03-06T03:01:20.541930219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\""
Mar 6 03:01:20.544798 containerd[1532]: time="2026-03-06T03:01:20.544338145Z" level=info msg="CreateContainer within sandbox \"6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Mar 6 03:01:20.554951 containerd[1532]: time="2026-03-06T03:01:20.554901077Z" level=info msg="Container dbb290de22ac109f2361ba28c1bdfce4fbf2a58fcc00118fa23914a7bcc0328a: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:20.575285 containerd[1532]: time="2026-03-06T03:01:20.575233536Z" level=info msg="CreateContainer within sandbox \"6232b572eb195f01a7851b7e6ce2fff12a7fa3ec3b13c8005ac9e61b13e9587e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"dbb290de22ac109f2361ba28c1bdfce4fbf2a58fcc00118fa23914a7bcc0328a\""
Mar 6 03:01:20.576131 containerd[1532]: time="2026-03-06T03:01:20.576089414Z" level=info msg="StartContainer for \"dbb290de22ac109f2361ba28c1bdfce4fbf2a58fcc00118fa23914a7bcc0328a\""
Mar 6 03:01:20.579199 containerd[1532]: time="2026-03-06T03:01:20.579139655Z" level=info msg="connecting to shim dbb290de22ac109f2361ba28c1bdfce4fbf2a58fcc00118fa23914a7bcc0328a" address="unix:///run/containerd/s/6398bba15ba2a5e9f4c3aacdbdf201ad7642cde2ff85f08f93672f41dfcc542d" protocol=ttrpc version=3
Mar 6 03:01:20.619042 systemd[1]: Started cri-containerd-dbb290de22ac109f2361ba28c1bdfce4fbf2a58fcc00118fa23914a7bcc0328a.scope - libcontainer container dbb290de22ac109f2361ba28c1bdfce4fbf2a58fcc00118fa23914a7bcc0328a.
Mar 6 03:01:20.703347 containerd[1532]: time="2026-03-06T03:01:20.703293770Z" level=info msg="StartContainer for \"dbb290de22ac109f2361ba28c1bdfce4fbf2a58fcc00118fa23914a7bcc0328a\" returns successfully"
Mar 6 03:01:20.796899 kubelet[2800]: I0306 03:01:20.795403 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/goldmane-9f7667bb8-x4r4t" podStartSLOduration=36.984388744 podStartE2EDuration="44.795381035s" podCreationTimestamp="2026-03-06 03:00:36 +0000 UTC" firstStartedPulling="2026-03-06 03:01:12.727640202 +0000 UTC m=+58.752598287" lastFinishedPulling="2026-03-06 03:01:20.538632477 +0000 UTC m=+66.563590578" observedRunningTime="2026-03-06 03:01:20.794373869 +0000 UTC m=+66.819331981" watchObservedRunningTime="2026-03-06 03:01:20.795381035 +0000 UTC m=+66.820339148"
Mar 6 03:01:21.392821 systemd[1]: Started sshd@10-10.128.0.87:22-20.161.92.111:33168.service - OpenSSH per-connection server daemon (20.161.92.111:33168).
Mar 6 03:01:21.724489 sshd[5364]: Accepted publickey for core from 20.161.92.111 port 33168 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:21.728016 sshd-session[5364]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:21.739033 systemd-logind[1505]: New session 10 of user core.
Mar 6 03:01:21.746484 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 6 03:01:22.012048 sshd[5370]: Connection closed by 20.161.92.111 port 33168
Mar 6 03:01:22.014472 sshd-session[5364]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:22.025314 systemd[1]: sshd@10-10.128.0.87:22-20.161.92.111:33168.service: Deactivated successfully.
Mar 6 03:01:22.031733 systemd[1]: session-10.scope: Deactivated successfully.
Mar 6 03:01:22.034839 systemd-logind[1505]: Session 10 logged out. Waiting for processes to exit.
Mar 6 03:01:22.039164 systemd-logind[1505]: Removed session 10.
Mar 6 03:01:23.462786 containerd[1532]: time="2026-03-06T03:01:23.462689569Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:23.464156 containerd[1532]: time="2026-03-06T03:01:23.464062007Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.31.4: active requests=0, bytes read=52406348"
Mar 6 03:01:23.465603 containerd[1532]: time="2026-03-06T03:01:23.465528793Z" level=info msg="ImageCreate event name:\"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:23.468908 containerd[1532]: time="2026-03-06T03:01:23.468808058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:23.470282 containerd[1532]: time="2026-03-06T03:01:23.470226359Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" with image id \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:99b8bb50141ca55b4b6ddfcf2f2fbde838265508ab2ac96ed08e72cd39800713\", size \"53962361\" in 2.928242543s"
Mar 6 03:01:23.470450 containerd[1532]: time="2026-03-06T03:01:23.470423537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.31.4\" returns image reference \"sha256:ff033cc89dab51090bfa1b04e155a5ce1e3b1f324f74b7b2be0dd6f0b6b10e89\""
Mar 6 03:01:23.471886 containerd[1532]: time="2026-03-06T03:01:23.471849484Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\""
Mar 6 03:01:23.501399 containerd[1532]: time="2026-03-06T03:01:23.501332321Z" level=info msg="CreateContainer within sandbox \"3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 6 03:01:23.514885 containerd[1532]: time="2026-03-06T03:01:23.514197933Z" level=info msg="Container 1a89974682dd5d4894c926ca7301af56981f9cdecf30fcaa76a4d543b2d0a836: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:23.527019 containerd[1532]: time="2026-03-06T03:01:23.526956669Z" level=info msg="CreateContainer within sandbox \"3892ece9acb448a2484dcd161fb69beb9a1d6cdd4fe68a96cee12cce7f95beff\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1a89974682dd5d4894c926ca7301af56981f9cdecf30fcaa76a4d543b2d0a836\""
Mar 6 03:01:23.528237 containerd[1532]: time="2026-03-06T03:01:23.528109389Z" level=info msg="StartContainer for \"1a89974682dd5d4894c926ca7301af56981f9cdecf30fcaa76a4d543b2d0a836\""
Mar 6 03:01:23.530371 containerd[1532]: time="2026-03-06T03:01:23.530255280Z" level=info msg="connecting to shim 1a89974682dd5d4894c926ca7301af56981f9cdecf30fcaa76a4d543b2d0a836" address="unix:///run/containerd/s/036b096d08c3617231256ff49e58774ba56e12ff11d8cfd88a9b0719e9080b0f" protocol=ttrpc version=3
Mar 6 03:01:23.567057 systemd[1]: Started cri-containerd-1a89974682dd5d4894c926ca7301af56981f9cdecf30fcaa76a4d543b2d0a836.scope - libcontainer container 1a89974682dd5d4894c926ca7301af56981f9cdecf30fcaa76a4d543b2d0a836.
Mar 6 03:01:23.645653 containerd[1532]: time="2026-03-06T03:01:23.645588033Z" level=info msg="StartContainer for \"1a89974682dd5d4894c926ca7301af56981f9cdecf30fcaa76a4d543b2d0a836\" returns successfully"
Mar 6 03:01:23.801698 kubelet[2800]: I0306 03:01:23.800638 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5c9c5b7bf5-r8t6w" podStartSLOduration=36.741755498 podStartE2EDuration="46.800595128s" podCreationTimestamp="2026-03-06 03:00:37 +0000 UTC" firstStartedPulling="2026-03-06 03:01:13.412692917 +0000 UTC m=+59.437651037" lastFinishedPulling="2026-03-06 03:01:23.471532564 +0000 UTC m=+69.496490667" observedRunningTime="2026-03-06 03:01:23.79716307 +0000 UTC m=+69.822121183" watchObservedRunningTime="2026-03-06 03:01:23.800595128 +0000 UTC m=+69.825553240"
Mar 6 03:01:24.908010 containerd[1532]: time="2026-03-06T03:01:24.907936939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:24.909414 containerd[1532]: time="2026-03-06T03:01:24.909289109Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4: active requests=0, bytes read=14704317"
Mar 6 03:01:24.911048 containerd[1532]: time="2026-03-06T03:01:24.910980599Z" level=info msg="ImageCreate event name:\"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:24.914984 containerd[1532]: time="2026-03-06T03:01:24.914861898Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 6 03:01:24.916242 containerd[1532]: time="2026-03-06T03:01:24.915750988Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" with image id \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:e41c0d73bcd33ff28ae2f2983cf781a4509d212e102d53883dbbf436ab3cd97d\", size \"16260314\" in 1.443853791s"
Mar 6 03:01:24.916242 containerd[1532]: time="2026-03-06T03:01:24.915812775Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.31.4\" returns image reference \"sha256:d7aeb99114cbb6499e9048f43d3faa5f199d1a05ed44165e5974d0368ac32771\""
Mar 6 03:01:24.922927 containerd[1532]: time="2026-03-06T03:01:24.922878370Z" level=info msg="CreateContainer within sandbox \"a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 6 03:01:24.937496 containerd[1532]: time="2026-03-06T03:01:24.934576355Z" level=info msg="Container 97452043f5d0f084cf465de2308484f8bc260fa21d5ade42b1d44f102a6ed2d8: CDI devices from CRI Config.CDIDevices: []"
Mar 6 03:01:24.952797 containerd[1532]: time="2026-03-06T03:01:24.952689372Z" level=info msg="CreateContainer within sandbox \"a19c60c43bec2d99e87ed5db8e5dd9f0e7d49c2b1ffd810174786da9a6920c0b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"97452043f5d0f084cf465de2308484f8bc260fa21d5ade42b1d44f102a6ed2d8\""
Mar 6 03:01:24.953573 containerd[1532]: time="2026-03-06T03:01:24.953398047Z" level=info msg="StartContainer for \"97452043f5d0f084cf465de2308484f8bc260fa21d5ade42b1d44f102a6ed2d8\""
Mar 6 03:01:24.955941 containerd[1532]: time="2026-03-06T03:01:24.955899722Z" level=info msg="connecting to shim 97452043f5d0f084cf465de2308484f8bc260fa21d5ade42b1d44f102a6ed2d8" address="unix:///run/containerd/s/fb6f57e758a2ff1c92c82db1c6e3104ca6970435bd4ebdf364571aebecf23683" protocol=ttrpc version=3
Mar 6 03:01:24.990248 systemd[1]: Started cri-containerd-97452043f5d0f084cf465de2308484f8bc260fa21d5ade42b1d44f102a6ed2d8.scope - libcontainer container 97452043f5d0f084cf465de2308484f8bc260fa21d5ade42b1d44f102a6ed2d8.
Mar 6 03:01:25.094186 containerd[1532]: time="2026-03-06T03:01:25.094060882Z" level=info msg="StartContainer for \"97452043f5d0f084cf465de2308484f8bc260fa21d5ade42b1d44f102a6ed2d8\" returns successfully"
Mar 6 03:01:25.358488 kubelet[2800]: I0306 03:01:25.358346 2800 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 6 03:01:25.358488 kubelet[2800]: I0306 03:01:25.358469 2800 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 6 03:01:25.804613 kubelet[2800]: I0306 03:01:25.804130 2800 pod_startup_latency_tracker.go:108] "Observed pod startup duration" pod="calico-system/csi-node-driver-2m65n" podStartSLOduration=36.10954485 podStartE2EDuration="48.804107771s" podCreationTimestamp="2026-03-06 03:00:37 +0000 UTC" firstStartedPulling="2026-03-06 03:01:12.222605249 +0000 UTC m=+58.247563351" lastFinishedPulling="2026-03-06 03:01:24.917168169 +0000 UTC m=+70.942126272" observedRunningTime="2026-03-06 03:01:25.804024733 +0000 UTC m=+71.828982845" watchObservedRunningTime="2026-03-06 03:01:25.804107771 +0000 UTC m=+71.829065883"
Mar 6 03:01:27.069282 systemd[1]: Started sshd@11-10.128.0.87:22-20.161.92.111:33170.service - OpenSSH per-connection server daemon (20.161.92.111:33170).
Mar 6 03:01:27.326058 sshd[5550]: Accepted publickey for core from 20.161.92.111 port 33170 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:27.327933 sshd-session[5550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:27.334867 systemd-logind[1505]: New session 11 of user core.
Mar 6 03:01:27.341162 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 6 03:01:27.535502 sshd[5555]: Connection closed by 20.161.92.111 port 33170
Mar 6 03:01:27.537145 sshd-session[5550]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:27.543917 systemd-logind[1505]: Session 11 logged out. Waiting for processes to exit.
Mar 6 03:01:27.544493 systemd[1]: sshd@11-10.128.0.87:22-20.161.92.111:33170.service: Deactivated successfully.
Mar 6 03:01:27.549402 systemd[1]: session-11.scope: Deactivated successfully.
Mar 6 03:01:27.553549 systemd-logind[1505]: Removed session 11.
Mar 6 03:01:32.593166 systemd[1]: Started sshd@12-10.128.0.87:22-20.161.92.111:52550.service - OpenSSH per-connection server daemon (20.161.92.111:52550).
Mar 6 03:01:32.860161 sshd[5593]: Accepted publickey for core from 20.161.92.111 port 52550 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:32.862070 sshd-session[5593]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:32.868586 systemd-logind[1505]: New session 12 of user core.
Mar 6 03:01:32.874020 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 6 03:01:33.059967 sshd[5597]: Connection closed by 20.161.92.111 port 52550
Mar 6 03:01:33.061174 sshd-session[5593]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:33.067450 systemd[1]: sshd@12-10.128.0.87:22-20.161.92.111:52550.service: Deactivated successfully.
Mar 6 03:01:33.070495 systemd[1]: session-12.scope: Deactivated successfully.
Mar 6 03:01:33.072976 systemd-logind[1505]: Session 12 logged out. Waiting for processes to exit.
Mar 6 03:01:33.076533 systemd-logind[1505]: Removed session 12.
Mar 6 03:01:38.106542 systemd[1]: Started sshd@13-10.128.0.87:22-20.161.92.111:52558.service - OpenSSH per-connection server daemon (20.161.92.111:52558).
Mar 6 03:01:38.339869 sshd[5620]: Accepted publickey for core from 20.161.92.111 port 52558 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:38.341585 sshd-session[5620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:38.348848 systemd-logind[1505]: New session 13 of user core.
Mar 6 03:01:38.352039 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 6 03:01:38.543257 sshd[5623]: Connection closed by 20.161.92.111 port 52558
Mar 6 03:01:38.545083 sshd-session[5620]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:38.550719 systemd[1]: sshd@13-10.128.0.87:22-20.161.92.111:52558.service: Deactivated successfully.
Mar 6 03:01:38.554143 systemd[1]: session-13.scope: Deactivated successfully.
Mar 6 03:01:38.555856 systemd-logind[1505]: Session 13 logged out. Waiting for processes to exit.
Mar 6 03:01:38.558218 systemd-logind[1505]: Removed session 13.
Mar 6 03:01:41.420574 kubelet[2800]: I0306 03:01:41.419986 2800 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 6 03:01:42.576201 kubelet[2800]: I0306 03:01:42.576051 2800 prober_manager.go:356] "Failed to trigger a manual run" probe="Readiness"
Mar 6 03:01:43.592621 systemd[1]: Started sshd@14-10.128.0.87:22-20.161.92.111:41916.service - OpenSSH per-connection server daemon (20.161.92.111:41916).
Mar 6 03:01:43.830706 sshd[5662]: Accepted publickey for core from 20.161.92.111 port 41916 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:43.831507 sshd-session[5662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:43.841173 systemd-logind[1505]: New session 14 of user core.
Mar 6 03:01:43.846054 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 6 03:01:44.024575 sshd[5665]: Connection closed by 20.161.92.111 port 41916
Mar 6 03:01:44.025190 sshd-session[5662]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:44.032285 systemd[1]: sshd@14-10.128.0.87:22-20.161.92.111:41916.service: Deactivated successfully.
Mar 6 03:01:44.035930 systemd[1]: session-14.scope: Deactivated successfully.
Mar 6 03:01:44.037961 systemd-logind[1505]: Session 14 logged out. Waiting for processes to exit.
Mar 6 03:01:44.040811 systemd-logind[1505]: Removed session 14.
Mar 6 03:01:44.074832 systemd[1]: Started sshd@15-10.128.0.87:22-20.161.92.111:41920.service - OpenSSH per-connection server daemon (20.161.92.111:41920).
Mar 6 03:01:44.295139 sshd[5678]: Accepted publickey for core from 20.161.92.111 port 41920 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:44.296884 sshd-session[5678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:44.305269 systemd-logind[1505]: New session 15 of user core.
Mar 6 03:01:44.313987 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 6 03:01:44.559603 sshd[5681]: Connection closed by 20.161.92.111 port 41920
Mar 6 03:01:44.562361 sshd-session[5678]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:44.574464 systemd[1]: sshd@15-10.128.0.87:22-20.161.92.111:41920.service: Deactivated successfully.
Mar 6 03:01:44.575207 systemd-logind[1505]: Session 15 logged out. Waiting for processes to exit.
Mar 6 03:01:44.582536 systemd[1]: session-15.scope: Deactivated successfully.
Mar 6 03:01:44.588366 systemd-logind[1505]: Removed session 15.
Mar 6 03:01:44.614171 systemd[1]: Started sshd@16-10.128.0.87:22-20.161.92.111:41932.service - OpenSSH per-connection server daemon (20.161.92.111:41932).
Mar 6 03:01:44.879319 sshd[5691]: Accepted publickey for core from 20.161.92.111 port 41932 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:44.879658 sshd-session[5691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:44.886860 systemd-logind[1505]: New session 16 of user core.
Mar 6 03:01:44.895034 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 6 03:01:45.103421 sshd[5694]: Connection closed by 20.161.92.111 port 41932
Mar 6 03:01:45.105105 sshd-session[5691]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:45.110984 systemd[1]: sshd@16-10.128.0.87:22-20.161.92.111:41932.service: Deactivated successfully.
Mar 6 03:01:45.115199 systemd[1]: session-16.scope: Deactivated successfully.
Mar 6 03:01:45.117747 systemd-logind[1505]: Session 16 logged out. Waiting for processes to exit.
Mar 6 03:01:45.119671 systemd-logind[1505]: Removed session 16.
Mar 6 03:01:50.150381 systemd[1]: Started sshd@17-10.128.0.87:22-20.161.92.111:39590.service - OpenSSH per-connection server daemon (20.161.92.111:39590).
Mar 6 03:01:50.393710 sshd[5718]: Accepted publickey for core from 20.161.92.111 port 39590 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:50.395643 sshd-session[5718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:50.403248 systemd-logind[1505]: New session 17 of user core.
Mar 6 03:01:50.410012 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 6 03:01:50.596164 sshd[5723]: Connection closed by 20.161.92.111 port 39590
Mar 6 03:01:50.597082 sshd-session[5718]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:50.603054 systemd[1]: sshd@17-10.128.0.87:22-20.161.92.111:39590.service: Deactivated successfully.
Mar 6 03:01:50.606744 systemd[1]: session-17.scope: Deactivated successfully.
Mar 6 03:01:50.608486 systemd-logind[1505]: Session 17 logged out. Waiting for processes to exit.
Mar 6 03:01:50.611076 systemd-logind[1505]: Removed session 17.
Mar 6 03:01:50.645382 systemd[1]: Started sshd@18-10.128.0.87:22-20.161.92.111:39594.service - OpenSSH per-connection server daemon (20.161.92.111:39594).
Mar 6 03:01:50.858149 sshd[5735]: Accepted publickey for core from 20.161.92.111 port 39594 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:50.859930 sshd-session[5735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:50.868083 systemd-logind[1505]: New session 18 of user core.
Mar 6 03:01:50.871968 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 6 03:01:51.102380 sshd[5738]: Connection closed by 20.161.92.111 port 39594
Mar 6 03:01:51.103097 sshd-session[5735]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:51.109070 systemd[1]: sshd@18-10.128.0.87:22-20.161.92.111:39594.service: Deactivated successfully.
Mar 6 03:01:51.112382 systemd[1]: session-18.scope: Deactivated successfully.
Mar 6 03:01:51.116340 systemd-logind[1505]: Session 18 logged out. Waiting for processes to exit.
Mar 6 03:01:51.118677 systemd-logind[1505]: Removed session 18.
Mar 6 03:01:51.147498 systemd[1]: Started sshd@19-10.128.0.87:22-20.161.92.111:39608.service - OpenSSH per-connection server daemon (20.161.92.111:39608).
Mar 6 03:01:51.367983 sshd[5748]: Accepted publickey for core from 20.161.92.111 port 39608 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:51.369596 sshd-session[5748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:51.377505 systemd-logind[1505]: New session 19 of user core.
Mar 6 03:01:51.381988 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 6 03:01:52.375913 sshd[5751]: Connection closed by 20.161.92.111 port 39608
Mar 6 03:01:52.378127 sshd-session[5748]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:52.387545 systemd[1]: sshd@19-10.128.0.87:22-20.161.92.111:39608.service: Deactivated successfully.
Mar 6 03:01:52.395863 systemd[1]: session-19.scope: Deactivated successfully.
Mar 6 03:01:52.400513 systemd-logind[1505]: Session 19 logged out. Waiting for processes to exit.
Mar 6 03:01:52.406349 systemd-logind[1505]: Removed session 19.
Mar 6 03:01:52.433048 systemd[1]: Started sshd@20-10.128.0.87:22-20.161.92.111:39616.service - OpenSSH per-connection server daemon (20.161.92.111:39616).
Mar 6 03:01:52.698736 sshd[5770]: Accepted publickey for core from 20.161.92.111 port 39616 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:52.700564 sshd-session[5770]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:52.707870 systemd-logind[1505]: New session 20 of user core.
Mar 6 03:01:52.713043 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 6 03:01:53.149954 sshd[5777]: Connection closed by 20.161.92.111 port 39616
Mar 6 03:01:53.150803 sshd-session[5770]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:53.157902 systemd[1]: sshd@20-10.128.0.87:22-20.161.92.111:39616.service: Deactivated successfully.
Mar 6 03:01:53.161514 systemd[1]: session-20.scope: Deactivated successfully.
Mar 6 03:01:53.163529 systemd-logind[1505]: Session 20 logged out. Waiting for processes to exit.
Mar 6 03:01:53.167529 systemd-logind[1505]: Removed session 20.
Mar 6 03:01:53.201439 systemd[1]: Started sshd@21-10.128.0.87:22-20.161.92.111:39620.service - OpenSSH per-connection server daemon (20.161.92.111:39620).
Mar 6 03:01:53.443790 sshd[5810]: Accepted publickey for core from 20.161.92.111 port 39620 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:53.444554 sshd-session[5810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:53.453852 systemd-logind[1505]: New session 21 of user core.
Mar 6 03:01:53.459058 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 6 03:01:53.634943 sshd[5815]: Connection closed by 20.161.92.111 port 39620
Mar 6 03:01:53.638444 sshd-session[5810]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:53.649875 systemd-logind[1505]: Session 21 logged out. Waiting for processes to exit.
Mar 6 03:01:53.651040 systemd[1]: sshd@21-10.128.0.87:22-20.161.92.111:39620.service: Deactivated successfully.
Mar 6 03:01:53.657852 systemd[1]: session-21.scope: Deactivated successfully.
Mar 6 03:01:53.663010 systemd-logind[1505]: Removed session 21.
Mar 6 03:01:58.686489 systemd[1]: Started sshd@22-10.128.0.87:22-20.161.92.111:39626.service - OpenSSH per-connection server daemon (20.161.92.111:39626).
Mar 6 03:01:58.931636 sshd[5879]: Accepted publickey for core from 20.161.92.111 port 39626 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:01:58.933935 sshd-session[5879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:01:58.941289 systemd-logind[1505]: New session 22 of user core.
Mar 6 03:01:58.948062 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 6 03:01:59.141181 sshd[5882]: Connection closed by 20.161.92.111 port 39626
Mar 6 03:01:59.142157 sshd-session[5879]: pam_unix(sshd:session): session closed for user core
Mar 6 03:01:59.150371 systemd[1]: sshd@22-10.128.0.87:22-20.161.92.111:39626.service: Deactivated successfully.
Mar 6 03:01:59.154084 systemd[1]: session-22.scope: Deactivated successfully.
Mar 6 03:01:59.156526 systemd-logind[1505]: Session 22 logged out. Waiting for processes to exit.
Mar 6 03:01:59.159501 systemd-logind[1505]: Removed session 22.
Mar 6 03:02:04.190908 systemd[1]: Started sshd@23-10.128.0.87:22-20.161.92.111:33700.service - OpenSSH per-connection server daemon (20.161.92.111:33700).
Mar 6 03:02:04.423722 sshd[5919]: Accepted publickey for core from 20.161.92.111 port 33700 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:02:04.425492 sshd-session[5919]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:02:04.432712 systemd-logind[1505]: New session 23 of user core.
Mar 6 03:02:04.439283 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 6 03:02:04.604306 sshd[5922]: Connection closed by 20.161.92.111 port 33700
Mar 6 03:02:04.605455 sshd-session[5919]: pam_unix(sshd:session): session closed for user core
Mar 6 03:02:04.612564 systemd[1]: sshd@23-10.128.0.87:22-20.161.92.111:33700.service: Deactivated successfully.
Mar 6 03:02:04.617306 systemd[1]: session-23.scope: Deactivated successfully.
Mar 6 03:02:04.619375 systemd-logind[1505]: Session 23 logged out. Waiting for processes to exit.
Mar 6 03:02:04.622166 systemd-logind[1505]: Removed session 23.
Mar 6 03:02:09.651604 systemd[1]: Started sshd@24-10.128.0.87:22-20.161.92.111:33704.service - OpenSSH per-connection server daemon (20.161.92.111:33704).
Mar 6 03:02:09.891000 sshd[5957]: Accepted publickey for core from 20.161.92.111 port 33704 ssh2: RSA SHA256:ZdrmvhMusRUPQ+0Cwgv/b10pj14AUU+Sa1rq8M+qNwg
Mar 6 03:02:09.892895 sshd-session[5957]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 6 03:02:09.899864 systemd-logind[1505]: New session 24 of user core.
Mar 6 03:02:09.906022 systemd[1]: Started session-24.scope - Session 24 of User core.
Mar 6 03:02:10.092866 sshd[5960]: Connection closed by 20.161.92.111 port 33704
Mar 6 03:02:10.093696 sshd-session[5957]: pam_unix(sshd:session): session closed for user core
Mar 6 03:02:10.101418 systemd[1]: sshd@24-10.128.0.87:22-20.161.92.111:33704.service: Deactivated successfully.
Mar 6 03:02:10.107199 systemd[1]: session-24.scope: Deactivated successfully.
Mar 6 03:02:10.110056 systemd-logind[1505]: Session 24 logged out. Waiting for processes to exit.
Mar 6 03:02:10.113113 systemd-logind[1505]: Removed session 24.