Sep 12 17:51:11.157935 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 15:34:39 -00 2025
Sep 12 17:51:11.157983 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:51:11.158002 kernel: BIOS-provided physical RAM map:
Sep 12 17:51:11.158014 kernel: BIOS-e820: [mem 0x0000000000000000-0x0000000000000fff] reserved
Sep 12 17:51:11.158026 kernel: BIOS-e820: [mem 0x0000000000001000-0x0000000000054fff] usable
Sep 12 17:51:11.158048 kernel: BIOS-e820: [mem 0x0000000000055000-0x000000000005ffff] reserved
Sep 12 17:51:11.158069 kernel: BIOS-e820: [mem 0x0000000000060000-0x0000000000097fff] usable
Sep 12 17:51:11.158081 kernel: BIOS-e820: [mem 0x0000000000098000-0x000000000009ffff] reserved
Sep 12 17:51:11.158094 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bd329fff] usable
Sep 12 17:51:11.158107 kernel: BIOS-e820: [mem 0x00000000bd32a000-0x00000000bd331fff] ACPI data
Sep 12 17:51:11.158121 kernel: BIOS-e820: [mem 0x00000000bd332000-0x00000000bf8ecfff] usable
Sep 12 17:51:11.158133 kernel: BIOS-e820: [mem 0x00000000bf8ed000-0x00000000bfb6cfff] reserved
Sep 12 17:51:11.158146 kernel: BIOS-e820: [mem 0x00000000bfb6d000-0x00000000bfb7efff] ACPI data
Sep 12 17:51:11.158158 kernel: BIOS-e820: [mem 0x00000000bfb7f000-0x00000000bfbfefff] ACPI NVS
Sep 12 17:51:11.158176 kernel: BIOS-e820: [mem 0x00000000bfbff000-0x00000000bffdffff] usable
Sep 12 17:51:11.158185 kernel: BIOS-e820: [mem 0x00000000bffe0000-0x00000000bfffffff] reserved
Sep 12 17:51:11.158195 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000021fffffff] usable
Sep 12 17:51:11.158204 kernel: NX (Execute Disable) protection: active
Sep 12 17:51:11.158213 kernel: APIC: Static calls initialized
Sep 12 17:51:11.158222 kernel: efi: EFI v2.7 by EDK II
Sep 12 17:51:11.158232 kernel: efi: TPMFinalLog=0xbfbf7000 ACPI=0xbfb7e000 ACPI 2.0=0xbfb7e014 SMBIOS=0xbf9e8000 RNG=0xbfb73018 TPMEventLog=0xbd32a018
Sep 12 17:51:11.158241 kernel: random: crng init done
Sep 12 17:51:11.158253 kernel: secureboot: Secure boot disabled
Sep 12 17:51:11.158262 kernel: SMBIOS 2.4 present.
Sep 12 17:51:11.158271 kernel: DMI: Google Google Compute Engine/Google Compute Engine, BIOS Google 08/14/2025
Sep 12 17:51:11.158282 kernel: DMI: Memory slots populated: 1/1
Sep 12 17:51:11.158292 kernel: Hypervisor detected: KVM
Sep 12 17:51:11.158301 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 12 17:51:11.158310 kernel: kvm-clock: using sched offset of 14696038014 cycles
Sep 12 17:51:11.158320 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 17:51:11.158329 kernel: tsc: Detected 2299.998 MHz processor
Sep 12 17:51:11.158339 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 17:51:11.158352 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 17:51:11.158361 kernel: last_pfn = 0x220000 max_arch_pfn = 0x400000000
Sep 12 17:51:11.158371 kernel: MTRR map: 3 entries (2 fixed + 1 variable; max 18), built from 8 variable MTRRs
Sep 12 17:51:11.158380 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 17:51:11.158389 kernel: last_pfn = 0xbffe0 max_arch_pfn = 0x400000000
Sep 12 17:51:11.158399 kernel: Using GB pages for direct mapping
Sep 12 17:51:11.158408 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:51:11.158418 kernel: ACPI: RSDP 0x00000000BFB7E014 000024 (v02 Google)
Sep 12 17:51:11.158434 kernel: ACPI: XSDT 0x00000000BFB7D0E8 00005C (v01 Google GOOGFACP 00000001 01000013)
Sep 12 17:51:11.158444 kernel: ACPI: FACP 0x00000000BFB78000 0000F4 (v02 Google GOOGFACP 00000001 GOOG 00000001)
Sep 12 17:51:11.158477 kernel: ACPI: DSDT 0x00000000BFB79000 001A64 (v01 Google GOOGDSDT 00000001 GOOG 00000001)
Sep 12 17:51:11.158494 kernel: ACPI: FACS 0x00000000BFBF2000 000040
Sep 12 17:51:11.158511 kernel: ACPI: SSDT 0x00000000BFB7C000 000316 (v02 GOOGLE Tpm2Tabl 00001000 INTL 20250404)
Sep 12 17:51:11.158521 kernel: ACPI: TPM2 0x00000000BFB7B000 000034 (v04 GOOGLE 00000001 GOOG 00000001)
Sep 12 17:51:11.158534 kernel: ACPI: SRAT 0x00000000BFB77000 0000C8 (v03 Google GOOGSRAT 00000001 GOOG 00000001)
Sep 12 17:51:11.158544 kernel: ACPI: APIC 0x00000000BFB76000 000076 (v05 Google GOOGAPIC 00000001 GOOG 00000001)
Sep 12 17:51:11.158554 kernel: ACPI: SSDT 0x00000000BFB75000 000980 (v01 Google GOOGSSDT 00000001 GOOG 00000001)
Sep 12 17:51:11.158564 kernel: ACPI: WAET 0x00000000BFB74000 000028 (v01 Google GOOGWAET 00000001 GOOG 00000001)
Sep 12 17:51:11.158574 kernel: ACPI: Reserving FACP table memory at [mem 0xbfb78000-0xbfb780f3]
Sep 12 17:51:11.158583 kernel: ACPI: Reserving DSDT table memory at [mem 0xbfb79000-0xbfb7aa63]
Sep 12 17:51:11.158593 kernel: ACPI: Reserving FACS table memory at [mem 0xbfbf2000-0xbfbf203f]
Sep 12 17:51:11.158603 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb7c000-0xbfb7c315]
Sep 12 17:51:11.158612 kernel: ACPI: Reserving TPM2 table memory at [mem 0xbfb7b000-0xbfb7b033]
Sep 12 17:51:11.158625 kernel: ACPI: Reserving SRAT table memory at [mem 0xbfb77000-0xbfb770c7]
Sep 12 17:51:11.158634 kernel: ACPI: Reserving APIC table memory at [mem 0xbfb76000-0xbfb76075]
Sep 12 17:51:11.158644 kernel: ACPI: Reserving SSDT table memory at [mem 0xbfb75000-0xbfb7597f]
Sep 12 17:51:11.158654 kernel: ACPI: Reserving WAET table memory at [mem 0xbfb74000-0xbfb74027]
Sep 12 17:51:11.158664 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Sep 12 17:51:11.158673 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0xbfffffff]
Sep 12 17:51:11.158683 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x21fffffff]
Sep 12 17:51:11.158693 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0xbfffffff] -> [mem 0x00001000-0xbfffffff]
Sep 12 17:51:11.158703 kernel: NUMA: Node 0 [mem 0x00001000-0xbfffffff] + [mem 0x100000000-0x21fffffff] -> [mem 0x00001000-0x21fffffff]
Sep 12 17:51:11.158716 kernel: NODE_DATA(0) allocated [mem 0x21fff6dc0-0x21fffdfff]
Sep 12 17:51:11.158726 kernel: Zone ranges:
Sep 12 17:51:11.158735 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 17:51:11.158745 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Sep 12 17:51:11.158755 kernel: Normal [mem 0x0000000100000000-0x000000021fffffff]
Sep 12 17:51:11.158764 kernel: Device empty
Sep 12 17:51:11.158774 kernel: Movable zone start for each node
Sep 12 17:51:11.158784 kernel: Early memory node ranges
Sep 12 17:51:11.158794 kernel: node 0: [mem 0x0000000000001000-0x0000000000054fff]
Sep 12 17:51:11.158803 kernel: node 0: [mem 0x0000000000060000-0x0000000000097fff]
Sep 12 17:51:11.158816 kernel: node 0: [mem 0x0000000000100000-0x00000000bd329fff]
Sep 12 17:51:11.158825 kernel: node 0: [mem 0x00000000bd332000-0x00000000bf8ecfff]
Sep 12 17:51:11.158835 kernel: node 0: [mem 0x00000000bfbff000-0x00000000bffdffff]
Sep 12 17:51:11.158845 kernel: node 0: [mem 0x0000000100000000-0x000000021fffffff]
Sep 12 17:51:11.158854 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000021fffffff]
Sep 12 17:51:11.158864 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 17:51:11.158874 kernel: On node 0, zone DMA: 11 pages in unavailable ranges
Sep 12 17:51:11.158884 kernel: On node 0, zone DMA: 104 pages in unavailable ranges
Sep 12 17:51:11.158893 kernel: On node 0, zone DMA32: 8 pages in unavailable ranges
Sep 12 17:51:11.158906 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Sep 12 17:51:11.158916 kernel: On node 0, zone Normal: 32 pages in unavailable ranges
Sep 12 17:51:11.158925 kernel: ACPI: PM-Timer IO Port: 0xb008
Sep 12 17:51:11.158935 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 12 17:51:11.158945 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 12 17:51:11.158955 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 12 17:51:11.158964 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 17:51:11.158974 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 12 17:51:11.158984 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 12 17:51:11.158996 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 17:51:11.159006 kernel: CPU topo: Max. logical packages: 1
Sep 12 17:51:11.159016 kernel: CPU topo: Max. logical dies: 1
Sep 12 17:51:11.159025 kernel: CPU topo: Max. dies per package: 1
Sep 12 17:51:11.159041 kernel: CPU topo: Max. threads per core: 2
Sep 12 17:51:11.159051 kernel: CPU topo: Num. cores per package: 1
Sep 12 17:51:11.159060 kernel: CPU topo: Num. threads per package: 2
Sep 12 17:51:11.159070 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 12 17:51:11.159080 kernel: [mem 0xc0000000-0xffffffff] available for PCI devices
Sep 12 17:51:11.159093 kernel: Booting paravirtualized kernel on KVM
Sep 12 17:51:11.159103 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 17:51:11.159113 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 12 17:51:11.159122 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 12 17:51:11.159132 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 12 17:51:11.159141 kernel: pcpu-alloc: [0] 0 1
Sep 12 17:51:11.159151 kernel: kvm-guest: PV spinlocks enabled
Sep 12 17:51:11.159161 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Sep 12 17:51:11.159172 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:51:11.159185 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:51:11.159195 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 12 17:51:11.159205 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 17:51:11.159214 kernel: Fallback order for Node 0: 0
Sep 12 17:51:11.159224 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1965138
Sep 12 17:51:11.159234 kernel: Policy zone: Normal
Sep 12 17:51:11.159244 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:51:11.159254 kernel: software IO TLB: area num 2.
Sep 12 17:51:11.159276 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 17:51:11.159288 kernel: Kernel/User page tables isolation: enabled
Sep 12 17:51:11.159299 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 12 17:51:11.159312 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 17:51:11.159322 kernel: Dynamic Preempt: voluntary
Sep 12 17:51:11.159332 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:51:11.159344 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:51:11.159355 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 17:51:11.159369 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:51:11.159379 kernel: Rude variant of Tasks RCU enabled.
Sep 12 17:51:11.159390 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:51:11.159400 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 17:51:11.159410 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 17:51:11.159420 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:51:11.159431 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:51:11.159441 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:51:11.159464 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 12 17:51:11.159486 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:51:11.159496 kernel: Console: colour dummy device 80x25
Sep 12 17:51:11.159506 kernel: printk: legacy console [ttyS0] enabled
Sep 12 17:51:11.159517 kernel: ACPI: Core revision 20240827
Sep 12 17:51:11.159527 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 17:51:11.159537 kernel: x2apic enabled
Sep 12 17:51:11.159548 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 17:51:11.159558 kernel: ..TIMER: vector=0x30 apic1=0 pin1=0 apic2=-1 pin2=-1
Sep 12 17:51:11.159568 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Sep 12 17:51:11.159582 kernel: Calibrating delay loop (skipped) preset value.. 4599.99 BogoMIPS (lpj=2299998)
Sep 12 17:51:11.159592 kernel: Last level iTLB entries: 4KB 1024, 2MB 1024, 4MB 1024
Sep 12 17:51:11.159602 kernel: Last level dTLB entries: 4KB 1024, 2MB 1024, 4MB 1024, 1GB 4
Sep 12 17:51:11.159613 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 17:51:11.159623 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall and VM exit
Sep 12 17:51:11.159633 kernel: Spectre V2 : Mitigation: IBRS
Sep 12 17:51:11.159644 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 12 17:51:11.159654 kernel: RETBleed: Mitigation: IBRS
Sep 12 17:51:11.159665 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 17:51:11.159678 kernel: Spectre V2 : User space: Mitigation: STIBP via prctl
Sep 12 17:51:11.159688 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 17:51:11.159698 kernel: MDS: Mitigation: Clear CPU buffers
Sep 12 17:51:11.159709 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Sep 12 17:51:11.159719 kernel: active return thunk: its_return_thunk
Sep 12 17:51:11.159729 kernel: ITS: Mitigation: Aligned branch/return thunks
Sep 12 17:51:11.159740 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 17:51:11.159750 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 17:51:11.159760 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 17:51:11.159773 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 17:51:11.159784 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Sep 12 17:51:11.159794 kernel: Freeing SMP alternatives memory: 32K
Sep 12 17:51:11.159804 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:51:11.159814 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 17:51:11.159825 kernel: landlock: Up and running.
Sep 12 17:51:11.159835 kernel: SELinux: Initializing.
Sep 12 17:51:11.159845 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 12 17:51:11.159855 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 12 17:51:11.159869 kernel: smpboot: CPU0: Intel(R) Xeon(R) CPU @ 2.30GHz (family: 0x6, model: 0x3f, stepping: 0x0)
Sep 12 17:51:11.159879 kernel: Performance Events: unsupported p6 CPU model 63 no PMU driver, software events only.
Sep 12 17:51:11.159889 kernel: signal: max sigframe size: 1776
Sep 12 17:51:11.159900 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:51:11.159910 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:51:11.159920 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 17:51:11.159931 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Sep 12 17:51:11.159941 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:51:11.159954 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 17:51:11.159964 kernel: .... node #0, CPUs: #1
Sep 12 17:51:11.159975 kernel: MDS CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/mds.html for more details.
Sep 12 17:51:11.159986 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details.
Sep 12 17:51:11.159996 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 17:51:11.160006 kernel: smpboot: Total of 2 processors activated (9199.99 BogoMIPS)
Sep 12 17:51:11.160017 kernel: Memory: 7564272K/7860552K available (14336K kernel code, 2432K rwdata, 9960K rodata, 54040K init, 2924K bss, 290712K reserved, 0K cma-reserved)
Sep 12 17:51:11.160027 kernel: devtmpfs: initialized
Sep 12 17:51:11.160044 kernel: x86/mm: Memory block size: 128MB
Sep 12 17:51:11.160057 kernel: ACPI: PM: Registering ACPI NVS region [mem 0xbfb7f000-0xbfbfefff] (524288 bytes)
Sep 12 17:51:11.160068 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:51:11.160078 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 17:51:11.160088 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:51:11.160098 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:51:11.160108 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:51:11.160119 kernel: audit: type=2000 audit(1757699467.383:1): state=initialized audit_enabled=0 res=1
Sep 12 17:51:11.160129 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:51:11.160139 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 17:51:11.160152 kernel: cpuidle: using governor menu
Sep 12 17:51:11.160162 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:51:11.160173 kernel: dca service started, version 1.12.1
Sep 12 17:51:11.160183 kernel: PCI: Using configuration type 1 for base access
Sep 12 17:51:11.160193 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 17:51:11.160203 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:51:11.160214 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:51:11.160224 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:51:11.160234 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:51:11.160248 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:51:11.160258 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:51:11.160268 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:51:11.160278 kernel: ACPI: 3 ACPI AML tables successfully acquired and loaded
Sep 12 17:51:11.160291 kernel: ACPI: Interpreter enabled
Sep 12 17:51:11.160301 kernel: ACPI: PM: (supports S0 S3 S5)
Sep 12 17:51:11.160311 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 17:51:11.160321 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 17:51:11.160332 kernel: PCI: Ignoring E820 reservations for host bridge windows
Sep 12 17:51:11.160345 kernel: ACPI: Enabled 16 GPEs in block 00 to 0F
Sep 12 17:51:11.160356 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 17:51:11.160574 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 17:51:11.160707 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
Sep 12 17:51:11.160827 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
Sep 12 17:51:11.160841 kernel: PCI host bridge to bus 0000:00
Sep 12 17:51:11.160960 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 17:51:11.161086 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 12 17:51:11.161195 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 17:51:11.161305 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfefff window]
Sep 12 17:51:11.161412 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 17:51:11.161572 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 12 17:51:11.161706 kernel: pci 0000:00:01.0: [8086:7110] type 00 class 0x060100 conventional PCI endpoint
Sep 12 17:51:11.161841 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 12 17:51:11.161961 kernel: pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI
Sep 12 17:51:11.162094 kernel: pci 0000:00:03.0: [1af4:1004] type 00 class 0x000000 conventional PCI endpoint
Sep 12 17:51:11.162216 kernel: pci 0000:00:03.0: BAR 0 [io 0xc040-0xc07f]
Sep 12 17:51:11.162339 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc0001000-0xc000107f]
Sep 12 17:51:11.162491 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 12 17:51:11.162631 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc03f]
Sep 12 17:51:11.162751 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc0000000-0xc000007f]
Sep 12 17:51:11.162877 kernel: pci 0000:00:05.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 12 17:51:11.162996 kernel: pci 0000:00:05.0: BAR 0 [io 0xc080-0xc09f]
Sep 12 17:51:11.163125 kernel: pci 0000:00:05.0: BAR 1 [mem 0xc0002000-0xc000203f]
Sep 12 17:51:11.163139 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 12 17:51:11.163149 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 12 17:51:11.163164 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 12 17:51:11.163174 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 12 17:51:11.163184 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 12 17:51:11.163195 kernel: iommu: Default domain type: Translated
Sep 12 17:51:11.163205 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 12 17:51:11.163216 kernel: efivars: Registered efivars operations
Sep 12 17:51:11.163226 kernel: PCI: Using ACPI for IRQ routing
Sep 12 17:51:11.163236 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 12 17:51:11.163247 kernel: e820: reserve RAM buffer [mem 0x00055000-0x0005ffff]
Sep 12 17:51:11.163257 kernel: e820: reserve RAM buffer [mem 0x00098000-0x0009ffff]
Sep 12 17:51:11.163270 kernel: e820: reserve RAM buffer [mem 0xbd32a000-0xbfffffff]
Sep 12 17:51:11.163282 kernel: e820: reserve RAM buffer [mem 0xbf8ed000-0xbfffffff]
Sep 12 17:51:11.163293 kernel: e820: reserve RAM buffer [mem 0xbffe0000-0xbfffffff]
Sep 12 17:51:11.163303 kernel: vgaarb: loaded
Sep 12 17:51:11.163313 kernel: clocksource: Switched to clocksource kvm-clock
Sep 12 17:51:11.163323 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 17:51:11.163334 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 17:51:11.163344 kernel: pnp: PnP ACPI init
Sep 12 17:51:11.163358 kernel: pnp: PnP ACPI: found 7 devices
Sep 12 17:51:11.163369 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 12 17:51:11.163379 kernel: NET: Registered PF_INET protocol family
Sep 12 17:51:11.163389 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 12 17:51:11.163400 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 12 17:51:11.163411 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 17:51:11.163421 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 12 17:51:11.163431 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
Sep 12 17:51:11.163442 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 12 17:51:11.163469 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 12 17:51:11.163487 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 12 17:51:11.163504 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 17:51:11.163515 kernel: NET: Registered PF_XDP protocol family
Sep 12 17:51:11.163636 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 12 17:51:11.163746 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 12 17:51:11.163854 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 12 17:51:11.163982 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfefff window]
Sep 12 17:51:11.164117 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 12 17:51:11.164131 kernel: PCI: CLS 0 bytes, default 64
Sep 12 17:51:11.164142 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 12 17:51:11.164153 kernel: software IO TLB: mapped [mem 0x00000000b7f7f000-0x00000000bbf7f000] (64MB)
Sep 12 17:51:11.164163 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Sep 12 17:51:11.164174 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212733415c7, max_idle_ns: 440795236380 ns
Sep 12 17:51:11.164185 kernel: clocksource: Switched to clocksource tsc
Sep 12 17:51:11.164195 kernel: Initialise system trusted keyrings
Sep 12 17:51:11.164209 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
Sep 12 17:51:11.164220 kernel: Key type asymmetric registered
Sep 12 17:51:11.164230 kernel: Asymmetric key parser 'x509' registered
Sep 12 17:51:11.164240 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 12 17:51:11.164251 kernel: io scheduler mq-deadline registered
Sep 12 17:51:11.164261 kernel: io scheduler kyber registered
Sep 12 17:51:11.164271 kernel: io scheduler bfq registered
Sep 12 17:51:11.164284 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 12 17:51:11.164295 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 12 17:51:11.164420 kernel: virtio-pci 0000:00:03.0: virtio_pci: leaving for legacy driver
Sep 12 17:51:11.164434 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 10
Sep 12 17:51:11.164619 kernel: virtio-pci 0000:00:04.0: virtio_pci: leaving for legacy driver
Sep 12 17:51:11.164646 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 12 17:51:11.164839 kernel: virtio-pci 0000:00:05.0: virtio_pci: leaving for legacy driver
Sep 12 17:51:11.164862 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 17:51:11.164880 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 12 17:51:11.164897 kernel: 00:04: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
Sep 12 17:51:11.164916 kernel: 00:05: ttyS2 at I/O 0x3e8 (irq = 6, base_baud = 115200) is a 16550A
Sep 12 17:51:11.164945 kernel: 00:06: ttyS3 at I/O 0x2e8 (irq = 7, base_baud = 115200) is a 16550A
Sep 12 17:51:11.165150 kernel: tpm_tis MSFT0101:00: 2.0 TPM (device-id 0x9009, rev-id 0)
Sep 12 17:51:11.165176 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 12 17:51:11.165195 kernel: i8042: Warning: Keylock active
Sep 12 17:51:11.165212 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 12 17:51:11.165231 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 12 17:51:11.165416 kernel: rtc_cmos 00:00: RTC can wake from S4
Sep 12 17:51:11.165622 kernel: rtc_cmos 00:00: registered as rtc0
Sep 12 17:51:11.165802 kernel: rtc_cmos 00:00: setting system clock to 2025-09-12T17:51:10 UTC (1757699470)
Sep 12 17:51:11.165981 kernel: rtc_cmos 00:00: alarms up to one day, 114 bytes nvram
Sep 12 17:51:11.166004 kernel: intel_pstate: CPU model not supported
Sep 12 17:51:11.166023 kernel: pstore: Using crash dump compression: deflate
Sep 12 17:51:11.166053 kernel: pstore: Registered efi_pstore as persistent store backend
Sep 12 17:51:11.166071 kernel: NET: Registered PF_INET6 protocol family
Sep 12 17:51:11.166090 kernel: Segment Routing with IPv6
Sep 12 17:51:11.166108 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 17:51:11.166133 kernel: NET: Registered PF_PACKET protocol family
Sep 12 17:51:11.166151 kernel: Key type dns_resolver registered
Sep 12 17:51:11.166168 kernel: IPI shorthand broadcast: enabled
Sep 12 17:51:11.166187 kernel: sched_clock: Marking stable (3596004440, 152363648)->(3764380666, -16012578)
Sep 12 17:51:11.166206 kernel: registered taskstats version 1
Sep 12 17:51:11.166225 kernel: Loading compiled-in X.509 certificates
Sep 12 17:51:11.166243 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: f1ae8d6e9bfae84d90f4136cf098b0465b2a5bd7'
Sep 12 17:51:11.166261 kernel: Demotion targets for Node 0: null
Sep 12 17:51:11.166280 kernel: Key type .fscrypt registered
Sep 12 17:51:11.166302 kernel: Key type fscrypt-provisioning registered
Sep 12 17:51:11.166320 kernel: ima: Allocated hash algorithm: sha1
Sep 12 17:51:11.166338 kernel: ima: No architecture policies found
Sep 12 17:51:11.166356 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 12 17:51:11.166375 kernel: clk: Disabling unused clocks
Sep 12 17:51:11.166393 kernel: Warning: unable to open an initial console.
Sep 12 17:51:11.166412 kernel: Freeing unused kernel image (initmem) memory: 54040K
Sep 12 17:51:11.166429 kernel: Write protecting the kernel read-only data: 24576k
Sep 12 17:51:11.166475 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K
Sep 12 17:51:11.166495 kernel: Run /init as init process
Sep 12 17:51:11.166511 kernel: with arguments:
Sep 12 17:51:11.166525 kernel: /init
Sep 12 17:51:11.166541 kernel: with environment:
Sep 12 17:51:11.166576 kernel: HOME=/
Sep 12 17:51:11.166592 kernel: TERM=linux
Sep 12 17:51:11.166608 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 17:51:11.166627 systemd[1]: Successfully made /usr/ read-only.
Sep 12 17:51:11.166655 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 17:51:11.166675 systemd[1]: Detected virtualization google.
Sep 12 17:51:11.166693 systemd[1]: Detected architecture x86-64.
Sep 12 17:51:11.166711 systemd[1]: Running in initrd.
Sep 12 17:51:11.166729 systemd[1]: No hostname configured, using default hostname.
Sep 12 17:51:11.166748 systemd[1]: Hostname set to .
Sep 12 17:51:11.166766 systemd[1]: Initializing machine ID from random generator.
Sep 12 17:51:11.166788 systemd[1]: Queued start job for default target initrd.target.
Sep 12 17:51:11.166825 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:51:11.166848 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:51:11.166868 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 17:51:11.166887 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:51:11.166907 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 17:51:11.166932 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 17:51:11.166952 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 17:51:11.166972 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 17:51:11.166991 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:51:11.167010 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:51:11.167036 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:51:11.167056 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:51:11.167079 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:51:11.167098 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:51:11.167118 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:51:11.167137 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:51:11.167156 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 17:51:11.167175 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 12 17:51:11.167195 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:51:11.167214 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:51:11.167233 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:51:11.167256 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:51:11.167275 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 17:51:11.167295 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:51:11.167314 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 17:51:11.167334 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 12 17:51:11.167353 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 17:51:11.167372 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:51:11.167391 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:51:11.167415 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:51:11.167434 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 17:51:11.167507 systemd-journald[206]: Collecting audit messages is disabled.
Sep 12 17:51:11.167554 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:51:11.167574 systemd-journald[206]: Journal started
Sep 12 17:51:11.167613 systemd-journald[206]: Runtime Journal (/run/log/journal/2156a70ea93a42f195ad79869f763a1d) is 8M, max 148.9M, 140.9M free.
Sep 12 17:51:11.171481 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:51:11.176522 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 17:51:11.185685 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:51:11.188626 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:51:11.198837 systemd-modules-load[208]: Inserted module 'overlay'
Sep 12 17:51:11.219566 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:51:11.228742 systemd-tmpfiles[218]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 12 17:51:11.231211 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:51:11.242701 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:51:11.257847 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 17:51:11.252934 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:51:11.259398 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:51:11.260243 systemd-modules-load[208]: Inserted module 'br_netfilter'
Sep 12 17:51:11.260483 kernel: Bridge firewalling registered
Sep 12 17:51:11.263005 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:51:11.269647 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:51:11.288022 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:51:11.295001 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:51:11.301856 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:51:11.304536 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 17:51:11.311635 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:51:11.338384 dracut-cmdline[244]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 flatcar.first_boot=detected flatcar.oem.id=gce verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:51:11.388793 systemd-resolved[245]: Positive Trust Anchors:
Sep 12 17:51:11.389349 systemd-resolved[245]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:51:11.389429 systemd-resolved[245]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:51:11.401171 systemd-resolved[245]: Defaulting to hostname 'linux'.
Sep 12 17:51:11.404640 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:51:11.410749 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:51:11.463500 kernel: SCSI subsystem initialized
Sep 12 17:51:11.475487 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:51:11.487496 kernel: iscsi: registered transport (tcp)
Sep 12 17:51:11.513505 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:51:11.513594 kernel: QLogic iSCSI HBA Driver
Sep 12 17:51:11.536905 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:51:11.559958 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:51:11.563099 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:51:11.625664 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:51:11.628752 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:51:11.690496 kernel: raid6: avx2x4 gen() 18279 MB/s
Sep 12 17:51:11.707487 kernel: raid6: avx2x2 gen() 18251 MB/s
Sep 12 17:51:11.724941 kernel: raid6: avx2x1 gen() 13968 MB/s
Sep 12 17:51:11.724999 kernel: raid6: using algorithm avx2x4 gen() 18279 MB/s
Sep 12 17:51:11.742988 kernel: raid6: .... xor() 7333 MB/s, rmw enabled
Sep 12 17:51:11.743053 kernel: raid6: using avx2x2 recovery algorithm
Sep 12 17:51:11.765493 kernel: xor: automatically using best checksumming function avx
Sep 12 17:51:11.949499 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:51:11.957868 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:51:11.965797 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:51:12.001177 systemd-udevd[454]: Using default interface naming scheme 'v255'.
Sep 12 17:51:12.010426 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:51:12.017643 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:51:12.047233 dracut-pre-trigger[461]: rd.md=0: removing MD RAID activation
Sep 12 17:51:12.080660 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:51:12.086343 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:51:12.176678 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:51:12.183227 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 17:51:12.297487 kernel: cryptd: max_cpu_qlen set to 1000
Sep 12 17:51:12.297567 kernel: virtio_scsi virtio0: 1/0/0 default/read/poll queues
Sep 12 17:51:12.331488 kernel: AES CTR mode by8 optimization enabled
Sep 12 17:51:12.343588 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
Sep 12 17:51:12.348497 kernel: scsi host0: Virtio SCSI HBA
Sep 12 17:51:12.359879 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:51:12.360093 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:51:12.363038 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:51:12.379905 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:51:12.389162 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 17:51:12.411062 kernel: scsi 0:0:1:0: Direct-Access Google PersistentDisk 1 PQ: 0 ANSI: 6
Sep 12 17:51:12.455201 kernel: sd 0:0:1:0: [sda] 25165824 512-byte logical blocks: (12.9 GB/12.0 GiB)
Sep 12 17:51:12.455569 kernel: sd 0:0:1:0: [sda] 4096-byte physical blocks
Sep 12 17:51:12.457474 kernel: sd 0:0:1:0: [sda] Write Protect is off
Sep 12 17:51:12.457758 kernel: sd 0:0:1:0: [sda] Mode Sense: 1f 00 00 08
Sep 12 17:51:12.457987 kernel: sd 0:0:1:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 17:51:12.459980 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:51:12.469969 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 17:51:12.470028 kernel: GPT:17805311 != 25165823
Sep 12 17:51:12.470055 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 17:51:12.470095 kernel: GPT:17805311 != 25165823
Sep 12 17:51:12.470484 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 17:51:12.471897 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:51:12.472985 kernel: sd 0:0:1:0: [sda] Attached SCSI disk
Sep 12 17:51:12.550582 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - PersistentDisk EFI-SYSTEM.
Sep 12 17:51:12.555472 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 17:51:12.580385 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - PersistentDisk ROOT.
Sep 12 17:51:12.600279 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - PersistentDisk USR-A.
Sep 12 17:51:12.600602 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - PersistentDisk USR-A.
Sep 12 17:51:12.620698 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM.
Sep 12 17:51:12.621060 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 17:51:12.628579 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 17:51:12.632551 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 17:51:12.637850 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 17:51:12.645646 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 17:51:12.665701 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 17:51:12.669002 disk-uuid[607]: Primary Header is updated.
Sep 12 17:51:12.669002 disk-uuid[607]: Secondary Entries is updated.
Sep 12 17:51:12.669002 disk-uuid[607]: Secondary Header is updated.
Sep 12 17:51:12.687480 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:51:13.717481 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:51:13.717693 disk-uuid[613]: The operation has completed successfully.
Sep 12 17:51:13.788636 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 12 17:51:13.788811 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 12 17:51:13.847233 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 12 17:51:13.863669 sh[629]: Success
Sep 12 17:51:13.886629 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 17:51:13.886710 kernel: device-mapper: uevent: version 1.0.3
Sep 12 17:51:13.886736 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 12 17:51:13.898771 kernel: device-mapper: verity: sha256 using shash "sha256-avx2"
Sep 12 17:51:13.978359 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 12 17:51:13.984575 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 12 17:51:14.002263 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 12 17:51:14.021502 kernel: BTRFS: device fsid 74707491-1b86-4926-8bdb-c533ce2a0c32 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (641)
Sep 12 17:51:14.024254 kernel: BTRFS info (device dm-0): first mount of filesystem 74707491-1b86-4926-8bdb-c533ce2a0c32
Sep 12 17:51:14.024322 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:51:14.046105 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 12 17:51:14.046216 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 12 17:51:14.046257 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 12 17:51:14.051245 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 12 17:51:14.052846 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 17:51:14.054987 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 12 17:51:14.057220 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 12 17:51:14.060721 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 12 17:51:14.103572 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (674)
Sep 12 17:51:14.107371 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:51:14.107449 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:51:14.117182 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:51:14.117268 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:51:14.117296 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:51:14.123488 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:51:14.125526 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 12 17:51:14.133576 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 12 17:51:14.270577 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 17:51:14.306548 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 17:51:14.387722 ignition[735]: Ignition 2.21.0
Sep 12 17:51:14.388212 ignition[735]: Stage: fetch-offline
Sep 12 17:51:14.391201 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 17:51:14.388280 ignition[735]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:51:14.394386 systemd-networkd[810]: lo: Link UP
Sep 12 17:51:14.388297 ignition[735]: no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Sep 12 17:51:14.394392 systemd-networkd[810]: lo: Gained carrier
Sep 12 17:51:14.388506 ignition[735]: parsed url from cmdline: ""
Sep 12 17:51:14.396530 systemd-networkd[810]: Enumeration completed
Sep 12 17:51:14.388514 ignition[735]: no config URL provided
Sep 12 17:51:14.397086 systemd-networkd[810]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:51:14.388524 ignition[735]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:51:14.397094 systemd-networkd[810]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 17:51:14.388550 ignition[735]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:51:14.398993 systemd-networkd[810]: eth0: Link UP
Sep 12 17:51:14.388561 ignition[735]: failed to fetch config: resource requires networking
Sep 12 17:51:14.399245 systemd-networkd[810]: eth0: Gained carrier
Sep 12 17:51:14.388893 ignition[735]: Ignition finished successfully
Sep 12 17:51:14.399264 systemd-networkd[810]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 17:51:14.404754 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 17:51:14.408652 systemd[1]: Reached target network.target - Network.
Sep 12 17:51:14.454503 ignition[819]: Ignition 2.21.0
Sep 12 17:51:14.411948 systemd-networkd[810]: eth0: DHCPv4 address 10.128.0.19/32, gateway 10.128.0.1 acquired from 169.254.169.254
Sep 12 17:51:14.454514 ignition[819]: Stage: fetch
Sep 12 17:51:14.412837 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 12 17:51:14.454690 ignition[819]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:51:14.454702 ignition[819]: no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Sep 12 17:51:14.472202 unknown[819]: fetched base config from "system"
Sep 12 17:51:14.454816 ignition[819]: parsed url from cmdline: ""
Sep 12 17:51:14.472213 unknown[819]: fetched base config from "system"
Sep 12 17:51:14.454820 ignition[819]: no config URL provided
Sep 12 17:51:14.472222 unknown[819]: fetched user config from "gcp"
Sep 12 17:51:14.454827 ignition[819]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 17:51:14.476682 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 12 17:51:14.454837 ignition[819]: no config at "/usr/lib/ignition/user.ign"
Sep 12 17:51:14.479575 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 12 17:51:14.454875 ignition[819]: GET http://169.254.169.254/computeMetadata/v1/instance/attributes/user-data: attempt #1
Sep 12 17:51:14.459619 ignition[819]: GET result: OK
Sep 12 17:51:14.459806 ignition[819]: parsing config with SHA512: 8b4bedf968f56d03b053a6fbd04a6c1f4b0cb2128d2a78ff67144dd8e825237001cea847146f720a787044b5d3e4bf82cf4fd24bde4923d98e45fbeb2368a2c5
Sep 12 17:51:14.473289 ignition[819]: fetch: fetch complete
Sep 12 17:51:14.473298 ignition[819]: fetch: fetch passed
Sep 12 17:51:14.473352 ignition[819]: Ignition finished successfully
Sep 12 17:51:14.522339 ignition[827]: Ignition 2.21.0
Sep 12 17:51:14.522357 ignition[827]: Stage: kargs
Sep 12 17:51:14.525886 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 12 17:51:14.522623 ignition[827]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:51:14.529963 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 12 17:51:14.522644 ignition[827]: no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Sep 12 17:51:14.523941 ignition[827]: kargs: kargs passed
Sep 12 17:51:14.524001 ignition[827]: Ignition finished successfully
Sep 12 17:51:14.562170 ignition[834]: Ignition 2.21.0
Sep 12 17:51:14.562188 ignition[834]: Stage: disks
Sep 12 17:51:14.562439 ignition[834]: no configs at "/usr/lib/ignition/base.d"
Sep 12 17:51:14.565996 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 12 17:51:14.562490 ignition[834]: no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Sep 12 17:51:14.567414 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 12 17:51:14.563799 ignition[834]: disks: disks passed
Sep 12 17:51:14.570785 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 12 17:51:14.563862 ignition[834]: Ignition finished successfully
Sep 12 17:51:14.577738 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 17:51:14.580778 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 17:51:14.584790 systemd[1]: Reached target basic.target - Basic System.
Sep 12 17:51:14.590172 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 17:51:14.625735 systemd-fsck[843]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 12 17:51:14.634112 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 12 17:51:14.640068 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 12 17:51:14.803498 kernel: EXT4-fs (sda9): mounted filesystem 26739aba-b0be-4ce3-bfbd-ca4dbcbe2426 r/w with ordered data mode. Quota mode: none.
Sep 12 17:51:14.804286 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 12 17:51:14.807254 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 12 17:51:14.812576 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:51:14.819275 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 12 17:51:14.827208 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Sep 12 17:51:14.827302 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 12 17:51:14.827347 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 17:51:14.842591 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (851)
Sep 12 17:51:14.842629 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:51:14.842653 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 12 17:51:14.846904 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 12 17:51:14.850074 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 12 17:51:14.854594 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 17:51:14.854635 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 17:51:14.854658 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 17:51:14.860581 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 17:51:14.959050 initrd-setup-root[875]: cut: /sysroot/etc/passwd: No such file or directory
Sep 12 17:51:14.969057 initrd-setup-root[882]: cut: /sysroot/etc/group: No such file or directory
Sep 12 17:51:14.976870 initrd-setup-root[889]: cut: /sysroot/etc/shadow: No such file or directory
Sep 12 17:51:14.984977 initrd-setup-root[896]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 12 17:51:15.129795 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 12 17:51:15.135309 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 17:51:15.139132 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 17:51:15.165063 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 17:51:15.166722 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761
Sep 12 17:51:15.196905 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 17:51:15.205208 ignition[963]: INFO : Ignition 2.21.0
Sep 12 17:51:15.205208 ignition[963]: INFO : Stage: mount
Sep 12 17:51:15.209630 ignition[963]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 17:51:15.209630 ignition[963]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp"
Sep 12 17:51:15.209630 ignition[963]: INFO : mount: mount passed
Sep 12 17:51:15.209630 ignition[963]: INFO : Ignition finished successfully
Sep 12 17:51:15.210792 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 17:51:15.219316 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 17:51:15.254908 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 17:51:15.283501 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (976) Sep 12 17:51:15.286876 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:51:15.286978 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:51:15.292629 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:51:15.292713 kernel: BTRFS info (device sda6): turning on async discard Sep 12 17:51:15.292738 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:51:15.296306 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:51:15.329534 ignition[993]: INFO : Ignition 2.21.0 Sep 12 17:51:15.329534 ignition[993]: INFO : Stage: files Sep 12 17:51:15.335603 ignition[993]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:51:15.335603 ignition[993]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 12 17:51:15.335603 ignition[993]: DEBUG : files: compiled without relabeling support, skipping Sep 12 17:51:15.335603 ignition[993]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 17:51:15.335603 ignition[993]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 17:51:15.351546 ignition[993]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 17:51:15.351546 ignition[993]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 17:51:15.351546 ignition[993]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 17:51:15.351546 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 17:51:15.351546 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 12 17:51:15.339392 unknown[993]: wrote ssh authorized keys file for user: core Sep 12 17:51:16.255725 systemd-networkd[810]: eth0: Gained IPv6LL Sep 12 17:51:16.821359 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 17:51:17.098340 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 17:51:17.098340 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: 
createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:51:17.107561 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 12 17:51:17.502377 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 17:51:18.300956 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 17:51:18.300956 ignition[993]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 17:51:18.308620 ignition[993]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:51:18.308620 ignition[993]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:51:18.308620 ignition[993]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 17:51:18.308620 ignition[993]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Sep 12 17:51:18.308620 ignition[993]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 17:51:18.308620 ignition[993]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:51:18.332870 ignition[993]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:51:18.332870 ignition[993]: INFO : files: files passed Sep 12 17:51:18.332870 ignition[993]: INFO : Ignition finished successfully Sep 12 17:51:18.311662 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 17:51:18.315189 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 17:51:18.325225 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 17:51:18.351735 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 17:51:18.351928 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
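Editor's note: the file, link, and unit operations logged above are driven by an Ignition config delivered to the instance (here via the GCE platform). As a rough, hypothetical illustration only — not the actual config used on this machine — the Python sketch below assembles a spec-3-style payload with analogous entries; field names follow the public Ignition spec, while the unit contents and SSH key are placeholders.

```python
# Hypothetical sketch of an Ignition (spec 3.x) config that would produce
# operations like those logged above. Values are illustrative placeholders.
import json

config = {
    "ignition": {"version": "3.4.0"},
    "passwd": {
        "users": [
            # corresponds to "ensureUsers ... user 'core'" and the ssh-key step
            {"name": "core", "sshAuthorizedKeys": ["ssh-ed25519 AAAA... user@host"]}
        ]
    },
    "storage": {
        "files": [
            {
                "path": "/opt/helm-v3.13.2-linux-amd64.tar.gz",
                "contents": {"source": "https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz"},
            }
        ],
        "links": [
            {
                "path": "/etc/extensions/kubernetes.raw",
                "target": "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw",
            }
        ],
    },
    "systemd": {
        "units": [
            {
                "name": "prepare-helm.service",
                "enabled": True,  # matches "setting preset to enabled"
                "contents": "[Unit]\nDescription=Unpack helm to /opt/bin\n",
            }
        ]
    },
}

print(json.dumps(config, indent=2))
```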
Sep 12 17:51:18.366638 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:51:18.366638 initrd-setup-root-after-ignition[1022]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:51:18.376588 initrd-setup-root-after-ignition[1027]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:51:18.367194 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:51:18.371286 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 17:51:18.378084 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 17:51:18.441317 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 17:51:18.441522 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 17:51:18.446443 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 17:51:18.452708 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 17:51:18.456734 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 17:51:18.458216 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 17:51:18.495699 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:51:18.502233 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 17:51:18.532213 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:51:18.538752 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:51:18.539171 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 17:51:18.546810 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 17:51:18.547047 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:51:18.555939 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 17:51:18.561840 systemd[1]: Stopped target basic.target - Basic System. Sep 12 17:51:18.564967 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 17:51:18.569030 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:51:18.572999 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 17:51:18.577037 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:51:18.581040 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 17:51:18.585062 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:51:18.589185 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 17:51:18.594074 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 17:51:18.598142 systemd[1]: Stopped target swap.target - Swaps. Sep 12 17:51:18.602021 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 17:51:18.602660 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:51:18.610057 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:51:18.613193 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:51:18.616975 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Sep 12 17:51:18.617397 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:51:18.620952 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 17:51:18.621377 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 17:51:18.629947 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 17:51:18.630563 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:51:18.633239 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 17:51:18.633696 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 17:51:18.639321 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 17:51:18.657789 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 17:51:18.663662 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 17:51:18.664169 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:51:18.668470 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 17:51:18.670226 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:51:18.686083 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 17:51:18.686246 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 17:51:18.695675 ignition[1047]: INFO : Ignition 2.21.0 Sep 12 17:51:18.695675 ignition[1047]: INFO : Stage: umount Sep 12 17:51:18.695675 ignition[1047]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:51:18.695675 ignition[1047]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/gcp" Sep 12 17:51:18.706791 ignition[1047]: INFO : umount: umount passed Sep 12 17:51:18.706791 ignition[1047]: INFO : Ignition finished successfully Sep 12 17:51:18.701276 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 17:51:18.701593 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 17:51:18.709632 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 17:51:18.710338 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 17:51:18.710477 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 17:51:18.712201 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 17:51:18.712322 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 17:51:18.718660 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 17:51:18.718740 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 17:51:18.724687 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 17:51:18.724775 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 17:51:18.730679 systemd[1]: Stopped target network.target - Network. Sep 12 17:51:18.734563 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 17:51:18.734665 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:51:18.738618 systemd[1]: Stopped target paths.target - Path Units. Sep 12 17:51:18.742568 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 17:51:18.746706 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:51:18.748827 systemd[1]: Stopped target slices.target - Slice Units. 
Sep 12 17:51:18.752776 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 17:51:18.756816 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 17:51:18.756996 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:51:18.760813 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 17:51:18.761013 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:51:18.764801 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 17:51:18.765001 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 17:51:18.768922 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 17:51:18.768996 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 17:51:18.772775 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 17:51:18.772998 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 17:51:18.777220 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 17:51:18.785998 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 17:51:18.795025 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 17:51:18.795325 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 17:51:18.802547 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 17:51:18.802922 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 17:51:18.803058 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 17:51:18.808400 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 17:51:18.809776 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 17:51:18.810059 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 17:51:18.810232 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:51:18.816110 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 17:51:18.816718 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 17:51:18.816934 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:51:18.817213 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 17:51:18.817425 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:51:18.839611 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 17:51:18.839810 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 17:51:18.853653 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 17:51:18.853760 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:51:18.857965 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:51:18.865175 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 17:51:18.865265 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:51:18.870001 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 17:51:18.870235 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:51:18.879048 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
Sep 12 17:51:18.879631 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 17:51:18.886728 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 17:51:18.886803 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:51:18.890824 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 17:51:18.890921 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:51:18.897905 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 17:51:18.898120 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 17:51:18.906571 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 17:51:18.906691 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:51:18.916949 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 17:51:18.928569 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 17:51:18.928678 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:51:18.931869 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 17:51:18.931959 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:51:18.938169 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 12 17:51:18.938404 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:51:18.951011 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 17:51:18.951095 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:51:18.956703 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:51:18.956803 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:51:18.964944 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 17:51:18.965021 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 12 17:51:18.965064 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 17:51:18.965111 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:51:18.965760 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 17:51:18.965947 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 17:51:18.969688 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 17:51:18.970053 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 17:51:18.976105 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 17:51:18.981074 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 17:51:19.003816 systemd[1]: Switching root. Sep 12 17:51:19.030224 systemd-journald[206]: Journal stopped Sep 12 17:51:21.043007 systemd-journald[206]: Received SIGTERM from PID 1 (systemd). 
Sep 12 17:51:21.043096 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 17:51:21.043122 kernel: SELinux: policy capability open_perms=1 Sep 12 17:51:21.043142 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 17:51:21.043162 kernel: SELinux: policy capability always_check_network=0 Sep 12 17:51:21.043182 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 17:51:21.043210 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 17:51:21.043230 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 17:51:21.043251 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 17:51:21.043272 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 17:51:21.043292 kernel: audit: type=1403 audit(1757699479.626:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 17:51:21.043315 systemd[1]: Successfully loaded SELinux policy in 71.130ms. Sep 12 17:51:21.043339 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 8.019ms. Sep 12 17:51:21.043377 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:51:21.043402 systemd[1]: Detected virtualization google. Sep 12 17:51:21.043426 systemd[1]: Detected architecture x86-64. Sep 12 17:51:21.043448 systemd[1]: Detected first boot. Sep 12 17:51:21.043495 systemd[1]: Initializing machine ID from random generator. Sep 12 17:51:21.043518 zram_generator::config[1090]: No configuration found. Sep 12 17:51:21.043541 kernel: Guest personality initialized and is inactive Sep 12 17:51:21.043561 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 17:51:21.043579 kernel: Initialized host personality Sep 12 17:51:21.043599 kernel: NET: Registered PF_VSOCK protocol family Sep 12 17:51:21.043621 systemd[1]: Populated /etc with preset unit settings. Sep 12 17:51:21.043643 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 17:51:21.043670 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 17:51:21.043692 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 17:51:21.043713 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 17:51:21.043735 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 17:51:21.043757 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 17:51:21.043780 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 17:51:21.043802 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 17:51:21.043828 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 17:51:21.043849 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 17:51:21.043870 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 17:51:21.043890 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 17:51:21.043911 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 12 17:51:21.043932 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:51:21.043953 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 17:51:21.043976 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 17:51:21.044005 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 17:51:21.044031 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:51:21.044053 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 17:51:21.044075 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:51:21.044099 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:51:21.044119 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 17:51:21.044142 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 17:51:21.044169 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 17:51:21.044193 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 17:51:21.044215 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:51:21.044238 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:51:21.044261 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:51:21.044284 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:51:21.044307 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 17:51:21.044330 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 17:51:21.044362 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 17:51:21.044391 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:51:21.044414 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:51:21.044437 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:51:21.044488 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 17:51:21.044515 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 17:51:21.044535 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 17:51:21.044555 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 17:51:21.044574 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:51:21.044595 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 17:51:21.044614 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 17:51:21.044634 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 17:51:21.044657 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 17:51:21.044680 systemd[1]: Reached target machines.target - Containers. Sep 12 17:51:21.044710 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Sep 12 17:51:21.044734 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:51:21.044757 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:51:21.044781 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 17:51:21.044806 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:51:21.044830 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:51:21.044854 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:51:21.044880 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 17:51:21.044910 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:51:21.044935 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 17:51:21.044960 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 17:51:21.044985 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 17:51:21.045009 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 17:51:21.045033 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 17:51:21.045059 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:51:21.045084 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:51:21.045112 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:51:21.045139 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:51:21.045232 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 17:51:21.045255 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 17:51:21.045278 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:51:21.045300 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 17:51:21.045322 systemd[1]: Stopped verity-setup.service. Sep 12 17:51:21.045343 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:51:21.045379 kernel: loop: module loaded Sep 12 17:51:21.045401 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 17:51:21.045422 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 17:51:21.045444 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 17:51:21.045542 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 17:51:21.045567 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 17:51:21.045588 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 17:51:21.045610 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:51:21.045632 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 17:51:21.045660 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. 
Sep 12 17:51:21.045742 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:51:21.045766 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:51:21.045786 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:51:21.045805 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:51:21.045826 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:51:21.045847 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:51:21.045869 kernel: fuse: init (API version 7.41) Sep 12 17:51:21.045890 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:51:21.045917 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:51:21.045938 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 17:51:21.045958 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:51:21.045980 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 17:51:21.046001 kernel: ACPI: bus type drm_connector registered Sep 12 17:51:21.046020 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 17:51:21.046042 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:51:21.046076 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 17:51:21.046139 systemd-journald[1161]: Collecting audit messages is disabled. Sep 12 17:51:21.046187 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 17:51:21.046215 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:51:21.046239 systemd-journald[1161]: Journal started Sep 12 17:51:21.046281 systemd-journald[1161]: Runtime Journal (/run/log/journal/55521ac0132148a9be6b443ef29e6889) is 8M, max 148.9M, 140.9M free. Sep 12 17:51:20.484609 systemd[1]: Queued start job for default target multi-user.target. Sep 12 17:51:20.510417 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 12 17:51:20.511139 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 17:51:21.049547 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 17:51:21.054513 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:51:21.061079 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 17:51:21.072514 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:51:21.072606 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:51:21.080628 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 17:51:21.098957 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 17:51:21.106488 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:51:21.118664 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:51:21.119139 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. 
Sep 12 17:51:21.123031 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 17:51:21.125718 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 17:51:21.129180 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 17:51:21.132994 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 17:51:21.135994 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 17:51:21.167086 kernel: loop0: detected capacity change from 0 to 128016 Sep 12 17:51:21.183826 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 17:51:21.202329 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 17:51:21.211348 systemd-tmpfiles[1189]: ACLs are not supported, ignoring. Sep 12 17:51:21.211386 systemd-tmpfiles[1189]: ACLs are not supported, ignoring. Sep 12 17:51:21.211817 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 17:51:21.213979 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 17:51:21.268218 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 17:51:21.274687 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 17:51:21.286662 systemd-journald[1161]: Time spent on flushing to /var/log/journal/55521ac0132148a9be6b443ef29e6889 is 131.550ms for 968 entries. Sep 12 17:51:21.286662 systemd-journald[1161]: System Journal (/var/log/journal/55521ac0132148a9be6b443ef29e6889) is 8M, max 584.8M, 576.8M free. Sep 12 17:51:21.455115 systemd-journald[1161]: Received client request to flush runtime journal. Sep 12 17:51:21.455221 kernel: loop1: detected capacity change from 0 to 221472 Sep 12 17:51:21.455266 kernel: loop2: detected capacity change from 0 to 111000 Sep 12 17:51:21.289357 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 17:51:21.293296 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:51:21.330168 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 17:51:21.453274 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 17:51:21.461117 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:51:21.463303 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:51:21.466796 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 17:51:21.514863 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 17:51:21.521821 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 17:51:21.526827 systemd-tmpfiles[1228]: ACLs are not supported, ignoring. Sep 12 17:51:21.526859 systemd-tmpfiles[1228]: ACLs are not supported, ignoring. Sep 12 17:51:21.534507 kernel: loop3: detected capacity change from 0 to 52056 Sep 12 17:51:21.536668 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:51:21.545504 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. 
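Editor's note: for scale, the journal flush statistics above (131.550 ms for 968 entries) work out to roughly 0.14 ms per entry; a quick check with the numbers taken from the journald message:

```python
# Quick arithmetic check on the journald flush statistics logged above.
flush_ms = 131.550   # "Time spent on flushing ... is 131.550ms"
entries = 968        # "... for 968 entries"
print(f"{flush_ms / entries:.3f} ms per entry")  # ~0.136 ms per entry
```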
Sep 12 17:51:21.626519 kernel: loop4: detected capacity change from 0 to 128016 Sep 12 17:51:21.658591 kernel: loop5: detected capacity change from 0 to 221472 Sep 12 17:51:21.687492 kernel: loop6: detected capacity change from 0 to 111000 Sep 12 17:51:21.740497 kernel: loop7: detected capacity change from 0 to 52056 Sep 12 17:51:21.766410 (sd-merge)[1238]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-gce'. Sep 12 17:51:21.768274 (sd-merge)[1238]: Merged extensions into '/usr'. Sep 12 17:51:21.782938 systemd[1]: Reload requested from client PID 1186 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 17:51:21.783178 systemd[1]: Reloading... Sep 12 17:51:21.903040 zram_generator::config[1260]: No configuration found. Sep 12 17:51:22.327484 ldconfig[1181]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 17:51:22.455854 systemd[1]: Reloading finished in 671 ms. Sep 12 17:51:22.471430 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 17:51:22.476482 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 17:51:22.492661 systemd[1]: Starting ensure-sysext.service... Sep 12 17:51:22.497357 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:51:22.534874 systemd[1]: Reload requested from client PID 1304 ('systemctl') (unit ensure-sysext.service)... Sep 12 17:51:22.534905 systemd[1]: Reloading... Sep 12 17:51:22.543882 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 17:51:22.543945 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 17:51:22.544973 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 17:51:22.545664 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 17:51:22.549323 systemd-tmpfiles[1305]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 17:51:22.549955 systemd-tmpfiles[1305]: ACLs are not supported, ignoring. Sep 12 17:51:22.550088 systemd-tmpfiles[1305]: ACLs are not supported, ignoring. Sep 12 17:51:22.567283 systemd-tmpfiles[1305]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:51:22.567305 systemd-tmpfiles[1305]: Skipping /boot Sep 12 17:51:22.603346 systemd-tmpfiles[1305]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:51:22.606763 systemd-tmpfiles[1305]: Skipping /boot Sep 12 17:51:22.613493 zram_generator::config[1331]: No configuration found. Sep 12 17:51:22.932486 systemd[1]: Reloading finished in 396 ms. Sep 12 17:51:22.958076 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 17:51:22.978504 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:51:22.991269 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:51:22.997810 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 17:51:23.002579 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 17:51:23.010897 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
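Editor's note: the "(sd-merge)" lines above are systemd-sysext merging the extension images (containerd-flatcar, docker-flatcar, kubernetes, oem-gce) into /usr and /opt, which is why the loop devices appear just before. As a rough, hypothetical illustration of where such images are picked up from (standard sysext search directories, among others; the actual merge uses overlay mounts and is more involved), one could enumerate candidates like this:

```python
# Hypothetical sketch: list sysext images in common search directories that
# systemd-sysext considers when merging extensions into /usr and /opt.
from pathlib import Path

SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

for d in SEARCH_DIRS:
    p = Path(d)
    if not p.is_dir():
        continue
    for entry in sorted(p.iterdir()):
        # Raw disk images and plain directories can both serve as extensions.
        if entry.suffix == ".raw" or entry.is_dir():
            print(f"{d}: {entry.name}")
```

On this boot, /etc/extensions/kubernetes.raw is the symlink written by Ignition earlier in the log.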
Sep 12 17:51:23.017885 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:51:23.025515 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 17:51:23.036305 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:51:23.036692 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:51:23.041956 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:51:23.047661 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:51:23.054589 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:51:23.057769 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:51:23.058009 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:51:23.062529 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 17:51:23.065576 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:51:23.075359 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:51:23.075801 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:51:23.076112 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:51:23.076275 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:51:23.076446 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:51:23.088583 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:51:23.089023 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:51:23.097073 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:51:23.109197 systemd[1]: Starting setup-oem.service - Setup OEM... Sep 12 17:51:23.111758 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:51:23.112778 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:51:23.113144 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:51:23.114739 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Sep 12 17:51:23.118285 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:51:23.119411 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:51:23.138982 systemd[1]: Finished ensure-sysext.service. Sep 12 17:51:23.142092 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:51:23.143560 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:51:23.147185 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:51:23.148423 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:51:23.152180 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:51:23.153391 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:51:23.177029 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:51:23.177301 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:51:23.179601 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 17:51:23.185378 systemd-udevd[1378]: Using default interface naming scheme 'v255'. Sep 12 17:51:23.249147 systemd[1]: Finished setup-oem.service - Setup OEM. Sep 12 17:51:23.255996 systemd[1]: Starting oem-gce-enable-oslogin.service - Enable GCE OS Login... Sep 12 17:51:23.258799 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 17:51:23.262725 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 17:51:23.276165 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 17:51:23.278301 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 17:51:23.282903 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:51:23.316825 augenrules[1425]: No rules Sep 12 17:51:23.322077 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:51:23.322468 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:51:23.325828 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:51:23.335386 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:51:23.360623 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 17:51:23.383808 systemd[1]: Finished oem-gce-enable-oslogin.service - Enable GCE OS Login. Sep 12 17:51:23.459209 systemd-resolved[1377]: Positive Trust Anchors: Sep 12 17:51:23.459243 systemd-resolved[1377]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:51:23.459303 systemd-resolved[1377]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:51:23.478783 systemd-resolved[1377]: Defaulting to hostname 'linux'. Sep 12 17:51:23.483531 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:51:23.486727 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:51:23.489651 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:51:23.492775 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 17:51:23.495691 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:51:23.498640 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 17:51:23.500515 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:51:23.503748 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:51:23.506591 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:51:23.509601 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:51:23.509665 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:51:23.512576 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:51:23.517878 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:51:23.523234 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:51:23.531514 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 12 17:51:23.534836 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 17:51:23.538558 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 17:51:23.553697 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:51:23.557338 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 17:51:23.561232 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:51:23.575137 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:51:23.576618 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:51:23.579736 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:51:23.579784 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:51:23.584800 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 17:51:23.591086 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Sep 12 17:51:23.597848 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 17:51:23.608368 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:51:23.614721 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:51:23.617613 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:51:23.623251 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 17:51:23.633781 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:51:23.649906 systemd[1]: Started ntpd.service - Network Time Service. Sep 12 17:51:23.656136 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:51:23.666917 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:51:23.703832 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 17:51:23.730689 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 17:51:23.736188 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionSecurity=!tpm2). Sep 12 17:51:23.737817 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 12 17:51:23.738551 jq[1478]: false Sep 12 17:51:23.748854 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 17:51:23.768644 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:51:23.772174 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 17:51:23.777618 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:51:23.777983 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:51:23.793067 jq[1493]: true Sep 12 17:51:23.800654 systemd-networkd[1442]: lo: Link UP Sep 12 17:51:23.800672 systemd-networkd[1442]: lo: Gained carrier Sep 12 17:51:23.801944 systemd-networkd[1442]: Enumeration completed Sep 12 17:51:23.802520 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:51:23.812434 systemd[1]: Reached target network.target - Network. Sep 12 17:51:23.814233 extend-filesystems[1479]: Found /dev/sda6 Sep 12 17:51:23.818734 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:51:23.827556 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Refreshing passwd entry cache Sep 12 17:51:23.827578 oslogin_cache_refresh[1480]: Refreshing passwd entry cache Sep 12 17:51:23.834328 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 17:51:23.840780 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:51:23.844414 systemd[1]: Condition check resulted in dev-tpmrm0.device - /dev/tpmrm0 being skipped. Sep 12 17:51:23.845530 systemd[1]: Reached target tpm2.target - Trusted Platform Module. 
Sep 12 17:51:23.860233 extend-filesystems[1479]: Found /dev/sda9 Sep 12 17:51:23.885577 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Failure getting users, quitting Sep 12 17:51:23.885577 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:51:23.885577 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Refreshing group entry cache Sep 12 17:51:23.883165 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:51:23.870237 oslogin_cache_refresh[1480]: Failure getting users, quitting Sep 12 17:51:23.883978 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:51:23.870265 oslogin_cache_refresh[1480]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:51:23.870335 oslogin_cache_refresh[1480]: Refreshing group entry cache Sep 12 17:51:23.888689 extend-filesystems[1479]: Checking size of /dev/sda9 Sep 12 17:51:23.899496 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Failure getting groups, quitting Sep 12 17:51:23.899496 google_oslogin_nss_cache[1480]: oslogin_cache_refresh[1480]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:51:23.893953 oslogin_cache_refresh[1480]: Failure getting groups, quitting Sep 12 17:51:23.893974 oslogin_cache_refresh[1480]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:51:23.929090 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 17:51:23.929450 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 17:51:23.958176 jq[1498]: true Sep 12 17:51:23.974528 update_engine[1490]: I20250912 17:51:23.973716 1490 main.cc:92] Flatcar Update Engine starting Sep 12 17:51:23.998493 extend-filesystems[1479]: Resized partition /dev/sda9 Sep 12 17:51:24.022413 extend-filesystems[1527]: resize2fs 1.47.2 (1-Jan-2025) Sep 12 17:51:24.057750 tar[1494]: linux-amd64/helm Sep 12 17:51:24.071099 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 17:51:24.087143 (ntainerd)[1529]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:51:24.114647 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 2538491 blocks Sep 12 17:51:24.115824 coreos-metadata[1474]: Sep 12 17:51:24.115 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/hostname: Attempt #1 Sep 12 17:51:24.122937 coreos-metadata[1474]: Sep 12 17:51:24.117 INFO Failed to fetch: error sending request for url (http://169.254.169.254/computeMetadata/v1/instance/hostname) Sep 12 17:51:24.140630 kernel: EXT4-fs (sda9): resized filesystem to 2538491 Sep 12 17:51:24.171231 extend-filesystems[1527]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 17:51:24.171231 extend-filesystems[1527]: old_desc_blocks = 1, new_desc_blocks = 2 Sep 12 17:51:24.171231 extend-filesystems[1527]: The filesystem on /dev/sda9 is now 2538491 (4k) blocks long. Sep 12 17:51:24.184504 extend-filesystems[1479]: Resized filesystem in /dev/sda9 Sep 12 17:51:24.173426 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:51:24.173802 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 17:51:24.189667 systemd[1]: motdgen.service: Deactivated successfully. 
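Editor's note: the online resize above grows the root filesystem from 1617920 to 2538491 blocks of 4 KiB, i.e. from roughly 6.2 GiB to roughly 9.7 GiB. A quick check, with the block counts taken from the resize2fs/kernel messages:

```python
# Quick check of the ext4 online resize logged above.
BLOCK_SIZE = 4096  # "(4k) blocks"
old_blocks, new_blocks = 1617920, 2538491

for label, blocks in (("before", old_blocks), ("after", new_blocks)):
    print(f"{label}: {blocks * BLOCK_SIZE / 2**30:.2f} GiB")
# before: 6.17 GiB
# after:  9.68 GiB
```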
Sep 12 17:51:24.190018 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 17:51:24.258821 bash[1548]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:51:24.258522 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:51:24.270725 systemd[1]: Starting sshkeys.service... Sep 12 17:51:24.278171 dbus-daemon[1475]: [system] SELinux support is enabled Sep 12 17:51:24.278417 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 17:51:24.287757 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 17:51:24.287801 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 17:51:24.290840 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:51:24.290874 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:51:24.311165 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:51:24.316876 update_engine[1490]: I20250912 17:51:24.314440 1490 update_check_scheduler.cc:74] Next update check in 5m16s Sep 12 17:51:24.347930 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 17:51:24.384885 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 17:51:24.396048 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 17:51:24.420837 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:51:24.420851 systemd-networkd[1442]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:51:24.453518 systemd-networkd[1442]: eth0: Link UP Sep 12 17:51:24.460516 systemd-networkd[1442]: eth0: Gained carrier Sep 12 17:51:24.460565 systemd-networkd[1442]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:51:24.480146 systemd-networkd[1442]: eth0: DHCPv4 address 10.128.0.19/32, gateway 10.128.0.1 acquired from 169.254.169.254 Sep 12 17:51:24.485131 dbus-daemon[1475]: [system] Activating via systemd: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.0' (uid=244 pid=1442 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Sep 12 17:51:24.495584 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... Sep 12 17:51:24.502644 ntpd[1482]: ntpd 4.2.8p17@1.4004-o Fri Sep 12 14:59:08 UTC 2025 (1): Starting Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: ntpd 4.2.8p17@1.4004-o Fri Sep 12 14:59:08 UTC 2025 (1): Starting Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: ---------------------------------------------------- Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: ntp-4 is maintained by Network Time Foundation, Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Inc. 
(NTF), a non-profit 501(c)(3) public-benefit Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: corporation. Support and training for ntp-4 are Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: available at https://www.nwtime.org/support Sep 12 17:51:24.505074 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: ---------------------------------------------------- Sep 12 17:51:24.502682 ntpd[1482]: Command line: /usr/sbin/ntpd -g -n -u ntp:ntp Sep 12 17:51:24.502697 ntpd[1482]: ---------------------------------------------------- Sep 12 17:51:24.502710 ntpd[1482]: ntp-4 is maintained by Network Time Foundation, Sep 12 17:51:24.502725 ntpd[1482]: Inc. (NTF), a non-profit 501(c)(3) public-benefit Sep 12 17:51:24.502760 ntpd[1482]: corporation. Support and training for ntp-4 are Sep 12 17:51:24.502774 ntpd[1482]: available at https://www.nwtime.org/support Sep 12 17:51:24.502787 ntpd[1482]: ---------------------------------------------------- Sep 12 17:51:24.516971 ntpd[1482]: proto: precision = 0.113 usec (-23) Sep 12 17:51:24.517594 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: proto: precision = 0.113 usec (-23) Sep 12 17:51:24.520758 ntpd[1482]: basedate set to 2025-08-31 Sep 12 17:51:24.522675 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: basedate set to 2025-08-31 Sep 12 17:51:24.522675 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: gps base set to 2025-08-31 (week 2382) Sep 12 17:51:24.520794 ntpd[1482]: gps base set to 2025-08-31 (week 2382) Sep 12 17:51:24.537019 ntpd[1482]: Listen and drop on 0 v6wildcard [::]:123 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Listen and drop on 0 v6wildcard [::]:123 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Listen normally on 2 lo 127.0.0.1:123 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Listen normally on 3 eth0 10.128.0.19:123 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Listen normally on 4 lo [::1]:123 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: bind(21) AF_INET6 fe80::4001:aff:fe80:13%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: unable to create socket on eth0 (5) for fe80::4001:aff:fe80:13%2#123 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: failed to init interface for address fe80::4001:aff:fe80:13%2 Sep 12 17:51:24.541869 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: Listening on routing socket on fd #21 for interface updates Sep 12 17:51:24.540426 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. 
Sep 12 17:51:24.537300 ntpd[1482]: Listen and drop on 1 v4wildcard 0.0.0.0:123 Sep 12 17:51:24.541105 ntpd[1482]: Listen normally on 2 lo 127.0.0.1:123 Sep 12 17:51:24.541170 ntpd[1482]: Listen normally on 3 eth0 10.128.0.19:123 Sep 12 17:51:24.541230 ntpd[1482]: Listen normally on 4 lo [::1]:123 Sep 12 17:51:24.541297 ntpd[1482]: bind(21) AF_INET6 fe80::4001:aff:fe80:13%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:51:24.541328 ntpd[1482]: unable to create socket on eth0 (5) for fe80::4001:aff:fe80:13%2#123 Sep 12 17:51:24.541350 ntpd[1482]: failed to init interface for address fe80::4001:aff:fe80:13%2 Sep 12 17:51:24.541396 ntpd[1482]: Listening on routing socket on fd #21 for interface updates Sep 12 17:51:24.560499 coreos-metadata[1561]: Sep 12 17:51:24.556 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/sshKeys: Attempt #1 Sep 12 17:51:24.564499 coreos-metadata[1561]: Sep 12 17:51:24.561 INFO Fetch failed with 404: resource not found Sep 12 17:51:24.564499 coreos-metadata[1561]: Sep 12 17:51:24.561 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/ssh-keys: Attempt #1 Sep 12 17:51:24.564499 coreos-metadata[1561]: Sep 12 17:51:24.562 INFO Fetch successful Sep 12 17:51:24.564499 coreos-metadata[1561]: Sep 12 17:51:24.562 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/attributes/block-project-ssh-keys: Attempt #1 Sep 12 17:51:24.565055 coreos-metadata[1561]: Sep 12 17:51:24.564 INFO Fetch failed with 404: resource not found Sep 12 17:51:24.565055 coreos-metadata[1561]: Sep 12 17:51:24.564 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/sshKeys: Attempt #1 Sep 12 17:51:24.578492 coreos-metadata[1561]: Sep 12 17:51:24.574 INFO Fetch failed with 404: resource not found Sep 12 17:51:24.578492 coreos-metadata[1561]: Sep 12 17:51:24.574 INFO Fetching http://169.254.169.254/computeMetadata/v1/project/attributes/ssh-keys: Attempt #1 Sep 12 17:51:24.578492 coreos-metadata[1561]: Sep 12 17:51:24.575 INFO Fetch successful Sep 12 17:51:24.582873 ntpd[1482]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:51:24.586812 unknown[1561]: wrote ssh authorized keys file for user: core Sep 12 17:51:24.589614 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:51:24.589614 ntpd[1482]: 12 Sep 17:51:24 ntpd[1482]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:51:24.583249 ntpd[1482]: kernel reports TIME_ERROR: 0x41: Clock Unsynchronized Sep 12 17:51:24.610407 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 12 17:51:24.622605 kernel: ACPI: button: Power Button [PWRF] Sep 12 17:51:24.622717 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSLPBN:00/input/input5 Sep 12 17:51:24.630538 kernel: ACPI: button: Sleep Button [SLPF] Sep 12 17:51:24.676489 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:51:24.727493 update-ssh-keys[1574]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:51:24.728883 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 17:51:24.741569 systemd[1]: Finished sshkeys.service. 
Sep 12 17:51:24.770928 locksmithd[1552]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:51:24.823073 kernel: piix4_smbus 0000:00:01.3: SMBus base address uninitialized - upgrade BIOS or use force_addr=0xaddr Sep 12 17:51:24.822853 dbus-daemon[1475]: [system] Successfully activated service 'org.freedesktop.hostname1' Sep 12 17:51:24.819328 systemd[1]: Started systemd-hostnamed.service - Hostname Service. Sep 12 17:51:24.835297 dbus-daemon[1475]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1569 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Sep 12 17:51:24.843968 systemd[1]: Starting polkit.service - Authorization Manager... Sep 12 17:51:24.891671 systemd-logind[1487]: New seat seat0. Sep 12 17:51:24.896414 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 17:51:24.923858 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - PersistentDisk OEM. Sep 12 17:51:24.933971 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:51:24.995501 kernel: EDAC MC: Ver: 3.0.0 Sep 12 17:51:25.055677 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:51:25.106910 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:51:25.120942 coreos-metadata[1474]: Sep 12 17:51:25.120 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/hostname: Attempt #2 Sep 12 17:51:25.124491 coreos-metadata[1474]: Sep 12 17:51:25.122 INFO Fetch successful Sep 12 17:51:25.124491 coreos-metadata[1474]: Sep 12 17:51:25.123 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/access-configs/0/external-ip: Attempt #1 Sep 12 17:51:25.127123 coreos-metadata[1474]: Sep 12 17:51:25.126 INFO Fetch successful Sep 12 17:51:25.127123 coreos-metadata[1474]: Sep 12 17:51:25.126 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/network-interfaces/0/ip: Attempt #1 Sep 12 17:51:25.128924 coreos-metadata[1474]: Sep 12 17:51:25.128 INFO Fetch successful Sep 12 17:51:25.129082 coreos-metadata[1474]: Sep 12 17:51:25.128 INFO Fetching http://169.254.169.254/computeMetadata/v1/instance/machine-type: Attempt #1 Sep 12 17:51:25.131830 coreos-metadata[1474]: Sep 12 17:51:25.130 INFO Fetch successful Sep 12 17:51:25.145364 containerd[1529]: time="2025-09-12T17:51:25Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 17:51:25.193783 systemd-logind[1487]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 17:51:25.213089 containerd[1529]: time="2025-09-12T17:51:25.213023581Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 17:51:25.225511 systemd-logind[1487]: Watching system buttons on /dev/input/event3 (Sleep Button) Sep 12 17:51:25.229170 systemd-logind[1487]: Watching system buttons on /dev/input/event2 (Power Button) Sep 12 17:51:25.239495 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 17:51:25.243397 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Sep 12 17:51:25.327820 containerd[1529]: time="2025-09-12T17:51:25.327688973Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="16.056µs" Sep 12 17:51:25.327820 containerd[1529]: time="2025-09-12T17:51:25.327748572Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 17:51:25.327820 containerd[1529]: time="2025-09-12T17:51:25.327777364Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 17:51:25.328092 containerd[1529]: time="2025-09-12T17:51:25.328007372Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 17:51:25.328092 containerd[1529]: time="2025-09-12T17:51:25.328036783Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 17:51:25.328211 containerd[1529]: time="2025-09-12T17:51:25.328114304Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:51:25.328265 containerd[1529]: time="2025-09-12T17:51:25.328214914Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:51:25.328265 containerd[1529]: time="2025-09-12T17:51:25.328235385Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:51:25.333022 containerd[1529]: time="2025-09-12T17:51:25.332394796Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:51:25.333022 containerd[1529]: time="2025-09-12T17:51:25.332444790Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:51:25.333022 containerd[1529]: time="2025-09-12T17:51:25.332507190Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:51:25.333022 containerd[1529]: time="2025-09-12T17:51:25.332525486Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 17:51:25.337088 containerd[1529]: time="2025-09-12T17:51:25.335957019Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 17:51:25.337088 containerd[1529]: time="2025-09-12T17:51:25.336386791Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:51:25.337088 containerd[1529]: time="2025-09-12T17:51:25.336445032Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:51:25.337088 containerd[1529]: time="2025-09-12T17:51:25.336502566Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 17:51:25.337088 containerd[1529]: time="2025-09-12T17:51:25.336789051Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 17:51:25.339160 containerd[1529]: 
time="2025-09-12T17:51:25.338873843Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 17:51:25.339160 containerd[1529]: time="2025-09-12T17:51:25.339015895Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349421901Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349562748Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349592194Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349660939Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349681987Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349698747Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349721656Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349741595Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349761855Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349780005Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349796889Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349818894Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.349996536Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 17:51:25.350091 containerd[1529]: time="2025-09-12T17:51:25.350025551Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350061506Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350082830Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350099782Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350118887Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: 
time="2025-09-12T17:51:25.350135841Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350151050Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350175956Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350194960Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350211737Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350303999Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 17:51:25.350733 containerd[1529]: time="2025-09-12T17:51:25.350324768Z" level=info msg="Start snapshots syncer" Sep 12 17:51:25.353812 containerd[1529]: time="2025-09-12T17:51:25.352888085Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 17:51:25.354037 containerd[1529]: time="2025-09-12T17:51:25.353973897Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 17:51:25.354275 containerd[1529]: time="2025-09-12T17:51:25.354071562Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 17:51:25.359370 containerd[1529]: time="2025-09-12T17:51:25.359262367Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 
Sep 12 17:51:25.359994 containerd[1529]: time="2025-09-12T17:51:25.359779066Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 17:51:25.359994 containerd[1529]: time="2025-09-12T17:51:25.359831952Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 17:51:25.359994 containerd[1529]: time="2025-09-12T17:51:25.359854029Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 17:51:25.359994 containerd[1529]: time="2025-09-12T17:51:25.359872697Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 17:51:25.359994 containerd[1529]: time="2025-09-12T17:51:25.359895591Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 17:51:25.359994 containerd[1529]: time="2025-09-12T17:51:25.359930040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 17:51:25.359994 containerd[1529]: time="2025-09-12T17:51:25.359959697Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 17:51:25.360318 containerd[1529]: time="2025-09-12T17:51:25.360006268Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 17:51:25.360318 containerd[1529]: time="2025-09-12T17:51:25.360027044Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 17:51:25.369505 containerd[1529]: time="2025-09-12T17:51:25.369397860Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 17:51:25.369814 containerd[1529]: time="2025-09-12T17:51:25.369774101Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:51:25.370258 containerd[1529]: time="2025-09-12T17:51:25.370213888Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:51:25.370631 containerd[1529]: time="2025-09-12T17:51:25.370597425Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:51:25.370772 containerd[1529]: time="2025-09-12T17:51:25.370736624Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371475381Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371525932Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371555576Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371590140Z" level=info msg="runtime interface created" Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371600960Z" level=info msg="created NRI interface" Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371622434Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 
Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371657654Z" level=info msg="Connect containerd service" Sep 12 17:51:25.372395 containerd[1529]: time="2025-09-12T17:51:25.371728296Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:51:25.381634 containerd[1529]: time="2025-09-12T17:51:25.381005368Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:51:25.503288 ntpd[1482]: bind(24) AF_INET6 fe80::4001:aff:fe80:13%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:51:25.505009 ntpd[1482]: 12 Sep 17:51:25 ntpd[1482]: bind(24) AF_INET6 fe80::4001:aff:fe80:13%2#123 flags 0x11 failed: Cannot assign requested address Sep 12 17:51:25.505009 ntpd[1482]: 12 Sep 17:51:25 ntpd[1482]: unable to create socket on eth0 (6) for fe80::4001:aff:fe80:13%2#123 Sep 12 17:51:25.505009 ntpd[1482]: 12 Sep 17:51:25 ntpd[1482]: failed to init interface for address fe80::4001:aff:fe80:13%2 Sep 12 17:51:25.503339 ntpd[1482]: unable to create socket on eth0 (6) for fe80::4001:aff:fe80:13%2#123 Sep 12 17:51:25.503361 ntpd[1482]: failed to init interface for address fe80::4001:aff:fe80:13%2 Sep 12 17:51:25.522663 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:51:25.851280 containerd[1529]: time="2025-09-12T17:51:25.851229375Z" level=info msg="Start subscribing containerd event" Sep 12 17:51:25.851878 containerd[1529]: time="2025-09-12T17:51:25.851816028Z" level=info msg="Start recovering state" Sep 12 17:51:25.852103 containerd[1529]: time="2025-09-12T17:51:25.852082346Z" level=info msg="Start event monitor" Sep 12 17:51:25.854889 containerd[1529]: time="2025-09-12T17:51:25.854639173Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:51:25.854889 containerd[1529]: time="2025-09-12T17:51:25.854671367Z" level=info msg="Start streaming server" Sep 12 17:51:25.854889 containerd[1529]: time="2025-09-12T17:51:25.854687208Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 17:51:25.854889 containerd[1529]: time="2025-09-12T17:51:25.854698994Z" level=info msg="runtime interface starting up..." Sep 12 17:51:25.854889 containerd[1529]: time="2025-09-12T17:51:25.854708904Z" level=info msg="starting plugins..." Sep 12 17:51:25.854889 containerd[1529]: time="2025-09-12T17:51:25.854738012Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 17:51:25.862481 containerd[1529]: time="2025-09-12T17:51:25.859603332Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:51:25.862481 containerd[1529]: time="2025-09-12T17:51:25.860051579Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 17:51:25.860482 systemd-networkd[1442]: eth0: Gained IPv6LL Sep 12 17:51:25.866926 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:51:25.867325 containerd[1529]: time="2025-09-12T17:51:25.866780545Z" level=info msg="containerd successfully booted in 0.730753s" Sep 12 17:51:25.880793 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:51:25.892819 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:51:25.909944 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 12 17:51:25.921223 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:51:25.925181 polkitd[1578]: Started polkitd version 126 Sep 12 17:51:25.927859 sshd_keygen[1518]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:51:25.936951 systemd[1]: Starting oem-gce.service - GCE Linux Agent... Sep 12 17:51:25.962718 polkitd[1578]: Loading rules from directory /etc/polkit-1/rules.d Sep 12 17:51:25.963484 polkitd[1578]: Loading rules from directory /run/polkit-1/rules.d Sep 12 17:51:25.963563 polkitd[1578]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 12 17:51:25.964176 polkitd[1578]: Loading rules from directory /usr/local/share/polkit-1/rules.d Sep 12 17:51:25.964214 polkitd[1578]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) Sep 12 17:51:25.964273 polkitd[1578]: Loading rules from directory /usr/share/polkit-1/rules.d Sep 12 17:51:25.969093 init.sh[1628]: + '[' -e /etc/default/instance_configs.cfg.template ']' Sep 12 17:51:25.969093 init.sh[1628]: + echo -e '[InstanceSetup]\nset_host_keys = false' Sep 12 17:51:25.974287 init.sh[1628]: + /usr/bin/google_instance_setup Sep 12 17:51:25.973878 systemd[1]: Started polkit.service - Authorization Manager. Sep 12 17:51:25.973449 polkitd[1578]: Finished loading, compiling and executing 2 rules Sep 12 17:51:25.976734 dbus-daemon[1475]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Sep 12 17:51:25.978550 polkitd[1578]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Sep 12 17:51:26.011098 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 17:51:26.025681 tar[1494]: linux-amd64/LICENSE Sep 12 17:51:26.026141 tar[1494]: linux-amd64/README.md Sep 12 17:51:26.029622 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 17:51:26.048130 systemd-hostnamed[1569]: Hostname set to (transient) Sep 12 17:51:26.050761 systemd-resolved[1377]: System hostname changed to 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal'. Sep 12 17:51:26.062746 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:51:26.074717 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:51:26.085388 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:51:26.085840 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 17:51:26.102949 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:51:26.131255 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:51:26.146799 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:51:26.158436 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 17:51:26.167991 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 17:51:26.508180 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:51:26.520836 systemd[1]: Started sshd@0-10.128.0.19:22-139.178.68.195:45260.service - OpenSSH per-connection server daemon (139.178.68.195:45260). Sep 12 17:51:26.645147 instance-setup[1637]: INFO Running google_set_multiqueue. Sep 12 17:51:26.664742 instance-setup[1637]: INFO Set channels for eth0 to 2. Sep 12 17:51:26.670882 instance-setup[1637]: INFO Setting /proc/irq/31/smp_affinity_list to 0 for device virtio1. 
Sep 12 17:51:26.673249 instance-setup[1637]: INFO /proc/irq/31/smp_affinity_list: real affinity 0 Sep 12 17:51:26.673569 instance-setup[1637]: INFO Setting /proc/irq/32/smp_affinity_list to 0 for device virtio1. Sep 12 17:51:26.675935 instance-setup[1637]: INFO /proc/irq/32/smp_affinity_list: real affinity 0 Sep 12 17:51:26.676012 instance-setup[1637]: INFO Setting /proc/irq/33/smp_affinity_list to 1 for device virtio1. Sep 12 17:51:26.677530 instance-setup[1637]: INFO /proc/irq/33/smp_affinity_list: real affinity 1 Sep 12 17:51:26.678917 instance-setup[1637]: INFO Setting /proc/irq/34/smp_affinity_list to 1 for device virtio1. Sep 12 17:51:26.680878 instance-setup[1637]: INFO /proc/irq/34/smp_affinity_list: real affinity 1 Sep 12 17:51:26.690048 instance-setup[1637]: INFO /usr/sbin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Sep 12 17:51:26.696662 instance-setup[1637]: INFO /usr/sbin/google_set_multiqueue: line 133: echo: write error: Value too large for defined data type Sep 12 17:51:26.698864 instance-setup[1637]: INFO Queue 0 XPS=1 for /sys/class/net/eth0/queues/tx-0/xps_cpus Sep 12 17:51:26.699565 instance-setup[1637]: INFO Queue 1 XPS=2 for /sys/class/net/eth0/queues/tx-1/xps_cpus Sep 12 17:51:26.726215 init.sh[1628]: + /usr/bin/google_metadata_script_runner --script-type startup Sep 12 17:51:26.904938 startup-script[1695]: INFO Starting startup scripts. Sep 12 17:51:26.912172 startup-script[1695]: INFO No startup scripts found in metadata. Sep 12 17:51:26.912251 startup-script[1695]: INFO Finished running startup scripts. Sep 12 17:51:26.942105 init.sh[1628]: + trap 'stopping=1 ; kill "${daemon_pids[@]}" || :' SIGTERM Sep 12 17:51:26.943495 init.sh[1628]: + daemon_pids=() Sep 12 17:51:26.943495 init.sh[1628]: + for d in accounts clock_skew network Sep 12 17:51:26.943495 init.sh[1628]: + daemon_pids+=($!) Sep 12 17:51:26.943495 init.sh[1628]: + for d in accounts clock_skew network Sep 12 17:51:26.943495 init.sh[1628]: + daemon_pids+=($!) Sep 12 17:51:26.943495 init.sh[1628]: + for d in accounts clock_skew network Sep 12 17:51:26.943813 init.sh[1699]: + /usr/bin/google_clock_skew_daemon Sep 12 17:51:26.944134 init.sh[1628]: + daemon_pids+=($!) Sep 12 17:51:26.944134 init.sh[1628]: + NOTIFY_SOCKET=/run/systemd/notify Sep 12 17:51:26.944134 init.sh[1628]: + /usr/bin/systemd-notify --ready Sep 12 17:51:26.944531 init.sh[1698]: + /usr/bin/google_accounts_daemon Sep 12 17:51:26.945708 init.sh[1700]: + /usr/bin/google_network_daemon Sep 12 17:51:26.971587 systemd[1]: Started oem-gce.service - GCE Linux Agent. Sep 12 17:51:26.984483 init.sh[1628]: + wait -n 1698 1699 1700 Sep 12 17:51:27.010513 sshd[1664]: Accepted publickey for core from 139.178.68.195 port 45260 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:51:27.013879 sshd-session[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:51:27.033239 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 17:51:27.047852 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:51:27.095545 systemd-logind[1487]: New session 1 of user core. Sep 12 17:51:27.114193 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:51:27.133930 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 12 17:51:27.171424 (systemd)[1704]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:51:27.179403 systemd-logind[1487]: New session c1 of user core. Sep 12 17:51:27.588424 systemd[1704]: Queued start job for default target default.target. Sep 12 17:51:27.594185 systemd[1704]: Created slice app.slice - User Application Slice. Sep 12 17:51:27.594244 systemd[1704]: Reached target paths.target - Paths. Sep 12 17:51:27.594358 systemd[1704]: Reached target timers.target - Timers. Sep 12 17:51:27.596711 systemd[1704]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:51:27.613345 google-clock-skew[1699]: INFO Starting Google Clock Skew daemon. Sep 12 17:51:27.630099 google-clock-skew[1699]: INFO Clock drift token has changed: 0. Sep 12 17:51:27.644844 systemd[1704]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:51:27.645406 systemd[1704]: Reached target sockets.target - Sockets. Sep 12 17:51:27.645728 systemd[1704]: Reached target basic.target - Basic System. Sep 12 17:51:27.645817 systemd[1704]: Reached target default.target - Main User Target. Sep 12 17:51:27.645869 systemd[1704]: Startup finished in 445ms. Sep 12 17:51:27.646058 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:51:27.660744 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 17:51:27.661523 google-networking[1700]: INFO Starting Google Networking daemon. Sep 12 17:51:28.000802 systemd-resolved[1377]: Clock change detected. Flushing caches. Sep 12 17:51:28.001750 google-clock-skew[1699]: INFO Synced system time with hardware clock. Sep 12 17:51:28.005085 groupadd[1721]: group added to /etc/group: name=google-sudoers, GID=1000 Sep 12 17:51:28.010629 groupadd[1721]: group added to /etc/gshadow: name=google-sudoers Sep 12 17:51:28.053577 groupadd[1721]: new group: name=google-sudoers, GID=1000 Sep 12 17:51:28.080676 google-accounts[1698]: INFO Starting Google Accounts daemon. Sep 12 17:51:28.096049 google-accounts[1698]: WARNING OS Login not installed. Sep 12 17:51:28.097690 google-accounts[1698]: INFO Creating a new user account for 0. Sep 12 17:51:28.105298 init.sh[1729]: useradd: invalid user name '0': use --badname to ignore Sep 12 17:51:28.103796 google-accounts[1698]: WARNING Could not create user 0. Command '['useradd', '-m', '-s', '/bin/bash', '-p', '*', '0']' returned non-zero exit status 3.. Sep 12 17:51:28.234268 systemd[1]: Started sshd@1-10.128.0.19:22-139.178.68.195:45270.service - OpenSSH per-connection server daemon (139.178.68.195:45270). Sep 12 17:51:28.258530 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:51:28.270924 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 17:51:28.280673 systemd[1]: Startup finished in 3.812s (kernel) + 8.782s (initrd) + 8.446s (userspace) = 21.040s. Sep 12 17:51:28.284509 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:51:28.622751 sshd[1737]: Accepted publickey for core from 139.178.68.195 port 45270 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:51:28.625160 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:51:28.633943 systemd-logind[1487]: New session 2 of user core. Sep 12 17:51:28.638093 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 12 17:51:28.778723 ntpd[1482]: Listen normally on 7 eth0 [fe80::4001:aff:fe80:13%2]:123 Sep 12 17:51:28.779274 ntpd[1482]: 12 Sep 17:51:28 ntpd[1482]: Listen normally on 7 eth0 [fe80::4001:aff:fe80:13%2]:123 Sep 12 17:51:28.888108 sshd[1750]: Connection closed by 139.178.68.195 port 45270 Sep 12 17:51:28.889111 sshd-session[1737]: pam_unix(sshd:session): session closed for user core Sep 12 17:51:28.900088 systemd[1]: sshd@1-10.128.0.19:22-139.178.68.195:45270.service: Deactivated successfully. Sep 12 17:51:28.903235 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 17:51:28.906151 systemd-logind[1487]: Session 2 logged out. Waiting for processes to exit. Sep 12 17:51:28.907930 systemd-logind[1487]: Removed session 2. Sep 12 17:51:28.963386 systemd[1]: Started sshd@2-10.128.0.19:22-139.178.68.195:45278.service - OpenSSH per-connection server daemon (139.178.68.195:45278). Sep 12 17:51:29.120001 kubelet[1739]: E0912 17:51:29.119903 1739 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:51:29.123800 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:51:29.124089 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:51:29.124629 systemd[1]: kubelet.service: Consumed 1.304s CPU time, 266M memory peak. Sep 12 17:51:29.368697 sshd[1757]: Accepted publickey for core from 139.178.68.195 port 45278 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:51:29.370447 sshd-session[1757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:51:29.378645 systemd-logind[1487]: New session 3 of user core. Sep 12 17:51:29.384118 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:51:29.638702 sshd[1761]: Connection closed by 139.178.68.195 port 45278 Sep 12 17:51:29.639949 sshd-session[1757]: pam_unix(sshd:session): session closed for user core Sep 12 17:51:29.645104 systemd[1]: sshd@2-10.128.0.19:22-139.178.68.195:45278.service: Deactivated successfully. Sep 12 17:51:29.647649 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 17:51:29.649799 systemd-logind[1487]: Session 3 logged out. Waiting for processes to exit. Sep 12 17:51:29.651970 systemd-logind[1487]: Removed session 3. Sep 12 17:51:29.701277 systemd[1]: Started sshd@3-10.128.0.19:22-139.178.68.195:45282.service - OpenSSH per-connection server daemon (139.178.68.195:45282). Sep 12 17:51:30.075430 sshd[1767]: Accepted publickey for core from 139.178.68.195 port 45282 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:51:30.077120 sshd-session[1767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:51:30.084748 systemd-logind[1487]: New session 4 of user core. Sep 12 17:51:30.089089 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:51:30.336900 sshd[1770]: Connection closed by 139.178.68.195 port 45282 Sep 12 17:51:30.337925 sshd-session[1767]: pam_unix(sshd:session): session closed for user core Sep 12 17:51:30.344396 systemd[1]: sshd@3-10.128.0.19:22-139.178.68.195:45282.service: Deactivated successfully. Sep 12 17:51:30.347257 systemd[1]: session-4.scope: Deactivated successfully. 
Sep 12 17:51:30.348581 systemd-logind[1487]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:51:30.350671 systemd-logind[1487]: Removed session 4. Sep 12 17:51:30.413072 systemd[1]: Started sshd@4-10.128.0.19:22-139.178.68.195:47642.service - OpenSSH per-connection server daemon (139.178.68.195:47642). Sep 12 17:51:30.817228 sshd[1776]: Accepted publickey for core from 139.178.68.195 port 47642 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:51:30.818999 sshd-session[1776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:51:30.826564 systemd-logind[1487]: New session 5 of user core. Sep 12 17:51:30.832128 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:51:31.056074 sudo[1780]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:51:31.056639 sudo[1780]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:51:31.074342 sudo[1780]: pam_unix(sudo:session): session closed for user root Sep 12 17:51:31.132384 sshd[1779]: Connection closed by 139.178.68.195 port 47642 Sep 12 17:51:31.133475 sshd-session[1776]: pam_unix(sshd:session): session closed for user core Sep 12 17:51:31.138979 systemd[1]: sshd@4-10.128.0.19:22-139.178.68.195:47642.service: Deactivated successfully. Sep 12 17:51:31.141456 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:51:31.144361 systemd-logind[1487]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:51:31.145884 systemd-logind[1487]: Removed session 5. Sep 12 17:51:31.201295 systemd[1]: Started sshd@5-10.128.0.19:22-139.178.68.195:47646.service - OpenSSH per-connection server daemon (139.178.68.195:47646). Sep 12 17:51:31.597073 sshd[1786]: Accepted publickey for core from 139.178.68.195 port 47646 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:51:31.598938 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:51:31.606610 systemd-logind[1487]: New session 6 of user core. Sep 12 17:51:31.614150 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 17:51:31.821536 sudo[1791]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 17:51:31.822160 sudo[1791]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:51:31.829070 sudo[1791]: pam_unix(sudo:session): session closed for user root Sep 12 17:51:31.843087 sudo[1790]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 17:51:31.843580 sudo[1790]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:51:31.856953 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:51:31.911377 augenrules[1813]: No rules Sep 12 17:51:31.912279 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:51:31.912693 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:51:31.914636 sudo[1790]: pam_unix(sudo:session): session closed for user root Sep 12 17:51:31.972328 sshd[1789]: Connection closed by 139.178.68.195 port 47646 Sep 12 17:51:31.973163 sshd-session[1786]: pam_unix(sshd:session): session closed for user core Sep 12 17:51:31.979380 systemd[1]: sshd@5-10.128.0.19:22-139.178.68.195:47646.service: Deactivated successfully. Sep 12 17:51:31.982208 systemd[1]: session-6.scope: Deactivated successfully. 
Sep 12 17:51:31.983966 systemd-logind[1487]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:51:31.985764 systemd-logind[1487]: Removed session 6. Sep 12 17:51:32.046934 systemd[1]: Started sshd@6-10.128.0.19:22-139.178.68.195:47648.service - OpenSSH per-connection server daemon (139.178.68.195:47648). Sep 12 17:51:32.447031 sshd[1822]: Accepted publickey for core from 139.178.68.195 port 47648 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:51:32.448754 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:51:32.456439 systemd-logind[1487]: New session 7 of user core. Sep 12 17:51:32.462104 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 17:51:32.672213 sudo[1826]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:51:32.672720 sudo[1826]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:51:33.120170 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 12 17:51:33.149587 (dockerd)[1843]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:51:33.515543 dockerd[1843]: time="2025-09-12T17:51:33.515067017Z" level=info msg="Starting up" Sep 12 17:51:33.516623 dockerd[1843]: time="2025-09-12T17:51:33.516125382Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 17:51:33.532642 dockerd[1843]: time="2025-09-12T17:51:33.531341204Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 17:51:33.704155 dockerd[1843]: time="2025-09-12T17:51:33.704063446Z" level=info msg="Loading containers: start." Sep 12 17:51:33.724023 kernel: Initializing XFRM netlink socket Sep 12 17:51:34.081383 systemd-networkd[1442]: docker0: Link UP Sep 12 17:51:34.086539 dockerd[1843]: time="2025-09-12T17:51:34.086476118Z" level=info msg="Loading containers: done." Sep 12 17:51:34.107899 dockerd[1843]: time="2025-09-12T17:51:34.105897845Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:51:34.107899 dockerd[1843]: time="2025-09-12T17:51:34.106039531Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 17:51:34.107899 dockerd[1843]: time="2025-09-12T17:51:34.106150441Z" level=info msg="Initializing buildkit" Sep 12 17:51:34.111672 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3959603501-merged.mount: Deactivated successfully. Sep 12 17:51:34.141637 dockerd[1843]: time="2025-09-12T17:51:34.141580788Z" level=info msg="Completed buildkit initialization" Sep 12 17:51:34.146343 dockerd[1843]: time="2025-09-12T17:51:34.146260561Z" level=info msg="Daemon has completed initialization" Sep 12 17:51:34.146343 dockerd[1843]: time="2025-09-12T17:51:34.146326625Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:51:34.146767 systemd[1]: Started docker.service - Docker Application Container Engine. 
Sep 12 17:51:35.036545 containerd[1529]: time="2025-09-12T17:51:35.036468880Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\"" Sep 12 17:51:35.569695 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2323280392.mount: Deactivated successfully. Sep 12 17:51:37.144038 containerd[1529]: time="2025-09-12T17:51:37.143964491Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:37.145552 containerd[1529]: time="2025-09-12T17:51:37.145501578Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28124707" Sep 12 17:51:37.146690 containerd[1529]: time="2025-09-12T17:51:37.146613318Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:37.149804 containerd[1529]: time="2025-09-12T17:51:37.149761063Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:37.151486 containerd[1529]: time="2025-09-12T17:51:37.151275413Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 2.114729653s" Sep 12 17:51:37.151486 containerd[1529]: time="2025-09-12T17:51:37.151326520Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\"" Sep 12 17:51:37.152465 containerd[1529]: time="2025-09-12T17:51:37.152415219Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\"" Sep 12 17:51:38.659542 containerd[1529]: time="2025-09-12T17:51:38.659464265Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:38.661040 containerd[1529]: time="2025-09-12T17:51:38.660985995Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24718566" Sep 12 17:51:38.661924 containerd[1529]: time="2025-09-12T17:51:38.661845103Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:38.665018 containerd[1529]: time="2025-09-12T17:51:38.664951544Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:38.666560 containerd[1529]: time="2025-09-12T17:51:38.666281909Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 
1.513816702s" Sep 12 17:51:38.666560 containerd[1529]: time="2025-09-12T17:51:38.666335055Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\"" Sep 12 17:51:38.667464 containerd[1529]: time="2025-09-12T17:51:38.667415136Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\"" Sep 12 17:51:39.304682 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:51:39.308488 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:51:39.630071 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:51:39.643892 (kubelet)[2126]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:51:39.750245 kubelet[2126]: E0912 17:51:39.749844 2126 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:51:39.756785 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:51:39.757047 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:51:39.758839 systemd[1]: kubelet.service: Consumed 255ms CPU time, 108.2M memory peak. Sep 12 17:51:40.306592 containerd[1529]: time="2025-09-12T17:51:40.306508181Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:40.308202 containerd[1529]: time="2025-09-12T17:51:40.308031590Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18789614" Sep 12 17:51:40.309911 containerd[1529]: time="2025-09-12T17:51:40.309255479Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:40.313140 containerd[1529]: time="2025-09-12T17:51:40.312573765Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:40.314774 containerd[1529]: time="2025-09-12T17:51:40.314701910Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 1.647247131s" Sep 12 17:51:40.314774 containerd[1529]: time="2025-09-12T17:51:40.314777858Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\"" Sep 12 17:51:40.315711 containerd[1529]: time="2025-09-12T17:51:40.315642796Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\"" Sep 12 17:51:41.397116 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3657541064.mount: Deactivated successfully. 
Sep 12 17:51:42.076648 containerd[1529]: time="2025-09-12T17:51:42.076580999Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:42.078134 containerd[1529]: time="2025-09-12T17:51:42.077873175Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30412147" Sep 12 17:51:42.079254 containerd[1529]: time="2025-09-12T17:51:42.079208799Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:42.081789 containerd[1529]: time="2025-09-12T17:51:42.081744066Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:42.082635 containerd[1529]: time="2025-09-12T17:51:42.082590904Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 1.766727614s" Sep 12 17:51:42.082756 containerd[1529]: time="2025-09-12T17:51:42.082643009Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\"" Sep 12 17:51:42.083565 containerd[1529]: time="2025-09-12T17:51:42.083512440Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 17:51:42.535815 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2403588565.mount: Deactivated successfully. 
Sep 12 17:51:43.704602 containerd[1529]: time="2025-09-12T17:51:43.704529092Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:43.706144 containerd[1529]: time="2025-09-12T17:51:43.706026207Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18571883" Sep 12 17:51:43.707237 containerd[1529]: time="2025-09-12T17:51:43.707195615Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:43.711166 containerd[1529]: time="2025-09-12T17:51:43.710579743Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:43.712049 containerd[1529]: time="2025-09-12T17:51:43.712005795Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.628455072s" Sep 12 17:51:43.712172 containerd[1529]: time="2025-09-12T17:51:43.712056251Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 17:51:43.712675 containerd[1529]: time="2025-09-12T17:51:43.712642188Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 17:51:44.121784 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1666727555.mount: Deactivated successfully. 
Sep 12 17:51:44.126612 containerd[1529]: time="2025-09-12T17:51:44.126549013Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:51:44.127605 containerd[1529]: time="2025-09-12T17:51:44.127545102Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=322072" Sep 12 17:51:44.129880 containerd[1529]: time="2025-09-12T17:51:44.128616692Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:51:44.131304 containerd[1529]: time="2025-09-12T17:51:44.131269199Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:51:44.132195 containerd[1529]: time="2025-09-12T17:51:44.132156848Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 419.469449ms" Sep 12 17:51:44.132304 containerd[1529]: time="2025-09-12T17:51:44.132202226Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 17:51:44.132984 containerd[1529]: time="2025-09-12T17:51:44.132819752Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Sep 12 17:51:44.564629 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2722105558.mount: Deactivated successfully. 
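These pulls are driven by the kubelet through containerd's CRI image service. If the same images ever need to be inspected or pre-pulled by hand, crictl can be pointed at the same runtime; the socket path below is containerd's stock default and is an assumption, since the endpoint actually configured for this kubelet is not printed in the log.

  crictl --runtime-endpoint unix:///run/containerd/containerd.sock images | grep registry.k8s.io
  crictl --runtime-endpoint unix:///run/containerd/containerd.sock pull registry.k8s.io/etcd:3.5.15-0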
Sep 12 17:51:46.818761 containerd[1529]: time="2025-09-12T17:51:46.818691019Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:46.820364 containerd[1529]: time="2025-09-12T17:51:46.820303221Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56918218" Sep 12 17:51:46.821982 containerd[1529]: time="2025-09-12T17:51:46.821331263Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:46.824723 containerd[1529]: time="2025-09-12T17:51:46.824681785Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:51:46.826246 containerd[1529]: time="2025-09-12T17:51:46.826199977Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.693338274s" Sep 12 17:51:46.826425 containerd[1529]: time="2025-09-12T17:51:46.826398423Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Sep 12 17:51:49.804976 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 17:51:49.807400 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:51:50.170055 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:51:50.181473 (kubelet)[2281]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:51:50.260207 kubelet[2281]: E0912 17:51:50.260135 2281 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:51:50.262850 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:51:50.263117 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:51:50.263602 systemd[1]: kubelet.service: Consumed 256ms CPU time, 108.7M memory peak. Sep 12 17:51:50.446478 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:51:50.446922 systemd[1]: kubelet.service: Consumed 256ms CPU time, 108.7M memory peak. Sep 12 17:51:50.450517 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:51:50.492554 systemd[1]: Reload requested from client PID 2295 ('systemctl') (unit session-7.scope)... Sep 12 17:51:50.492587 systemd[1]: Reloading... Sep 12 17:51:50.698906 zram_generator::config[2343]: No configuration found. Sep 12 17:51:51.018760 systemd[1]: Reloading finished in 525 ms. Sep 12 17:51:51.092763 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 17:51:51.092967 systemd[1]: kubelet.service: Failed with result 'signal'. 
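The start at 17:51:50 fails simply because /var/lib/kubelet/config.yaml does not exist yet; on a kubeadm-provisioned node that file is written during kubeadm init or kubeadm join, which evidently has not completed at this point. Purely as a sketch, a minimal config consistent with what the kubelet reports once it does come up later in this log (systemd cgroup driver, static pods under /etc/kubernetes/manifests, client CA at /etc/kubernetes/pki/ca.crt, client certificate rotation) would look roughly like:

  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  cgroupDriver: systemd
  staticPodPath: /etc/kubernetes/manifests
  rotateCertificates: true
  authentication:
    x509:
      clientCAFile: /etc/kubernetes/pki/ca.crt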
Sep 12 17:51:51.093907 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:51:51.094014 systemd[1]: kubelet.service: Consumed 161ms CPU time, 98.3M memory peak. Sep 12 17:51:51.097398 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:51:51.719431 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:51:51.730560 (kubelet)[2390]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:51:51.793889 kubelet[2390]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:51:51.793889 kubelet[2390]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 17:51:51.793889 kubelet[2390]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:51:51.793889 kubelet[2390]: I0912 17:51:51.793193 2390 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:51:52.161417 kubelet[2390]: I0912 17:51:52.161358 2390 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 17:51:52.161417 kubelet[2390]: I0912 17:51:52.161410 2390 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:51:52.162086 kubelet[2390]: I0912 17:51:52.162053 2390 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 17:51:52.200760 kubelet[2390]: E0912 17:51:52.200681 2390 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.128.0.19:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:51:52.209402 kubelet[2390]: I0912 17:51:52.209162 2390 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:51:52.218764 kubelet[2390]: I0912 17:51:52.218726 2390 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:51:52.224318 kubelet[2390]: I0912 17:51:52.224261 2390 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:51:52.225396 kubelet[2390]: I0912 17:51:52.225351 2390 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 17:51:52.225686 kubelet[2390]: I0912 17:51:52.225631 2390 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:51:52.225956 kubelet[2390]: I0912 17:51:52.225676 2390 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:51:52.226137 kubelet[2390]: I0912 17:51:52.225965 2390 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:51:52.226137 kubelet[2390]: I0912 17:51:52.225983 2390 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 17:51:52.226239 kubelet[2390]: I0912 17:51:52.226153 2390 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:51:52.230448 kubelet[2390]: I0912 17:51:52.230045 2390 kubelet.go:408] "Attempting to sync node with API server" Sep 12 17:51:52.230448 kubelet[2390]: I0912 17:51:52.230083 2390 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:51:52.230448 kubelet[2390]: I0912 17:51:52.230140 2390 kubelet.go:314] "Adding apiserver pod source" Sep 12 17:51:52.230448 kubelet[2390]: I0912 17:51:52.230169 2390 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:51:52.238008 kubelet[2390]: W0912 17:51:52.237937 2390 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.128.0.19:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal&limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 12 17:51:52.238274 kubelet[2390]: E0912 17:51:52.238222 2390 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.128.0.19:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal&limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:51:52.239816 kubelet[2390]: W0912 17:51:52.239742 2390 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.128.0.19:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 12 17:51:52.240081 kubelet[2390]: E0912 17:51:52.240032 2390 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.128.0.19:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:51:52.240395 kubelet[2390]: I0912 17:51:52.240375 2390 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:51:52.241190 kubelet[2390]: I0912 17:51:52.241164 2390 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:51:52.242444 kubelet[2390]: W0912 17:51:52.242398 2390 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 17:51:52.245144 kubelet[2390]: I0912 17:51:52.244969 2390 server.go:1274] "Started kubelet" Sep 12 17:51:52.254643 kubelet[2390]: I0912 17:51:52.254600 2390 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:51:52.256733 kubelet[2390]: I0912 17:51:52.256335 2390 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:51:52.259332 kubelet[2390]: I0912 17:51:52.259308 2390 server.go:449] "Adding debug handlers to kubelet server" Sep 12 17:51:52.265994 kubelet[2390]: I0912 17:51:52.265626 2390 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:51:52.268991 kubelet[2390]: I0912 17:51:52.268951 2390 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 17:51:52.269337 kubelet[2390]: E0912 17:51:52.269302 2390 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" not found" Sep 12 17:51:52.270446 kubelet[2390]: I0912 17:51:52.270128 2390 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 17:51:52.270446 kubelet[2390]: I0912 17:51:52.270229 2390 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:51:52.274736 kubelet[2390]: W0912 17:51:52.274663 2390 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.128.0.19:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 12 17:51:52.274847 kubelet[2390]: E0912 17:51:52.274755 2390 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.128.0.19:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:51:52.274942 kubelet[2390]: E0912 
17:51:52.274851 2390 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.19:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal?timeout=10s\": dial tcp 10.128.0.19:6443: connect: connection refused" interval="200ms" Sep 12 17:51:52.275537 kubelet[2390]: I0912 17:51:52.275469 2390 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:51:52.275799 kubelet[2390]: I0912 17:51:52.275757 2390 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:51:52.279371 kubelet[2390]: E0912 17:51:52.274971 2390 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.128.0.19:6443/api/v1/namespaces/default/events\": dial tcp 10.128.0.19:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal.18649a64c93c8bd6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,UID:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,},FirstTimestamp:2025-09-12 17:51:52.244931542 +0000 UTC m=+0.507498571,LastTimestamp:2025-09-12 17:51:52.244931542 +0000 UTC m=+0.507498571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,}" Sep 12 17:51:52.279591 kubelet[2390]: I0912 17:51:52.279566 2390 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:51:52.279696 kubelet[2390]: I0912 17:51:52.279670 2390 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:51:52.282568 kubelet[2390]: I0912 17:51:52.281990 2390 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:51:52.298151 kubelet[2390]: I0912 17:51:52.298060 2390 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:51:52.299813 kubelet[2390]: I0912 17:51:52.299754 2390 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 17:51:52.299813 kubelet[2390]: I0912 17:51:52.299789 2390 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 17:51:52.300046 kubelet[2390]: I0912 17:51:52.299823 2390 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 17:51:52.300046 kubelet[2390]: E0912 17:51:52.299917 2390 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:51:52.308571 kubelet[2390]: W0912 17:51:52.308139 2390 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 12 17:51:52.308571 kubelet[2390]: E0912 17:51:52.308258 2390 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:51:52.309115 kubelet[2390]: E0912 17:51:52.309051 2390 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:51:52.326444 kubelet[2390]: I0912 17:51:52.326394 2390 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 17:51:52.326444 kubelet[2390]: I0912 17:51:52.326417 2390 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 17:51:52.326444 kubelet[2390]: I0912 17:51:52.326444 2390 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:51:52.328810 kubelet[2390]: I0912 17:51:52.328762 2390 policy_none.go:49] "None policy: Start" Sep 12 17:51:52.330063 kubelet[2390]: I0912 17:51:52.330032 2390 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 17:51:52.330063 kubelet[2390]: I0912 17:51:52.330065 2390 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:51:52.340334 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 17:51:52.354545 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 17:51:52.360388 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Sep 12 17:51:52.369744 kubelet[2390]: E0912 17:51:52.369708 2390 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" not found" Sep 12 17:51:52.371890 kubelet[2390]: I0912 17:51:52.371643 2390 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:51:52.373421 kubelet[2390]: I0912 17:51:52.373396 2390 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:51:52.374030 kubelet[2390]: I0912 17:51:52.373904 2390 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:51:52.375063 kubelet[2390]: I0912 17:51:52.374945 2390 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:51:52.378135 kubelet[2390]: E0912 17:51:52.378104 2390 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" not found" Sep 12 17:51:52.425842 systemd[1]: Created slice kubepods-burstable-podfd5d491c7820612ed2a45fcea9296e7a.slice - libcontainer container kubepods-burstable-podfd5d491c7820612ed2a45fcea9296e7a.slice. Sep 12 17:51:52.444710 systemd[1]: Created slice kubepods-burstable-pod1bf9be6b187b0476e998564ddc97f64a.slice - libcontainer container kubepods-burstable-pod1bf9be6b187b0476e998564ddc97f64a.slice. Sep 12 17:51:52.461410 systemd[1]: Created slice kubepods-burstable-pod6b5c38a769466186eb9815e604c23285.slice - libcontainer container kubepods-burstable-pod6b5c38a769466186eb9815e604c23285.slice. Sep 12 17:51:52.471150 kubelet[2390]: I0912 17:51:52.471087 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-kubeconfig\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471150 kubelet[2390]: I0912 17:51:52.471143 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd5d491c7820612ed2a45fcea9296e7a-kubeconfig\") pod \"kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"fd5d491c7820612ed2a45fcea9296e7a\") " pod="kube-system/kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471387 kubelet[2390]: I0912 17:51:52.471173 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1bf9be6b187b0476e998564ddc97f64a-ca-certs\") pod \"kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"1bf9be6b187b0476e998564ddc97f64a\") " pod="kube-system/kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471387 kubelet[2390]: I0912 17:51:52.471198 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1bf9be6b187b0476e998564ddc97f64a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"1bf9be6b187b0476e998564ddc97f64a\") " 
pod="kube-system/kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471387 kubelet[2390]: I0912 17:51:52.471224 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-ca-certs\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471387 kubelet[2390]: I0912 17:51:52.471248 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-k8s-certs\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471567 kubelet[2390]: I0912 17:51:52.471276 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471567 kubelet[2390]: I0912 17:51:52.471305 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1bf9be6b187b0476e998564ddc97f64a-k8s-certs\") pod \"kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"1bf9be6b187b0476e998564ddc97f64a\") " pod="kube-system/kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.471567 kubelet[2390]: I0912 17:51:52.471331 2390 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.475397 kubelet[2390]: E0912 17:51:52.475303 2390 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.19:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal?timeout=10s\": dial tcp 10.128.0.19:6443: connect: connection refused" interval="400ms" Sep 12 17:51:52.480534 kubelet[2390]: I0912 17:51:52.480498 2390 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.481001 kubelet[2390]: E0912 17:51:52.480948 2390 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.128.0.19:6443/api/v1/nodes\": dial tcp 10.128.0.19:6443: connect: connection refused" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.687449 kubelet[2390]: I0912 17:51:52.687293 2390 
kubelet_node_status.go:72] "Attempting to register node" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.688098 kubelet[2390]: E0912 17:51:52.687915 2390 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.128.0.19:6443/api/v1/nodes\": dial tcp 10.128.0.19:6443: connect: connection refused" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:52.742447 containerd[1529]: time="2025-09-12T17:51:52.742394698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,Uid:fd5d491c7820612ed2a45fcea9296e7a,Namespace:kube-system,Attempt:0,}" Sep 12 17:51:52.766278 containerd[1529]: time="2025-09-12T17:51:52.766149785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,Uid:6b5c38a769466186eb9815e604c23285,Namespace:kube-system,Attempt:0,}" Sep 12 17:51:52.779263 containerd[1529]: time="2025-09-12T17:51:52.779203755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,Uid:1bf9be6b187b0476e998564ddc97f64a,Namespace:kube-system,Attempt:0,}" Sep 12 17:51:52.785501 containerd[1529]: time="2025-09-12T17:51:52.785421324Z" level=info msg="connecting to shim 92e841382f953429bdc09000002eec7d8ad028beab36865862ee1bffad45cd9b" address="unix:///run/containerd/s/0179ca13b239b252861ad1b3ed18a1e1dab2fe51f6eb8ef176ca6fdccff3283c" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:52.856133 systemd[1]: Started cri-containerd-92e841382f953429bdc09000002eec7d8ad028beab36865862ee1bffad45cd9b.scope - libcontainer container 92e841382f953429bdc09000002eec7d8ad028beab36865862ee1bffad45cd9b. Sep 12 17:51:52.858125 containerd[1529]: time="2025-09-12T17:51:52.857097208Z" level=info msg="connecting to shim ab452f32518fe98a68aa3eb0e60f7e4f86ceefdf8baf1ef5a7c04081506aaf42" address="unix:///run/containerd/s/7b4fd4bc83803f01e8b64d1e0569bfacfa7b85a57ff7114448f5585a2869d09b" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:52.861449 containerd[1529]: time="2025-09-12T17:51:52.861396666Z" level=info msg="connecting to shim 54c0278014d536903a82163e5b4726552bda54a13f0cc05558238fb9bb187ce2" address="unix:///run/containerd/s/1845cf54b37f30735177e9ccaa6ffd8ef0c43d99a32e92e179eaff5a60a477da" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:51:52.876083 kubelet[2390]: E0912 17:51:52.876030 2390 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.128.0.19:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal?timeout=10s\": dial tcp 10.128.0.19:6443: connect: connection refused" interval="800ms" Sep 12 17:51:52.927280 systemd[1]: Started cri-containerd-ab452f32518fe98a68aa3eb0e60f7e4f86ceefdf8baf1ef5a7c04081506aaf42.scope - libcontainer container ab452f32518fe98a68aa3eb0e60f7e4f86ceefdf8baf1ef5a7c04081506aaf42. Sep 12 17:51:52.939257 systemd[1]: Started cri-containerd-54c0278014d536903a82163e5b4726552bda54a13f0cc05558238fb9bb187ce2.scope - libcontainer container 54c0278014d536903a82163e5b4726552bda54a13f0cc05558238fb9bb187ce2. 
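The kubeconfig, ca-certs, k8s-certs, usr-share-ca-certificates and flexvolume-dir volumes attached a moment earlier are plain hostPath mounts declared in the static pod manifests under /etc/kubernetes/manifests. The manifests themselves are not reproduced in this log, so the stanza below is only a sketch of the usual kubeadm layout for kube-controller-manager; of these paths, only the flexvolume directory is confirmed above, where the kubelet recreates /opt/libexec/kubernetes/kubelet-plugins/volume/exec.

  volumes:
  - name: kubeconfig
    hostPath:
      path: /etc/kubernetes/controller-manager.conf
      type: FileOrCreate
  - name: ca-certs
    hostPath:
      path: /etc/ssl/certs
      type: DirectoryOrCreate
  - name: k8s-certs
    hostPath:
      path: /etc/kubernetes/pki
      type: DirectoryOrCreate
  - name: usr-share-ca-certificates
    hostPath:
      path: /usr/share/ca-certificates
      type: DirectoryOrCreate
  - name: flexvolume-dir
    hostPath:
      path: /opt/libexec/kubernetes/kubelet-plugins/volume/exec
      type: DirectoryOrCreate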
Sep 12 17:51:53.059656 containerd[1529]: time="2025-09-12T17:51:53.059604957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,Uid:1bf9be6b187b0476e998564ddc97f64a,Namespace:kube-system,Attempt:0,} returns sandbox id \"54c0278014d536903a82163e5b4726552bda54a13f0cc05558238fb9bb187ce2\"" Sep 12 17:51:53.064568 kubelet[2390]: E0912 17:51:53.064516 2390 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" hostnameMaxLen=63 truncatedHostname="kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-21291" Sep 12 17:51:53.068390 containerd[1529]: time="2025-09-12T17:51:53.067719815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,Uid:fd5d491c7820612ed2a45fcea9296e7a,Namespace:kube-system,Attempt:0,} returns sandbox id \"92e841382f953429bdc09000002eec7d8ad028beab36865862ee1bffad45cd9b\"" Sep 12 17:51:53.069478 containerd[1529]: time="2025-09-12T17:51:53.069201801Z" level=info msg="CreateContainer within sandbox \"54c0278014d536903a82163e5b4726552bda54a13f0cc05558238fb9bb187ce2\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:51:53.072482 kubelet[2390]: E0912 17:51:53.072423 2390 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" hostnameMaxLen=63 truncatedHostname="kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-21291" Sep 12 17:51:53.075754 containerd[1529]: time="2025-09-12T17:51:53.075716946Z" level=info msg="CreateContainer within sandbox \"92e841382f953429bdc09000002eec7d8ad028beab36865862ee1bffad45cd9b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:51:53.084033 containerd[1529]: time="2025-09-12T17:51:53.083979988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal,Uid:6b5c38a769466186eb9815e604c23285,Namespace:kube-system,Attempt:0,} returns sandbox id \"ab452f32518fe98a68aa3eb0e60f7e4f86ceefdf8baf1ef5a7c04081506aaf42\"" Sep 12 17:51:53.087305 kubelet[2390]: E0912 17:51:53.087227 2390 kubelet_pods.go:538] "Hostname for pod was too long, truncated it" podName="kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" hostnameMaxLen=63 truncatedHostname="kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flat" Sep 12 17:51:53.088649 containerd[1529]: time="2025-09-12T17:51:53.088571276Z" level=info msg="Container 76ee36c7467bf583251382ddefa5006f406905b9ce17e866093167ae47e88565: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:53.091923 containerd[1529]: time="2025-09-12T17:51:53.091801406Z" level=info msg="CreateContainer within sandbox \"ab452f32518fe98a68aa3eb0e60f7e4f86ceefdf8baf1ef5a7c04081506aaf42\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:51:53.093486 containerd[1529]: time="2025-09-12T17:51:53.093429338Z" level=info msg="Container e84b952e96f15f2d13294a41dbcee0fbd67d8f2c11c4a6701c3a4ce2d280a6d6: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:53.097173 kubelet[2390]: I0912 17:51:53.096721 2390 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:53.097722 kubelet[2390]: E0912 
17:51:53.097602 2390 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.128.0.19:6443/api/v1/nodes\": dial tcp 10.128.0.19:6443: connect: connection refused" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:53.099976 containerd[1529]: time="2025-09-12T17:51:53.099668332Z" level=info msg="CreateContainer within sandbox \"54c0278014d536903a82163e5b4726552bda54a13f0cc05558238fb9bb187ce2\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"76ee36c7467bf583251382ddefa5006f406905b9ce17e866093167ae47e88565\"" Sep 12 17:51:53.100772 containerd[1529]: time="2025-09-12T17:51:53.100690423Z" level=info msg="StartContainer for \"76ee36c7467bf583251382ddefa5006f406905b9ce17e866093167ae47e88565\"" Sep 12 17:51:53.103071 containerd[1529]: time="2025-09-12T17:51:53.102941536Z" level=info msg="connecting to shim 76ee36c7467bf583251382ddefa5006f406905b9ce17e866093167ae47e88565" address="unix:///run/containerd/s/1845cf54b37f30735177e9ccaa6ffd8ef0c43d99a32e92e179eaff5a60a477da" protocol=ttrpc version=3 Sep 12 17:51:53.107766 containerd[1529]: time="2025-09-12T17:51:53.107624781Z" level=info msg="CreateContainer within sandbox \"92e841382f953429bdc09000002eec7d8ad028beab36865862ee1bffad45cd9b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"e84b952e96f15f2d13294a41dbcee0fbd67d8f2c11c4a6701c3a4ce2d280a6d6\"" Sep 12 17:51:53.108934 containerd[1529]: time="2025-09-12T17:51:53.108438140Z" level=info msg="StartContainer for \"e84b952e96f15f2d13294a41dbcee0fbd67d8f2c11c4a6701c3a4ce2d280a6d6\"" Sep 12 17:51:53.109086 containerd[1529]: time="2025-09-12T17:51:53.109055787Z" level=info msg="Container a0eef932f343813ac2443db6bacb23091378d35d912631eee453b6e7fe138d90: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:51:53.110250 containerd[1529]: time="2025-09-12T17:51:53.110215344Z" level=info msg="connecting to shim e84b952e96f15f2d13294a41dbcee0fbd67d8f2c11c4a6701c3a4ce2d280a6d6" address="unix:///run/containerd/s/0179ca13b239b252861ad1b3ed18a1e1dab2fe51f6eb8ef176ca6fdccff3283c" protocol=ttrpc version=3 Sep 12 17:51:53.123754 containerd[1529]: time="2025-09-12T17:51:53.123697253Z" level=info msg="CreateContainer within sandbox \"ab452f32518fe98a68aa3eb0e60f7e4f86ceefdf8baf1ef5a7c04081506aaf42\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a0eef932f343813ac2443db6bacb23091378d35d912631eee453b6e7fe138d90\"" Sep 12 17:51:53.124946 containerd[1529]: time="2025-09-12T17:51:53.124910401Z" level=info msg="StartContainer for \"a0eef932f343813ac2443db6bacb23091378d35d912631eee453b6e7fe138d90\"" Sep 12 17:51:53.131262 containerd[1529]: time="2025-09-12T17:51:53.131218329Z" level=info msg="connecting to shim a0eef932f343813ac2443db6bacb23091378d35d912631eee453b6e7fe138d90" address="unix:///run/containerd/s/7b4fd4bc83803f01e8b64d1e0569bfacfa7b85a57ff7114448f5585a2869d09b" protocol=ttrpc version=3 Sep 12 17:51:53.143815 systemd[1]: Started cri-containerd-76ee36c7467bf583251382ddefa5006f406905b9ce17e866093167ae47e88565.scope - libcontainer container 76ee36c7467bf583251382ddefa5006f406905b9ce17e866093167ae47e88565. Sep 12 17:51:53.158391 systemd[1]: Started cri-containerd-e84b952e96f15f2d13294a41dbcee0fbd67d8f2c11c4a6701c3a4ce2d280a6d6.scope - libcontainer container e84b952e96f15f2d13294a41dbcee0fbd67d8f2c11c4a6701c3a4ce2d280a6d6. 
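The "Hostname for pod was too long, truncated it" warnings reflect the kubelet's hostnameMaxLen=63, the same 63-character ceiling that applies to a DNS label: "kube-apiserver-" plus this node's long GCE-style name overshoots it, so the name is cut at exactly 63 characters. A throwaway check of the truncated value, just to make the limit concrete:

  printf '%s' 'kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-21291' | wc -c   # 63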
Sep 12 17:51:53.184127 systemd[1]: Started cri-containerd-a0eef932f343813ac2443db6bacb23091378d35d912631eee453b6e7fe138d90.scope - libcontainer container a0eef932f343813ac2443db6bacb23091378d35d912631eee453b6e7fe138d90. Sep 12 17:51:53.221010 kubelet[2390]: W0912 17:51:53.219838 2390 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.128.0.19:6443: connect: connection refused Sep 12 17:51:53.221010 kubelet[2390]: E0912 17:51:53.219957 2390 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.128.0.19:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.128.0.19:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:51:53.269400 containerd[1529]: time="2025-09-12T17:51:53.269069526Z" level=info msg="StartContainer for \"76ee36c7467bf583251382ddefa5006f406905b9ce17e866093167ae47e88565\" returns successfully" Sep 12 17:51:53.355573 containerd[1529]: time="2025-09-12T17:51:53.355512829Z" level=info msg="StartContainer for \"a0eef932f343813ac2443db6bacb23091378d35d912631eee453b6e7fe138d90\" returns successfully" Sep 12 17:51:53.370762 containerd[1529]: time="2025-09-12T17:51:53.370624940Z" level=info msg="StartContainer for \"e84b952e96f15f2d13294a41dbcee0fbd67d8f2c11c4a6701c3a4ce2d280a6d6\" returns successfully" Sep 12 17:51:53.906193 kubelet[2390]: I0912 17:51:53.906135 2390 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:56.356975 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Sep 12 17:51:57.541344 kubelet[2390]: E0912 17:51:57.541253 2390 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" not found" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:57.634995 kubelet[2390]: I0912 17:51:57.634719 2390 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:51:58.241163 kubelet[2390]: I0912 17:51:58.241109 2390 apiserver.go:52] "Watching apiserver" Sep 12 17:51:58.271013 kubelet[2390]: I0912 17:51:58.270942 2390 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 17:51:59.495727 kubelet[2390]: W0912 17:51:59.495236 2390 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters must not contain dots] Sep 12 17:51:59.703166 systemd[1]: Reload requested from client PID 2666 ('systemctl') (unit session-7.scope)... Sep 12 17:51:59.703193 systemd[1]: Reloading... Sep 12 17:51:59.877901 zram_generator::config[2710]: No configuration found. Sep 12 17:52:00.194079 systemd[1]: Reloading finished in 490 ms. Sep 12 17:52:00.240310 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:52:00.257936 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:52:00.258325 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:52:00.258440 systemd[1]: kubelet.service: Consumed 1.032s CPU time, 130.3M memory peak. 
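The registration attempts stop failing once the kube-apiserver container started above begins answering on 10.128.0.19:6443; by 17:51:57 the node is registered, and the reload that follows restarts the kubelet under its final unit configuration. From the node itself the same state could be confirmed roughly as follows, assuming the usual kubeadm admin kubeconfig path, which this log does not show:

  export KUBECONFIG=/etc/kubernetes/admin.conf
  kubectl get nodes -o wide
  kubectl -n kube-system get pods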
Sep 12 17:52:00.261214 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:52:00.599703 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:52:00.614813 (kubelet)[2758]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:52:00.706901 kubelet[2758]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:52:00.706901 kubelet[2758]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 17:52:00.706901 kubelet[2758]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:52:00.706901 kubelet[2758]: I0912 17:52:00.706458 2758 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:52:00.718675 kubelet[2758]: I0912 17:52:00.718629 2758 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 17:52:00.718933 kubelet[2758]: I0912 17:52:00.718912 2758 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:52:00.719613 kubelet[2758]: I0912 17:52:00.719518 2758 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 17:52:00.723385 kubelet[2758]: I0912 17:52:00.723314 2758 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 17:52:00.735649 kubelet[2758]: I0912 17:52:00.734198 2758 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:52:00.743845 kubelet[2758]: I0912 17:52:00.743807 2758 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:52:00.752236 kubelet[2758]: I0912 17:52:00.752182 2758 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:52:00.752407 kubelet[2758]: I0912 17:52:00.752355 2758 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 17:52:00.752817 kubelet[2758]: I0912 17:52:00.752544 2758 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:52:00.753068 kubelet[2758]: I0912 17:52:00.752612 2758 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:52:00.753244 kubelet[2758]: I0912 17:52:00.753120 2758 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:52:00.753244 kubelet[2758]: I0912 17:52:00.753144 2758 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 17:52:00.753244 kubelet[2758]: I0912 17:52:00.753190 2758 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:52:00.753406 kubelet[2758]: I0912 17:52:00.753372 2758 kubelet.go:408] "Attempting to sync node with API server" Sep 12 17:52:00.754564 kubelet[2758]: I0912 17:52:00.753393 2758 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:52:00.754564 kubelet[2758]: I0912 17:52:00.754309 2758 kubelet.go:314] "Adding apiserver pod source" Sep 12 17:52:00.754564 kubelet[2758]: I0912 17:52:00.754326 2758 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:52:00.768722 kubelet[2758]: I0912 17:52:00.767628 2758 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:52:00.768722 kubelet[2758]: I0912 17:52:00.768286 2758 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:52:00.770271 kubelet[2758]: I0912 17:52:00.769927 2758 server.go:1274] "Started kubelet" Sep 12 17:52:00.774173 kubelet[2758]: I0912 17:52:00.773813 2758 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 
12 17:52:00.793584 kubelet[2758]: I0912 17:52:00.793525 2758 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 17:52:00.795074 kubelet[2758]: I0912 17:52:00.794557 2758 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 17:52:00.795074 kubelet[2758]: I0912 17:52:00.794737 2758 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:52:00.799913 kubelet[2758]: I0912 17:52:00.799766 2758 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:52:00.802372 kubelet[2758]: I0912 17:52:00.801849 2758 server.go:449] "Adding debug handlers to kubelet server" Sep 12 17:52:00.803402 kubelet[2758]: I0912 17:52:00.803296 2758 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:52:00.803590 kubelet[2758]: I0912 17:52:00.803556 2758 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:52:00.803948 kubelet[2758]: I0912 17:52:00.803918 2758 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:52:00.805736 kubelet[2758]: I0912 17:52:00.804807 2758 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:52:00.816420 kubelet[2758]: I0912 17:52:00.815993 2758 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:52:00.817316 kubelet[2758]: E0912 17:52:00.817283 2758 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:52:00.821551 kubelet[2758]: I0912 17:52:00.821444 2758 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:52:00.821748 kubelet[2758]: I0912 17:52:00.821730 2758 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:52:00.824638 kubelet[2758]: I0912 17:52:00.824098 2758 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 17:52:00.824638 kubelet[2758]: I0912 17:52:00.824132 2758 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 17:52:00.824638 kubelet[2758]: I0912 17:52:00.824158 2758 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 17:52:00.824638 kubelet[2758]: E0912 17:52:00.824235 2758 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:52:00.927909 kubelet[2758]: E0912 17:52:00.926912 2758 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Sep 12 17:52:00.931716 kubelet[2758]: I0912 17:52:00.931675 2758 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 17:52:00.932258 kubelet[2758]: I0912 17:52:00.931950 2758 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 17:52:00.932258 kubelet[2758]: I0912 17:52:00.931982 2758 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:52:00.932258 kubelet[2758]: I0912 17:52:00.932178 2758 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 17:52:00.932258 kubelet[2758]: I0912 17:52:00.932191 2758 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 17:52:00.932258 kubelet[2758]: I0912 17:52:00.932211 2758 policy_none.go:49] "None policy: Start" Sep 12 17:52:00.933907 kubelet[2758]: I0912 17:52:00.933480 2758 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 17:52:00.933907 kubelet[2758]: I0912 17:52:00.933509 2758 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:52:00.933907 kubelet[2758]: I0912 17:52:00.933732 2758 state_mem.go:75] "Updated machine memory state" Sep 12 17:52:00.944308 kubelet[2758]: I0912 17:52:00.944262 2758 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:52:00.945630 kubelet[2758]: I0912 17:52:00.945584 2758 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:52:00.946212 kubelet[2758]: I0912 17:52:00.945604 2758 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:52:00.948882 kubelet[2758]: I0912 17:52:00.947776 2758 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:52:01.067531 kubelet[2758]: I0912 17:52:01.067354 2758 kubelet_node_status.go:72] "Attempting to register node" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.081877 kubelet[2758]: I0912 17:52:01.081419 2758 kubelet_node_status.go:111] "Node was previously registered" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.081877 kubelet[2758]: I0912 17:52:01.081529 2758 kubelet_node_status.go:75] "Successfully registered node" node="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.137687 kubelet[2758]: W0912 17:52:01.137640 2758 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters must not contain dots] Sep 12 17:52:01.142216 kubelet[2758]: W0912 17:52:01.142175 2758 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters must not contain dots] Sep 12 17:52:01.142492 kubelet[2758]: E0912 17:52:01.142269 2758 kubelet.go:1915] "Failed creating a 
mirror pod for" err="pods \"kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" already exists" pod="kube-system/kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.142877 kubelet[2758]: W0912 17:52:01.142622 2758 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must be no more than 63 characters must not contain dots] Sep 12 17:52:01.197735 kubelet[2758]: I0912 17:52:01.196570 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1bf9be6b187b0476e998564ddc97f64a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"1bf9be6b187b0476e998564ddc97f64a\") " pod="kube-system/kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.197735 kubelet[2758]: I0912 17:52:01.196646 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-ca-certs\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.197735 kubelet[2758]: I0912 17:52:01.196684 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.197735 kubelet[2758]: I0912 17:52:01.196717 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-kubeconfig\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.198248 kubelet[2758]: I0912 17:52:01.196749 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fd5d491c7820612ed2a45fcea9296e7a-kubeconfig\") pod \"kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"fd5d491c7820612ed2a45fcea9296e7a\") " pod="kube-system/kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.198248 kubelet[2758]: I0912 17:52:01.196780 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1bf9be6b187b0476e998564ddc97f64a-ca-certs\") pod \"kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"1bf9be6b187b0476e998564ddc97f64a\") " pod="kube-system/kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.198248 kubelet[2758]: I0912 17:52:01.196809 2758 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1bf9be6b187b0476e998564ddc97f64a-k8s-certs\") pod \"kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"1bf9be6b187b0476e998564ddc97f64a\") " pod="kube-system/kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.198248 kubelet[2758]: I0912 17:52:01.196838 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-k8s-certs\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.198479 kubelet[2758]: I0912 17:52:01.196894 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6b5c38a769466186eb9815e604c23285-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" (UID: \"6b5c38a769466186eb9815e604c23285\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:01.757837 kubelet[2758]: I0912 17:52:01.757771 2758 apiserver.go:52] "Watching apiserver" Sep 12 17:52:01.795287 kubelet[2758]: I0912 17:52:01.795193 2758 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 17:52:01.938730 kubelet[2758]: I0912 17:52:01.938148 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" podStartSLOduration=0.938120713 podStartE2EDuration="938.120713ms" podCreationTimestamp="2025-09-12 17:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:52:01.926910478 +0000 UTC m=+1.302675910" watchObservedRunningTime="2025-09-12 17:52:01.938120713 +0000 UTC m=+1.313886155" Sep 12 17:52:01.939496 kubelet[2758]: I0912 17:52:01.939213 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" podStartSLOduration=2.939191964 podStartE2EDuration="2.939191964s" podCreationTimestamp="2025-09-12 17:51:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:52:01.938762199 +0000 UTC m=+1.314527621" watchObservedRunningTime="2025-09-12 17:52:01.939191964 +0000 UTC m=+1.314957395" Sep 12 17:52:03.951196 kubelet[2758]: I0912 17:52:03.951119 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" podStartSLOduration=2.951092137 podStartE2EDuration="2.951092137s" podCreationTimestamp="2025-09-12 17:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:52:01.960519015 +0000 UTC m=+1.336284447" watchObservedRunningTime="2025-09-12 17:52:03.951092137 +0000 UTC m=+3.326857567" Sep 12 17:52:05.276596 kubelet[2758]: I0912 
17:52:05.276552 2758 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 17:52:05.277266 containerd[1529]: time="2025-09-12T17:52:05.277157150Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 17:52:05.277998 kubelet[2758]: I0912 17:52:05.277522 2758 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 17:52:05.984775 systemd[1]: Created slice kubepods-besteffort-pod4af8ac8e_5715_4217_b688_ee9266e22449.slice - libcontainer container kubepods-besteffort-pod4af8ac8e_5715_4217_b688_ee9266e22449.slice. Sep 12 17:52:06.030157 kubelet[2758]: I0912 17:52:06.030061 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/4af8ac8e-5715-4217-b688-ee9266e22449-kube-proxy\") pod \"kube-proxy-98csb\" (UID: \"4af8ac8e-5715-4217-b688-ee9266e22449\") " pod="kube-system/kube-proxy-98csb" Sep 12 17:52:06.030157 kubelet[2758]: I0912 17:52:06.030116 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4af8ac8e-5715-4217-b688-ee9266e22449-xtables-lock\") pod \"kube-proxy-98csb\" (UID: \"4af8ac8e-5715-4217-b688-ee9266e22449\") " pod="kube-system/kube-proxy-98csb" Sep 12 17:52:06.030157 kubelet[2758]: I0912 17:52:06.030155 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4af8ac8e-5715-4217-b688-ee9266e22449-lib-modules\") pod \"kube-proxy-98csb\" (UID: \"4af8ac8e-5715-4217-b688-ee9266e22449\") " pod="kube-system/kube-proxy-98csb" Sep 12 17:52:06.030462 kubelet[2758]: I0912 17:52:06.030185 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2tx8\" (UniqueName: \"kubernetes.io/projected/4af8ac8e-5715-4217-b688-ee9266e22449-kube-api-access-p2tx8\") pod \"kube-proxy-98csb\" (UID: \"4af8ac8e-5715-4217-b688-ee9266e22449\") " pod="kube-system/kube-proxy-98csb" Sep 12 17:52:06.296394 containerd[1529]: time="2025-09-12T17:52:06.296241585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-98csb,Uid:4af8ac8e-5715-4217-b688-ee9266e22449,Namespace:kube-system,Attempt:0,}" Sep 12 17:52:06.332097 kubelet[2758]: I0912 17:52:06.331990 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/daf27ccb-7421-43c7-bce8-0410d07f63a4-var-lib-calico\") pod \"tigera-operator-58fc44c59b-dmb6g\" (UID: \"daf27ccb-7421-43c7-bce8-0410d07f63a4\") " pod="tigera-operator/tigera-operator-58fc44c59b-dmb6g" Sep 12 17:52:06.334258 kubelet[2758]: I0912 17:52:06.334077 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bp7m\" (UniqueName: \"kubernetes.io/projected/daf27ccb-7421-43c7-bce8-0410d07f63a4-kube-api-access-9bp7m\") pod \"tigera-operator-58fc44c59b-dmb6g\" (UID: \"daf27ccb-7421-43c7-bce8-0410d07f63a4\") " pod="tigera-operator/tigera-operator-58fc44c59b-dmb6g" Sep 12 17:52:06.345007 containerd[1529]: time="2025-09-12T17:52:06.343828508Z" level=info msg="connecting to shim 6df47066456d1bd89c9e7c3c08e6288f9323654202c3c58524c0c6301c30bd8d" 
address="unix:///run/containerd/s/deaa0ad0e50fcb2eebfbcaa17b745ea983fdc90876084e42ba93b5310fa8fe54" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:06.346980 systemd[1]: Created slice kubepods-besteffort-poddaf27ccb_7421_43c7_bce8_0410d07f63a4.slice - libcontainer container kubepods-besteffort-poddaf27ccb_7421_43c7_bce8_0410d07f63a4.slice. Sep 12 17:52:06.393068 systemd[1]: Started cri-containerd-6df47066456d1bd89c9e7c3c08e6288f9323654202c3c58524c0c6301c30bd8d.scope - libcontainer container 6df47066456d1bd89c9e7c3c08e6288f9323654202c3c58524c0c6301c30bd8d. Sep 12 17:52:06.430326 containerd[1529]: time="2025-09-12T17:52:06.430275647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-98csb,Uid:4af8ac8e-5715-4217-b688-ee9266e22449,Namespace:kube-system,Attempt:0,} returns sandbox id \"6df47066456d1bd89c9e7c3c08e6288f9323654202c3c58524c0c6301c30bd8d\"" Sep 12 17:52:06.436371 containerd[1529]: time="2025-09-12T17:52:06.436321345Z" level=info msg="CreateContainer within sandbox \"6df47066456d1bd89c9e7c3c08e6288f9323654202c3c58524c0c6301c30bd8d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 17:52:06.456087 containerd[1529]: time="2025-09-12T17:52:06.456024955Z" level=info msg="Container 1d3aada14d7ce127a205e5f08f42fce2df3eac720887779ebad3c158e451545c: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:06.466594 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1609629933.mount: Deactivated successfully. Sep 12 17:52:06.478164 containerd[1529]: time="2025-09-12T17:52:06.477339836Z" level=info msg="CreateContainer within sandbox \"6df47066456d1bd89c9e7c3c08e6288f9323654202c3c58524c0c6301c30bd8d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1d3aada14d7ce127a205e5f08f42fce2df3eac720887779ebad3c158e451545c\"" Sep 12 17:52:06.479945 containerd[1529]: time="2025-09-12T17:52:06.479900945Z" level=info msg="StartContainer for \"1d3aada14d7ce127a205e5f08f42fce2df3eac720887779ebad3c158e451545c\"" Sep 12 17:52:06.485274 containerd[1529]: time="2025-09-12T17:52:06.485208432Z" level=info msg="connecting to shim 1d3aada14d7ce127a205e5f08f42fce2df3eac720887779ebad3c158e451545c" address="unix:///run/containerd/s/deaa0ad0e50fcb2eebfbcaa17b745ea983fdc90876084e42ba93b5310fa8fe54" protocol=ttrpc version=3 Sep 12 17:52:06.512141 systemd[1]: Started cri-containerd-1d3aada14d7ce127a205e5f08f42fce2df3eac720887779ebad3c158e451545c.scope - libcontainer container 1d3aada14d7ce127a205e5f08f42fce2df3eac720887779ebad3c158e451545c. Sep 12 17:52:06.574108 containerd[1529]: time="2025-09-12T17:52:06.574007782Z" level=info msg="StartContainer for \"1d3aada14d7ce127a205e5f08f42fce2df3eac720887779ebad3c158e451545c\" returns successfully" Sep 12 17:52:06.655925 containerd[1529]: time="2025-09-12T17:52:06.655472100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-dmb6g,Uid:daf27ccb-7421-43c7-bce8-0410d07f63a4,Namespace:tigera-operator,Attempt:0,}" Sep 12 17:52:06.683843 containerd[1529]: time="2025-09-12T17:52:06.683779903Z" level=info msg="connecting to shim aedeb7a0665a461a8b9ddb4a2496d92ed157feaf250f3b797ef425bb46b595a9" address="unix:///run/containerd/s/f993c98ae982e85845a81315c857534658d54f35942eb99321fe81e88de958d6" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:06.734586 systemd[1]: Started cri-containerd-aedeb7a0665a461a8b9ddb4a2496d92ed157feaf250f3b797ef425bb46b595a9.scope - libcontainer container aedeb7a0665a461a8b9ddb4a2496d92ed157feaf250f3b797ef425bb46b595a9. 
Sep 12 17:52:06.825492 containerd[1529]: time="2025-09-12T17:52:06.825339622Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-dmb6g,Uid:daf27ccb-7421-43c7-bce8-0410d07f63a4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"aedeb7a0665a461a8b9ddb4a2496d92ed157feaf250f3b797ef425bb46b595a9\"" Sep 12 17:52:06.830959 containerd[1529]: time="2025-09-12T17:52:06.830846923Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 17:52:06.968259 kubelet[2758]: I0912 17:52:06.967973 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-98csb" podStartSLOduration=1.967946481 podStartE2EDuration="1.967946481s" podCreationTimestamp="2025-09-12 17:52:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:52:06.929213447 +0000 UTC m=+6.304978878" watchObservedRunningTime="2025-09-12 17:52:06.967946481 +0000 UTC m=+6.343711911" Sep 12 17:52:07.643215 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3166375128.mount: Deactivated successfully. Sep 12 17:52:08.716912 containerd[1529]: time="2025-09-12T17:52:08.716835883Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:08.718152 containerd[1529]: time="2025-09-12T17:52:08.718110338Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 17:52:08.719940 containerd[1529]: time="2025-09-12T17:52:08.719425133Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:08.726883 containerd[1529]: time="2025-09-12T17:52:08.726418098Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:08.729877 containerd[1529]: time="2025-09-12T17:52:08.729823581Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 1.898812721s" Sep 12 17:52:08.730025 containerd[1529]: time="2025-09-12T17:52:08.729884700Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 17:52:08.733908 containerd[1529]: time="2025-09-12T17:52:08.732899310Z" level=info msg="CreateContainer within sandbox \"aedeb7a0665a461a8b9ddb4a2496d92ed157feaf250f3b797ef425bb46b595a9\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 17:52:08.741335 containerd[1529]: time="2025-09-12T17:52:08.741292193Z" level=info msg="Container 77e57903e1872c7be99aeacdd18bcf7ec402ab32311c7ef6c6f4c8d8af15a699: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:08.758841 containerd[1529]: time="2025-09-12T17:52:08.758787672Z" level=info msg="CreateContainer within sandbox \"aedeb7a0665a461a8b9ddb4a2496d92ed157feaf250f3b797ef425bb46b595a9\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id 
\"77e57903e1872c7be99aeacdd18bcf7ec402ab32311c7ef6c6f4c8d8af15a699\"" Sep 12 17:52:08.759621 containerd[1529]: time="2025-09-12T17:52:08.759566143Z" level=info msg="StartContainer for \"77e57903e1872c7be99aeacdd18bcf7ec402ab32311c7ef6c6f4c8d8af15a699\"" Sep 12 17:52:08.762743 containerd[1529]: time="2025-09-12T17:52:08.762604685Z" level=info msg="connecting to shim 77e57903e1872c7be99aeacdd18bcf7ec402ab32311c7ef6c6f4c8d8af15a699" address="unix:///run/containerd/s/f993c98ae982e85845a81315c857534658d54f35942eb99321fe81e88de958d6" protocol=ttrpc version=3 Sep 12 17:52:08.797106 systemd[1]: Started cri-containerd-77e57903e1872c7be99aeacdd18bcf7ec402ab32311c7ef6c6f4c8d8af15a699.scope - libcontainer container 77e57903e1872c7be99aeacdd18bcf7ec402ab32311c7ef6c6f4c8d8af15a699. Sep 12 17:52:08.843062 containerd[1529]: time="2025-09-12T17:52:08.842929644Z" level=info msg="StartContainer for \"77e57903e1872c7be99aeacdd18bcf7ec402ab32311c7ef6c6f4c8d8af15a699\" returns successfully" Sep 12 17:52:08.934425 kubelet[2758]: I0912 17:52:08.934351 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-dmb6g" podStartSLOduration=1.033816138 podStartE2EDuration="2.934326187s" podCreationTimestamp="2025-09-12 17:52:06 +0000 UTC" firstStartedPulling="2025-09-12 17:52:06.830319227 +0000 UTC m=+6.206084649" lastFinishedPulling="2025-09-12 17:52:08.73082929 +0000 UTC m=+8.106594698" observedRunningTime="2025-09-12 17:52:08.9334305 +0000 UTC m=+8.309195932" watchObservedRunningTime="2025-09-12 17:52:08.934326187 +0000 UTC m=+8.310091618" Sep 12 17:52:10.241762 update_engine[1490]: I20250912 17:52:10.240911 1490 update_attempter.cc:509] Updating boot flags... Sep 12 17:52:16.652057 sudo[1826]: pam_unix(sudo:session): session closed for user root Sep 12 17:52:16.711037 sshd[1825]: Connection closed by 139.178.68.195 port 47648 Sep 12 17:52:16.712367 sshd-session[1822]: pam_unix(sshd:session): session closed for user core Sep 12 17:52:16.724238 systemd[1]: sshd@6-10.128.0.19:22-139.178.68.195:47648.service: Deactivated successfully. Sep 12 17:52:16.724488 systemd-logind[1487]: Session 7 logged out. Waiting for processes to exit. Sep 12 17:52:16.731444 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 17:52:16.732015 systemd[1]: session-7.scope: Consumed 6.492s CPU time, 226M memory peak. Sep 12 17:52:16.738073 systemd-logind[1487]: Removed session 7. Sep 12 17:52:24.346677 systemd[1]: Created slice kubepods-besteffort-podf28f0c23_ae73_414a_bb93_3e274e5c4a95.slice - libcontainer container kubepods-besteffort-podf28f0c23_ae73_414a_bb93_3e274e5c4a95.slice. 
Sep 12 17:52:24.356543 kubelet[2758]: I0912 17:52:24.356358 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f28f0c23-ae73-414a-bb93-3e274e5c4a95-tigera-ca-bundle\") pod \"calico-typha-757df4fd6d-57rd9\" (UID: \"f28f0c23-ae73-414a-bb93-3e274e5c4a95\") " pod="calico-system/calico-typha-757df4fd6d-57rd9" Sep 12 17:52:24.356543 kubelet[2758]: I0912 17:52:24.356416 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f28f0c23-ae73-414a-bb93-3e274e5c4a95-typha-certs\") pod \"calico-typha-757df4fd6d-57rd9\" (UID: \"f28f0c23-ae73-414a-bb93-3e274e5c4a95\") " pod="calico-system/calico-typha-757df4fd6d-57rd9" Sep 12 17:52:24.356543 kubelet[2758]: I0912 17:52:24.356447 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v994w\" (UniqueName: \"kubernetes.io/projected/f28f0c23-ae73-414a-bb93-3e274e5c4a95-kube-api-access-v994w\") pod \"calico-typha-757df4fd6d-57rd9\" (UID: \"f28f0c23-ae73-414a-bb93-3e274e5c4a95\") " pod="calico-system/calico-typha-757df4fd6d-57rd9" Sep 12 17:52:24.607580 systemd[1]: Created slice kubepods-besteffort-pod73a36f19_599c_4b1f_b50e_a523ba155116.slice - libcontainer container kubepods-besteffort-pod73a36f19_599c_4b1f_b50e_a523ba155116.slice. Sep 12 17:52:24.656081 containerd[1529]: time="2025-09-12T17:52:24.656024766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-757df4fd6d-57rd9,Uid:f28f0c23-ae73-414a-bb93-3e274e5c4a95,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:24.715798 containerd[1529]: time="2025-09-12T17:52:24.715731906Z" level=info msg="connecting to shim a8561a57bcee4d286637cb0c6115ce0e616f472ad25edd7eb8f3ffd686d2dd5a" address="unix:///run/containerd/s/bce96f7627ace6afe89a08307735957bfc05a3c2e57348a2b4519967fbf0d194" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:24.760362 kubelet[2758]: I0912 17:52:24.760301 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-var-lib-calico\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760362 kubelet[2758]: I0912 17:52:24.760367 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-cni-log-dir\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760618 kubelet[2758]: I0912 17:52:24.760394 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-var-run-calico\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760618 kubelet[2758]: I0912 17:52:24.760421 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/73a36f19-599c-4b1f-b50e-a523ba155116-node-certs\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 
17:52:24.760618 kubelet[2758]: I0912 17:52:24.760447 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-cni-bin-dir\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760618 kubelet[2758]: I0912 17:52:24.760471 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-lib-modules\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760618 kubelet[2758]: I0912 17:52:24.760497 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-policysync\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760899 kubelet[2758]: I0912 17:52:24.760531 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-xtables-lock\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760899 kubelet[2758]: I0912 17:52:24.760555 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc28g\" (UniqueName: \"kubernetes.io/projected/73a36f19-599c-4b1f-b50e-a523ba155116-kube-api-access-gc28g\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760899 kubelet[2758]: I0912 17:52:24.760590 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-cni-net-dir\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760899 kubelet[2758]: I0912 17:52:24.760618 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/73a36f19-599c-4b1f-b50e-a523ba155116-flexvol-driver-host\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.760899 kubelet[2758]: I0912 17:52:24.760644 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73a36f19-599c-4b1f-b50e-a523ba155116-tigera-ca-bundle\") pod \"calico-node-6tp4v\" (UID: \"73a36f19-599c-4b1f-b50e-a523ba155116\") " pod="calico-system/calico-node-6tp4v" Sep 12 17:52:24.800158 systemd[1]: Started cri-containerd-a8561a57bcee4d286637cb0c6115ce0e616f472ad25edd7eb8f3ffd686d2dd5a.scope - libcontainer container a8561a57bcee4d286637cb0c6115ce0e616f472ad25edd7eb8f3ffd686d2dd5a. 
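
Note: the flexvol-driver-host mount above points calico-node at the kubelet's FlexVolume plugin directory, and the driver-call failures that follow ("Failed to unmarshal output for command: init", "executable file not found in $PATH") are the kubelet probing /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds before anything has installed a driver there. The probe executes the driver with the single argument init and expects a JSON status on stdout, which is why an empty output fails to unmarshal. A minimal stand-in driver is sketched below; it is not Calico's actual uds binary, and the capabilities field is simplified.

// flexvolume_init_stub.go — hedged sketch of what kubelet's FlexVolume "init" probe expects.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	out := json.NewEncoder(os.Stdout)
	if len(os.Args) < 2 {
		out.Encode(driverStatus{Status: "Failure", Message: "no command given"})
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// Returning valid JSON here is exactly what the empty output in the log above fails to provide.
		out.Encode(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
	default:
		out.Encode(driverStatus{
			Status:  "Not supported",
			Message: fmt.Sprintf("command %q not implemented", os.Args[1]),
		})
	}
}
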
Sep 12 17:52:24.862841 kubelet[2758]: E0912 17:52:24.862688 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.862841 kubelet[2758]: W0912 17:52:24.862722 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.863171 kubelet[2758]: E0912 17:52:24.862925 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.865071 kubelet[2758]: E0912 17:52:24.865000 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.865071 kubelet[2758]: W0912 17:52:24.865038 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.865943 kubelet[2758]: E0912 17:52:24.865580 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.866098 kubelet[2758]: E0912 17:52:24.865965 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.866098 kubelet[2758]: W0912 17:52:24.865983 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.866098 kubelet[2758]: E0912 17:52:24.866029 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.866374 kubelet[2758]: E0912 17:52:24.866356 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.866374 kubelet[2758]: W0912 17:52:24.866374 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.867079 kubelet[2758]: E0912 17:52:24.866970 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.867313 kubelet[2758]: E0912 17:52:24.867284 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.867313 kubelet[2758]: W0912 17:52:24.867307 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.867597 kubelet[2758]: E0912 17:52:24.867570 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.867710 kubelet[2758]: E0912 17:52:24.867687 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.867710 kubelet[2758]: W0912 17:52:24.867707 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.868606 kubelet[2758]: E0912 17:52:24.867803 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.870184 kubelet[2758]: E0912 17:52:24.870159 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.870314 kubelet[2758]: W0912 17:52:24.870187 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.870314 kubelet[2758]: E0912 17:52:24.870296 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.870585 kubelet[2758]: E0912 17:52:24.870563 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.870585 kubelet[2758]: W0912 17:52:24.870585 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.870848 kubelet[2758]: E0912 17:52:24.870805 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.871949 kubelet[2758]: E0912 17:52:24.870942 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.871949 kubelet[2758]: W0912 17:52:24.870956 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.871949 kubelet[2758]: E0912 17:52:24.871027 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.871949 kubelet[2758]: E0912 17:52:24.871259 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.871949 kubelet[2758]: W0912 17:52:24.871272 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.871949 kubelet[2758]: E0912 17:52:24.871361 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.871949 kubelet[2758]: E0912 17:52:24.871584 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.871949 kubelet[2758]: W0912 17:52:24.871597 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.871949 kubelet[2758]: E0912 17:52:24.871802 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.871949 kubelet[2758]: E0912 17:52:24.871924 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.872679 kubelet[2758]: W0912 17:52:24.871937 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.872679 kubelet[2758]: E0912 17:52:24.872035 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.872679 kubelet[2758]: E0912 17:52:24.872273 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.872679 kubelet[2758]: W0912 17:52:24.872286 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.872679 kubelet[2758]: E0912 17:52:24.872486 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.872679 kubelet[2758]: E0912 17:52:24.872579 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.872679 kubelet[2758]: W0912 17:52:24.872590 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.872679 kubelet[2758]: E0912 17:52:24.872679 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.873340 kubelet[2758]: E0912 17:52:24.872928 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.873340 kubelet[2758]: W0912 17:52:24.872942 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.873340 kubelet[2758]: E0912 17:52:24.873031 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.873340 kubelet[2758]: E0912 17:52:24.873316 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.873340 kubelet[2758]: W0912 17:52:24.873330 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.873763 kubelet[2758]: E0912 17:52:24.873423 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.873763 kubelet[2758]: E0912 17:52:24.873666 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.873763 kubelet[2758]: W0912 17:52:24.873679 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.875592 kubelet[2758]: E0912 17:52:24.875568 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.875592 kubelet[2758]: W0912 17:52:24.875591 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.876023 kubelet[2758]: E0912 17:52:24.875997 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.876757 kubelet[2758]: E0912 17:52:24.876734 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.876757 kubelet[2758]: W0912 17:52:24.876757 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.877072 kubelet[2758]: E0912 17:52:24.877051 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.877072 kubelet[2758]: W0912 17:52:24.877071 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.877364 kubelet[2758]: E0912 17:52:24.877331 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.877364 kubelet[2758]: W0912 17:52:24.877354 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.878755 kubelet[2758]: E0912 17:52:24.878687 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.878755 kubelet[2758]: W0912 17:52:24.878730 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.879440 kubelet[2758]: E0912 17:52:24.879407 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.879440 kubelet[2758]: E0912 17:52:24.879441 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.879596 kubelet[2758]: E0912 17:52:24.879455 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.879596 kubelet[2758]: E0912 17:52:24.879467 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.879596 kubelet[2758]: E0912 17:52:24.879479 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.879596 kubelet[2758]: E0912 17:52:24.879557 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.879596 kubelet[2758]: W0912 17:52:24.879570 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.880005 kubelet[2758]: E0912 17:52:24.879825 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.880005 kubelet[2758]: W0912 17:52:24.879837 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.881964 kubelet[2758]: E0912 17:52:24.881930 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.881964 kubelet[2758]: W0912 17:52:24.881956 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.883332 kubelet[2758]: E0912 17:52:24.883299 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.883332 kubelet[2758]: W0912 17:52:24.883321 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.884021 kubelet[2758]: E0912 17:52:24.883931 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.884021 kubelet[2758]: E0912 17:52:24.883974 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.884021 kubelet[2758]: E0912 17:52:24.883991 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.884021 kubelet[2758]: E0912 17:52:24.884021 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.884278 kubelet[2758]: E0912 17:52:24.884126 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.884278 kubelet[2758]: W0912 17:52:24.884138 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.885099 kubelet[2758]: E0912 17:52:24.884442 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.885231 kubelet[2758]: E0912 17:52:24.885206 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.885231 kubelet[2758]: W0912 17:52:24.885231 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.885377 kubelet[2758]: E0912 17:52:24.885328 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.885883 kubelet[2758]: E0912 17:52:24.885828 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.885883 kubelet[2758]: W0912 17:52:24.885847 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.887182 kubelet[2758]: E0912 17:52:24.887149 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.887325 kubelet[2758]: E0912 17:52:24.887304 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.887407 kubelet[2758]: W0912 17:52:24.887325 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.887474 kubelet[2758]: E0912 17:52:24.887405 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.887691 kubelet[2758]: E0912 17:52:24.887656 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.887691 kubelet[2758]: W0912 17:52:24.887679 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.887816 kubelet[2758]: E0912 17:52:24.887773 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.888119 kubelet[2758]: E0912 17:52:24.888093 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.888119 kubelet[2758]: W0912 17:52:24.888117 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.888268 kubelet[2758]: E0912 17:52:24.888221 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.888516 kubelet[2758]: E0912 17:52:24.888496 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.888516 kubelet[2758]: W0912 17:52:24.888514 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.888808 kubelet[2758]: E0912 17:52:24.888781 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.890073 kubelet[2758]: E0912 17:52:24.890020 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.890073 kubelet[2758]: W0912 17:52:24.890042 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.891476 kubelet[2758]: E0912 17:52:24.890468 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.891476 kubelet[2758]: W0912 17:52:24.890488 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.892069 kubelet[2758]: E0912 17:52:24.891627 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.893279 kubelet[2758]: E0912 17:52:24.893065 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.893279 kubelet[2758]: W0912 17:52:24.893091 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.893441 kubelet[2758]: E0912 17:52:24.893408 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.893441 kubelet[2758]: W0912 17:52:24.893423 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.893882 kubelet[2758]: E0912 17:52:24.893825 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.895881 kubelet[2758]: W0912 17:52:24.895011 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.895881 kubelet[2758]: E0912 17:52:24.895774 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.895881 kubelet[2758]: W0912 17:52:24.895789 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.898414 kubelet[2758]: E0912 17:52:24.897001 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.898414 kubelet[2758]: E0912 17:52:24.897181 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.898414 kubelet[2758]: E0912 17:52:24.897261 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.898414 kubelet[2758]: E0912 17:52:24.897286 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.898414 kubelet[2758]: E0912 17:52:24.897303 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.898414 kubelet[2758]: E0912 17:52:24.897388 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.898414 kubelet[2758]: W0912 17:52:24.897424 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.899047 kubelet[2758]: E0912 17:52:24.898908 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.899047 kubelet[2758]: W0912 17:52:24.898933 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.899394 kubelet[2758]: E0912 17:52:24.899371 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.899493 kubelet[2758]: W0912 17:52:24.899417 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.899875 kubelet[2758]: E0912 17:52:24.899794 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.899875 kubelet[2758]: W0912 17:52:24.899837 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.902959 kubelet[2758]: E0912 17:52:24.902932 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.902959 kubelet[2758]: W0912 17:52:24.902958 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.903123 kubelet[2758]: E0912 17:52:24.903005 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.903123 kubelet[2758]: E0912 17:52:24.903105 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.903755 kubelet[2758]: E0912 17:52:24.903524 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.903755 kubelet[2758]: E0912 17:52:24.903560 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.906845 kubelet[2758]: E0912 17:52:24.906817 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:52:24.907725 kubelet[2758]: E0912 17:52:24.907594 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.908925 kubelet[2758]: W0912 17:52:24.908901 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.909119 kubelet[2758]: E0912 17:52:24.908991 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.909904 kubelet[2758]: E0912 17:52:24.909795 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.911298 kubelet[2758]: W0912 17:52:24.911164 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.911298 kubelet[2758]: E0912 17:52:24.911191 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.943982 kubelet[2758]: E0912 17:52:24.943941 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:24.943982 kubelet[2758]: W0912 17:52:24.943978 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:24.944217 kubelet[2758]: E0912 17:52:24.944007 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:52:24.976873 kubelet[2758]: E0912 17:52:24.976775 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gxnn6" podUID="4338b83d-ef4c-4813-8a51-871051a0ada9" Sep 12 17:52:25.070338 kubelet[2758]: E0912 17:52:25.068993 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:52:25.070338 kubelet[2758]: W0912 17:52:25.070203 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:52:25.070338 kubelet[2758]: E0912 17:52:25.070246 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 12 17:52:25.071390 kubelet[2758]: E0912 17:52:25.071361 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:52:25.071590 kubelet[2758]: W0912 17:52:25.071565 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:52:25.071707 kubelet[2758]: E0912 17:52:25.071688 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:52:25.097887 containerd[1529]: time="2025-09-12T17:52:25.097815421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-757df4fd6d-57rd9,Uid:f28f0c23-ae73-414a-bb93-3e274e5c4a95,Namespace:calico-system,Attempt:0,} returns sandbox id \"a8561a57bcee4d286637cb0c6115ce0e616f472ad25edd7eb8f3ffd686d2dd5a\""
Sep 12 17:52:25.101795 containerd[1529]: time="2025-09-12T17:52:25.101747072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 12 17:52:25.166621 kubelet[2758]: I0912 17:52:25.164552 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4338b83d-ef4c-4813-8a51-871051a0ada9-varrun\") pod \"csi-node-driver-gxnn6\" (UID: \"4338b83d-ef4c-4813-8a51-871051a0ada9\") " pod="calico-system/csi-node-driver-gxnn6"
Sep 12 17:52:25.167002 kubelet[2758]: I0912 17:52:25.166734 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfm4j\" (UniqueName: \"kubernetes.io/projected/4338b83d-ef4c-4813-8a51-871051a0ada9-kube-api-access-cfm4j\") pod \"csi-node-driver-gxnn6\" (UID: \"4338b83d-ef4c-4813-8a51-871051a0ada9\") " pod="calico-system/csi-node-driver-gxnn6"
Sep 12 17:52:25.168246 kubelet[2758]: I0912 17:52:25.167967 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4338b83d-ef4c-4813-8a51-871051a0ada9-kubelet-dir\") pod \"csi-node-driver-gxnn6\" (UID: \"4338b83d-ef4c-4813-8a51-871051a0ada9\") " pod="calico-system/csi-node-driver-gxnn6"
Sep 12 17:52:25.169475 kubelet[2758]: I0912 17:52:25.169352 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4338b83d-ef4c-4813-8a51-871051a0ada9-registration-dir\") pod \"csi-node-driver-gxnn6\" (UID: \"4338b83d-ef4c-4813-8a51-871051a0ada9\") " pod="calico-system/csi-node-driver-gxnn6"
Sep 12 17:52:25.183627 kubelet[2758]: I0912 17:52:25.183598 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4338b83d-ef4c-4813-8a51-871051a0ada9-socket-dir\") pod \"csi-node-driver-gxnn6\" (UID: \"4338b83d-ef4c-4813-8a51-871051a0ada9\") " pod="calico-system/csi-node-driver-gxnn6"
Sep 12 17:52:25.219230 containerd[1529]: time="2025-09-12T17:52:25.219154836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6tp4v,Uid:73a36f19-599c-4b1f-b50e-a523ba155116,Namespace:calico-system,Attempt:0,}"
Sep 12 17:52:25.248400 containerd[1529]: time="2025-09-12T17:52:25.248327677Z" level=info msg="connecting to shim 8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0" address="unix:///run/containerd/s/c29c450e8534bb83d89cf6b73a2851abec3b58b21cd0862fa8a0d0d9691c7786" namespace=k8s.io protocol=ttrpc version=3
Sep 12 17:52:25.300687 systemd[1]: Started cri-containerd-8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0.scope - libcontainer container 8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0.
Sep 12 17:52:25.378008 containerd[1529]: time="2025-09-12T17:52:25.377932647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6tp4v,Uid:73a36f19-599c-4b1f-b50e-a523ba155116,Namespace:calico-system,Attempt:0,} returns sandbox id \"8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0\""
Sep 12 17:52:26.276958 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3872622604.mount: Deactivated successfully.
Sep 12 17:52:26.825439 kubelet[2758]: E0912 17:52:26.825377 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gxnn6" podUID="4338b83d-ef4c-4813-8a51-871051a0ada9"
Sep 12 17:52:27.382787 containerd[1529]: time="2025-09-12T17:52:27.382716876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:52:27.384360 containerd[1529]: time="2025-09-12T17:52:27.384138142Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 12 17:52:27.385617 containerd[1529]: time="2025-09-12T17:52:27.385558149Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:52:27.388597 containerd[1529]: time="2025-09-12T17:52:27.388547269Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 17:52:27.389549 containerd[1529]: time="2025-09-12T17:52:27.389508285Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.287705254s"
Sep 12 17:52:27.389710 containerd[1529]: time="2025-09-12T17:52:27.389683937Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 12 17:52:27.391582 containerd[1529]: time="2025-09-12T17:52:27.391531194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 12 17:52:27.419780 containerd[1529]: time="2025-09-12T17:52:27.419620114Z" level=info msg="CreateContainer within sandbox \"a8561a57bcee4d286637cb0c6115ce0e616f472ad25edd7eb8f3ffd686d2dd5a\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 12 17:52:27.437367 containerd[1529]: time="2025-09-12T17:52:27.437313836Z" level=info msg="Container 19e67489ccec6bf178e0a03085bcd79945b38e6a693bd85e43fa2bee185b30c4: CDI devices from CRI Config.CDIDevices: []"
Sep 12 17:52:27.447802 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3069674909.mount: Deactivated successfully.
Sep 12 17:52:27.456325 containerd[1529]: time="2025-09-12T17:52:27.455931047Z" level=info msg="CreateContainer within sandbox \"a8561a57bcee4d286637cb0c6115ce0e616f472ad25edd7eb8f3ffd686d2dd5a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"19e67489ccec6bf178e0a03085bcd79945b38e6a693bd85e43fa2bee185b30c4\""
Sep 12 17:52:27.458869 containerd[1529]: time="2025-09-12T17:52:27.458801774Z" level=info msg="StartContainer for \"19e67489ccec6bf178e0a03085bcd79945b38e6a693bd85e43fa2bee185b30c4\""
Sep 12 17:52:27.465897 containerd[1529]: time="2025-09-12T17:52:27.465044343Z" level=info msg="connecting to shim 19e67489ccec6bf178e0a03085bcd79945b38e6a693bd85e43fa2bee185b30c4" address="unix:///run/containerd/s/bce96f7627ace6afe89a08307735957bfc05a3c2e57348a2b4519967fbf0d194" protocol=ttrpc version=3
Sep 12 17:52:27.502122 systemd[1]: Started cri-containerd-19e67489ccec6bf178e0a03085bcd79945b38e6a693bd85e43fa2bee185b30c4.scope - libcontainer container 19e67489ccec6bf178e0a03085bcd79945b38e6a693bd85e43fa2bee185b30c4.
Sep 12 17:52:27.579899 containerd[1529]: time="2025-09-12T17:52:27.579783583Z" level=info msg="StartContainer for \"19e67489ccec6bf178e0a03085bcd79945b38e6a693bd85e43fa2bee185b30c4\" returns successfully"
Sep 12 17:52:28.032395 kubelet[2758]: E0912 17:52:28.032086 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:52:28.032395 kubelet[2758]: W0912 17:52:28.032143 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:52:28.032395 kubelet[2758]: E0912 17:52:28.032188 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 12 17:52:28.151300 kubelet[2758]: E0912 17:52:28.151267 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 17:52:28.151300 kubelet[2758]: W0912 17:52:28.151283 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 17:52:28.151399 kubelet[2758]: E0912 17:52:28.151301 2758 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 12 17:52:28.548580 containerd[1529]: time="2025-09-12T17:52:28.548514514Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:28.549837 containerd[1529]: time="2025-09-12T17:52:28.549784950Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 17:52:28.551050 containerd[1529]: time="2025-09-12T17:52:28.550982987Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:28.554030 containerd[1529]: time="2025-09-12T17:52:28.553989004Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:28.554902 containerd[1529]: time="2025-09-12T17:52:28.554831703Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.163254377s" Sep 12 17:52:28.555367 containerd[1529]: time="2025-09-12T17:52:28.555041711Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 17:52:28.559123 containerd[1529]: time="2025-09-12T17:52:28.559069936Z" level=info msg="CreateContainer within sandbox \"8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:52:28.573896 containerd[1529]: time="2025-09-12T17:52:28.570133806Z" level=info msg="Container f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:28.586016 containerd[1529]: time="2025-09-12T17:52:28.585959584Z" level=info msg="CreateContainer within sandbox \"8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89\"" Sep 12 17:52:28.586945 containerd[1529]: time="2025-09-12T17:52:28.586896554Z" level=info msg="StartContainer for \"f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89\"" Sep 12 17:52:28.589390 containerd[1529]: time="2025-09-12T17:52:28.589339730Z" level=info msg="connecting to shim f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89" address="unix:///run/containerd/s/c29c450e8534bb83d89cf6b73a2851abec3b58b21cd0862fa8a0d0d9691c7786" protocol=ttrpc version=3 Sep 12 17:52:28.622171 systemd[1]: Started cri-containerd-f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89.scope - libcontainer container f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89. 
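The repeated driver-call.go and plugins.go errors above come from the kubelet probing the FlexVolume directory nodeagent~uds before any driver binary exists there: the exec of .../uds fails, the captured output is empty, and decoding an empty byte slice as JSON produces exactly "unexpected end of JSON input". Below is a minimal Go sketch of that failure mode; it is an illustration using the path from the log, not the kubelet's actual driver-call code.

// Minimal sketch (not kubelet code): call a FlexVolume driver's "init" command
// and decode its JSON reply. When no binary exists at the probed path, the exec
// fails, the output is empty, and json.Unmarshal on an empty slice returns
// "unexpected end of JSON input", the error repeated in the entries above.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the general shape of a FlexVolume reply
// ({"status":"Success", ...}); only the fields needed here are listed.
type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

func main() {
	// Path taken from the log; on this node no driver has been installed there yet.
	out, err := exec.Command(
		"/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds",
		"init",
	).CombinedOutput()
	if err != nil {
		fmt.Println("driver call failed:", err)
	}
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		fmt.Println("failed to unmarshal output:", err) // unexpected end of JSON input
		return
	}
	fmt.Println("driver status:", st.Status)
}

As the log entries themselves say, the kubelet only skips the directory and re-probes later, so these messages are noisy rather than fatal.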
Sep 12 17:52:28.695533 containerd[1529]: time="2025-09-12T17:52:28.694382127Z" level=info msg="StartContainer for \"f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89\" returns successfully" Sep 12 17:52:28.710965 systemd[1]: cri-containerd-f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89.scope: Deactivated successfully. Sep 12 17:52:28.716158 containerd[1529]: time="2025-09-12T17:52:28.715979059Z" level=info msg="received exit event container_id:\"f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89\" id:\"f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89\" pid:3491 exited_at:{seconds:1757699548 nanos:715402151}" Sep 12 17:52:28.716158 containerd[1529]: time="2025-09-12T17:52:28.715995300Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89\" id:\"f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89\" pid:3491 exited_at:{seconds:1757699548 nanos:715402151}" Sep 12 17:52:28.755788 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f2c32ed7f0cd7c8208135f47e2f675821d0b4c02f6b7d1c218c8414f42c2ac89-rootfs.mount: Deactivated successfully. Sep 12 17:52:28.825905 kubelet[2758]: E0912 17:52:28.825365 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gxnn6" podUID="4338b83d-ef4c-4813-8a51-871051a0ada9" Sep 12 17:52:29.005528 kubelet[2758]: I0912 17:52:29.005400 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:52:29.028363 kubelet[2758]: I0912 17:52:29.027924 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-757df4fd6d-57rd9" podStartSLOduration=2.738293403 podStartE2EDuration="5.027897389s" podCreationTimestamp="2025-09-12 17:52:24 +0000 UTC" firstStartedPulling="2025-09-12 17:52:25.101289014 +0000 UTC m=+24.477054435" lastFinishedPulling="2025-09-12 17:52:27.39089299 +0000 UTC m=+26.766658421" observedRunningTime="2025-09-12 17:52:28.033189158 +0000 UTC m=+27.408954605" watchObservedRunningTime="2025-09-12 17:52:29.027897389 +0000 UTC m=+28.403662820" Sep 12 17:52:30.012517 containerd[1529]: time="2025-09-12T17:52:30.012442966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:52:30.826343 kubelet[2758]: E0912 17:52:30.825978 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gxnn6" podUID="4338b83d-ef4c-4813-8a51-871051a0ada9" Sep 12 17:52:32.825423 kubelet[2758]: E0912 17:52:32.825366 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-gxnn6" podUID="4338b83d-ef4c-4813-8a51-871051a0ada9" Sep 12 17:52:33.358043 containerd[1529]: time="2025-09-12T17:52:33.357977373Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:33.359900 containerd[1529]: 
time="2025-09-12T17:52:33.359742457Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 17:52:33.363151 containerd[1529]: time="2025-09-12T17:52:33.363045391Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:33.366979 containerd[1529]: time="2025-09-12T17:52:33.366913117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:33.368092 containerd[1529]: time="2025-09-12T17:52:33.367897600Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.355401739s" Sep 12 17:52:33.368092 containerd[1529]: time="2025-09-12T17:52:33.367939388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 17:52:33.371702 containerd[1529]: time="2025-09-12T17:52:33.371527475Z" level=info msg="CreateContainer within sandbox \"8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:52:33.385603 containerd[1529]: time="2025-09-12T17:52:33.384039104Z" level=info msg="Container 53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:33.398995 containerd[1529]: time="2025-09-12T17:52:33.398919981Z" level=info msg="CreateContainer within sandbox \"8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f\"" Sep 12 17:52:33.400241 containerd[1529]: time="2025-09-12T17:52:33.400194088Z" level=info msg="StartContainer for \"53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f\"" Sep 12 17:52:33.405440 containerd[1529]: time="2025-09-12T17:52:33.405361529Z" level=info msg="connecting to shim 53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f" address="unix:///run/containerd/s/c29c450e8534bb83d89cf6b73a2851abec3b58b21cd0862fa8a0d0d9691c7786" protocol=ttrpc version=3 Sep 12 17:52:33.446147 systemd[1]: Started cri-containerd-53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f.scope - libcontainer container 53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f. 
Sep 12 17:52:33.505652 containerd[1529]: time="2025-09-12T17:52:33.505499163Z" level=info msg="StartContainer for \"53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f\" returns successfully" Sep 12 17:52:34.531391 containerd[1529]: time="2025-09-12T17:52:34.531316027Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:52:34.536234 systemd[1]: cri-containerd-53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f.scope: Deactivated successfully. Sep 12 17:52:34.537108 systemd[1]: cri-containerd-53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f.scope: Consumed 670ms CPU time, 190.5M memory peak, 171.3M written to disk. Sep 12 17:52:34.542042 containerd[1529]: time="2025-09-12T17:52:34.541993635Z" level=info msg="received exit event container_id:\"53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f\" id:\"53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f\" pid:3553 exited_at:{seconds:1757699554 nanos:541386287}" Sep 12 17:52:34.542460 containerd[1529]: time="2025-09-12T17:52:34.542309108Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f\" id:\"53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f\" pid:3553 exited_at:{seconds:1757699554 nanos:541386287}" Sep 12 17:52:34.577452 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-53f192f620b3a6d2cd98c745496a5c8719ba59e8e9abd5e79a56daac918cc97f-rootfs.mount: Deactivated successfully. Sep 12 17:52:34.592010 kubelet[2758]: I0912 17:52:34.591979 2758 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 17:52:34.650973 kubelet[2758]: W0912 17:52:34.649922 2758 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object Sep 12 17:52:34.651576 kubelet[2758]: E0912 17:52:34.651370 2758 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object" logger="UnhandledError" Sep 12 17:52:34.653313 kubelet[2758]: W0912 17:52:34.653031 2758 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object Sep 12 17:52:34.654198 kubelet[2758]: E0912 17:52:34.653846 2758 reflector.go:158] "Unhandled Error" 
err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object" logger="UnhandledError" Sep 12 17:52:34.680170 systemd[1]: Created slice kubepods-besteffort-podfe59d7fb_6f3b_47e8_b60f_2151a307efdf.slice - libcontainer container kubepods-besteffort-podfe59d7fb_6f3b_47e8_b60f_2151a307efdf.slice. Sep 12 17:52:34.711531 systemd[1]: Created slice kubepods-besteffort-pod93d6a848_48e1_44a6_bc47_99e26ab4c431.slice - libcontainer container kubepods-besteffort-pod93d6a848_48e1_44a6_bc47_99e26ab4c431.slice. Sep 12 17:52:34.730196 systemd[1]: Created slice kubepods-besteffort-podcf5b145f_168a_4257_97e2_19a035789165.slice - libcontainer container kubepods-besteffort-podcf5b145f_168a_4257_97e2_19a035789165.slice. Sep 12 17:52:34.747463 systemd[1]: Created slice kubepods-burstable-pod21d3d5c8_7c4c_4959_b54f_6c288eb5a5e1.slice - libcontainer container kubepods-burstable-pod21d3d5c8_7c4c_4959_b54f_6c288eb5a5e1.slice. Sep 12 17:52:34.814693 kubelet[2758]: I0912 17:52:34.799099 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0ef63deb-d966-4b37-8b2c-5c53ee82839d-config-volume\") pod \"coredns-7c65d6cfc9-khtc4\" (UID: \"0ef63deb-d966-4b37-8b2c-5c53ee82839d\") " pod="kube-system/coredns-7c65d6cfc9-khtc4" Sep 12 17:52:34.814693 kubelet[2758]: I0912 17:52:34.799218 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-ca-bundle\") pod \"whisker-8647c8f9cf-xt247\" (UID: \"93d6a848-48e1-44a6-bc47-99e26ab4c431\") " pod="calico-system/whisker-8647c8f9cf-xt247" Sep 12 17:52:34.814693 kubelet[2758]: I0912 17:52:34.799254 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152e3c21-778c-40cc-99db-2838fd26dcb1-config\") pod \"goldmane-7988f88666-b95h5\" (UID: \"152e3c21-778c-40cc-99db-2838fd26dcb1\") " pod="calico-system/goldmane-7988f88666-b95h5" Sep 12 17:52:34.814693 kubelet[2758]: I0912 17:52:34.799414 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cf5b145f-168a-4257-97e2-19a035789165-calico-apiserver-certs\") pod \"calico-apiserver-6b55c6c674-l4q6p\" (UID: \"cf5b145f-168a-4257-97e2-19a035789165\") " pod="calico-apiserver/calico-apiserver-6b55c6c674-l4q6p" Sep 12 17:52:34.814693 kubelet[2758]: I0912 17:52:34.799495 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/152e3c21-778c-40cc-99db-2838fd26dcb1-goldmane-key-pair\") pod \"goldmane-7988f88666-b95h5\" (UID: \"152e3c21-778c-40cc-99db-2838fd26dcb1\") " pod="calico-system/goldmane-7988f88666-b95h5" Sep 12 17:52:34.761049 systemd[1]: Created slice kubepods-besteffort-pod4480fbb5_84c1_47c0_b885_79f7e266510e.slice - libcontainer container kubepods-besteffort-pod4480fbb5_84c1_47c0_b885_79f7e266510e.slice. 
Sep 12 17:52:34.817653 kubelet[2758]: I0912 17:52:34.799573 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjrxl\" (UniqueName: \"kubernetes.io/projected/0ef63deb-d966-4b37-8b2c-5c53ee82839d-kube-api-access-zjrxl\") pod \"coredns-7c65d6cfc9-khtc4\" (UID: \"0ef63deb-d966-4b37-8b2c-5c53ee82839d\") " pod="kube-system/coredns-7c65d6cfc9-khtc4" Sep 12 17:52:34.817653 kubelet[2758]: I0912 17:52:34.799652 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlz8z\" (UniqueName: \"kubernetes.io/projected/93d6a848-48e1-44a6-bc47-99e26ab4c431-kube-api-access-xlz8z\") pod \"whisker-8647c8f9cf-xt247\" (UID: \"93d6a848-48e1-44a6-bc47-99e26ab4c431\") " pod="calico-system/whisker-8647c8f9cf-xt247" Sep 12 17:52:34.817653 kubelet[2758]: I0912 17:52:34.799681 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4c7\" (UniqueName: \"kubernetes.io/projected/4480fbb5-84c1-47c0-b885-79f7e266510e-kube-api-access-8t4c7\") pod \"calico-kube-controllers-6c5659d975-2sszf\" (UID: \"4480fbb5-84c1-47c0-b885-79f7e266510e\") " pod="calico-system/calico-kube-controllers-6c5659d975-2sszf" Sep 12 17:52:34.817653 kubelet[2758]: I0912 17:52:34.799753 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzpj\" (UniqueName: \"kubernetes.io/projected/152e3c21-778c-40cc-99db-2838fd26dcb1-kube-api-access-4wzpj\") pod \"goldmane-7988f88666-b95h5\" (UID: \"152e3c21-778c-40cc-99db-2838fd26dcb1\") " pod="calico-system/goldmane-7988f88666-b95h5" Sep 12 17:52:34.817653 kubelet[2758]: I0912 17:52:34.799834 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1-config-volume\") pod \"coredns-7c65d6cfc9-sx6rl\" (UID: \"21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1\") " pod="kube-system/coredns-7c65d6cfc9-sx6rl" Sep 12 17:52:34.777311 systemd[1]: Created slice kubepods-burstable-pod0ef63deb_d966_4b37_8b2c_5c53ee82839d.slice - libcontainer container kubepods-burstable-pod0ef63deb_d966_4b37_8b2c_5c53ee82839d.slice. 
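The systemd "Created slice" entries interleaved with these volume attachments map one-to-one onto the new pods: with the systemd cgroup driver, each pod gets a kubepods-<qos>-pod<uid>.slice unit whose name carries the pod UID with dashes replaced by underscores. The sketch below shows that naming with UIDs from this log; the escaping rule is inferred from the names above rather than quoted from kubelet code.

// Sketch of the slice naming visible in the "Created slice" entries above
// (assumption: dashes in the pod UID are replaced by underscores for the
// systemd unit name).
package main

import (
	"fmt"
	"strings"
)

func podSlice(qos, uid string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	// UIDs taken from the volume entries above.
	fmt.Println(podSlice("besteffort", "4480fbb5-84c1-47c0-b885-79f7e266510e"))
	fmt.Println(podSlice("burstable", "0ef63deb-d966-4b37-8b2c-5c53ee82839d"))
	// kubepods-besteffort-pod4480fbb5_84c1_47c0_b885_79f7e266510e.slice
	// kubepods-burstable-pod0ef63deb_d966_4b37_8b2c_5c53ee82839d.slice
}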
Sep 12 17:52:34.820531 kubelet[2758]: I0912 17:52:34.799930 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/152e3c21-778c-40cc-99db-2838fd26dcb1-goldmane-ca-bundle\") pod \"goldmane-7988f88666-b95h5\" (UID: \"152e3c21-778c-40cc-99db-2838fd26dcb1\") " pod="calico-system/goldmane-7988f88666-b95h5" Sep 12 17:52:34.820531 kubelet[2758]: I0912 17:52:34.800015 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znrzb\" (UniqueName: \"kubernetes.io/projected/cf5b145f-168a-4257-97e2-19a035789165-kube-api-access-znrzb\") pod \"calico-apiserver-6b55c6c674-l4q6p\" (UID: \"cf5b145f-168a-4257-97e2-19a035789165\") " pod="calico-apiserver/calico-apiserver-6b55c6c674-l4q6p" Sep 12 17:52:34.820531 kubelet[2758]: I0912 17:52:34.800177 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z8fl\" (UniqueName: \"kubernetes.io/projected/21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1-kube-api-access-9z8fl\") pod \"coredns-7c65d6cfc9-sx6rl\" (UID: \"21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1\") " pod="kube-system/coredns-7c65d6cfc9-sx6rl" Sep 12 17:52:34.820531 kubelet[2758]: I0912 17:52:34.800272 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4480fbb5-84c1-47c0-b885-79f7e266510e-tigera-ca-bundle\") pod \"calico-kube-controllers-6c5659d975-2sszf\" (UID: \"4480fbb5-84c1-47c0-b885-79f7e266510e\") " pod="calico-system/calico-kube-controllers-6c5659d975-2sszf" Sep 12 17:52:34.820531 kubelet[2758]: I0912 17:52:34.800359 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fe59d7fb-6f3b-47e8-b60f-2151a307efdf-calico-apiserver-certs\") pod \"calico-apiserver-6b55c6c674-wfz7h\" (UID: \"fe59d7fb-6f3b-47e8-b60f-2151a307efdf\") " pod="calico-apiserver/calico-apiserver-6b55c6c674-wfz7h" Sep 12 17:52:34.786459 systemd[1]: Created slice kubepods-besteffort-pod152e3c21_778c_40cc_99db_2838fd26dcb1.slice - libcontainer container kubepods-besteffort-pod152e3c21_778c_40cc_99db_2838fd26dcb1.slice. Sep 12 17:52:34.821182 kubelet[2758]: I0912 17:52:34.800459 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-backend-key-pair\") pod \"whisker-8647c8f9cf-xt247\" (UID: \"93d6a848-48e1-44a6-bc47-99e26ab4c431\") " pod="calico-system/whisker-8647c8f9cf-xt247" Sep 12 17:52:34.821182 kubelet[2758]: I0912 17:52:34.800612 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5zls\" (UniqueName: \"kubernetes.io/projected/fe59d7fb-6f3b-47e8-b60f-2151a307efdf-kube-api-access-d5zls\") pod \"calico-apiserver-6b55c6c674-wfz7h\" (UID: \"fe59d7fb-6f3b-47e8-b60f-2151a307efdf\") " pod="calico-apiserver/calico-apiserver-6b55c6c674-wfz7h" Sep 12 17:52:34.846023 systemd[1]: Created slice kubepods-besteffort-pod4338b83d_ef4c_4813_8a51_871051a0ada9.slice - libcontainer container kubepods-besteffort-pod4338b83d_ef4c_4813_8a51_871051a0ada9.slice. 
Sep 12 17:52:34.851336 containerd[1529]: time="2025-09-12T17:52:34.851277585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gxnn6,Uid:4338b83d-ef4c-4813-8a51-871051a0ada9,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:35.118431 containerd[1529]: time="2025-09-12T17:52:35.118265537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sx6rl,Uid:21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1,Namespace:kube-system,Attempt:0,}" Sep 12 17:52:35.120601 containerd[1529]: time="2025-09-12T17:52:35.120301362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khtc4,Uid:0ef63deb-d966-4b37-8b2c-5c53ee82839d,Namespace:kube-system,Attempt:0,}" Sep 12 17:52:35.121136 containerd[1529]: time="2025-09-12T17:52:35.121094851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c5659d975-2sszf,Uid:4480fbb5-84c1-47c0-b885-79f7e266510e,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:35.121606 containerd[1529]: time="2025-09-12T17:52:35.121214704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8647c8f9cf-xt247,Uid:93d6a848-48e1-44a6-bc47-99e26ab4c431,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:35.123028 containerd[1529]: time="2025-09-12T17:52:35.122850140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-b95h5,Uid:152e3c21-778c-40cc-99db-2838fd26dcb1,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:35.419743 containerd[1529]: time="2025-09-12T17:52:35.419468327Z" level=error msg="Failed to destroy network for sandbox \"6ef32f4954c9c1c1741f725c16dd1ec0a12c7856aafecbe45b1e31be3cad98f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.423183 containerd[1529]: time="2025-09-12T17:52:35.423024449Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gxnn6,Uid:4338b83d-ef4c-4813-8a51-871051a0ada9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ef32f4954c9c1c1741f725c16dd1ec0a12c7856aafecbe45b1e31be3cad98f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.423748 kubelet[2758]: E0912 17:52:35.423649 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ef32f4954c9c1c1741f725c16dd1ec0a12c7856aafecbe45b1e31be3cad98f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.424076 kubelet[2758]: E0912 17:52:35.423746 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ef32f4954c9c1c1741f725c16dd1ec0a12c7856aafecbe45b1e31be3cad98f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gxnn6" Sep 12 17:52:35.424076 kubelet[2758]: E0912 17:52:35.423776 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"6ef32f4954c9c1c1741f725c16dd1ec0a12c7856aafecbe45b1e31be3cad98f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-gxnn6" Sep 12 17:52:35.424076 kubelet[2758]: E0912 17:52:35.423851 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-gxnn6_calico-system(4338b83d-ef4c-4813-8a51-871051a0ada9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-gxnn6_calico-system(4338b83d-ef4c-4813-8a51-871051a0ada9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6ef32f4954c9c1c1741f725c16dd1ec0a12c7856aafecbe45b1e31be3cad98f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-gxnn6" podUID="4338b83d-ef4c-4813-8a51-871051a0ada9" Sep 12 17:52:35.437541 containerd[1529]: time="2025-09-12T17:52:35.437465173Z" level=error msg="Failed to destroy network for sandbox \"12378659ac3add9b1ac0244d7469a9817615fa278e37a7bdb96ca9b18a447f1f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.443131 containerd[1529]: time="2025-09-12T17:52:35.442980087Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c5659d975-2sszf,Uid:4480fbb5-84c1-47c0-b885-79f7e266510e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"12378659ac3add9b1ac0244d7469a9817615fa278e37a7bdb96ca9b18a447f1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.443534 kubelet[2758]: E0912 17:52:35.443353 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12378659ac3add9b1ac0244d7469a9817615fa278e37a7bdb96ca9b18a447f1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.443534 kubelet[2758]: E0912 17:52:35.443435 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12378659ac3add9b1ac0244d7469a9817615fa278e37a7bdb96ca9b18a447f1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c5659d975-2sszf" Sep 12 17:52:35.443534 kubelet[2758]: E0912 17:52:35.443475 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12378659ac3add9b1ac0244d7469a9817615fa278e37a7bdb96ca9b18a447f1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c5659d975-2sszf" Sep 12 
17:52:35.443761 kubelet[2758]: E0912 17:52:35.443539 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6c5659d975-2sszf_calico-system(4480fbb5-84c1-47c0-b885-79f7e266510e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6c5659d975-2sszf_calico-system(4480fbb5-84c1-47c0-b885-79f7e266510e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12378659ac3add9b1ac0244d7469a9817615fa278e37a7bdb96ca9b18a447f1f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6c5659d975-2sszf" podUID="4480fbb5-84c1-47c0-b885-79f7e266510e" Sep 12 17:52:35.457360 containerd[1529]: time="2025-09-12T17:52:35.457180652Z" level=error msg="Failed to destroy network for sandbox \"a805301ef362a208ed0ca8ec3950863a09ecad0947710c8ab418a0dd0da93346\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.461168 containerd[1529]: time="2025-09-12T17:52:35.461011077Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khtc4,Uid:0ef63deb-d966-4b37-8b2c-5c53ee82839d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a805301ef362a208ed0ca8ec3950863a09ecad0947710c8ab418a0dd0da93346\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.461394 kubelet[2758]: E0912 17:52:35.461329 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a805301ef362a208ed0ca8ec3950863a09ecad0947710c8ab418a0dd0da93346\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.461477 kubelet[2758]: E0912 17:52:35.461397 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a805301ef362a208ed0ca8ec3950863a09ecad0947710c8ab418a0dd0da93346\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-khtc4" Sep 12 17:52:35.461477 kubelet[2758]: E0912 17:52:35.461427 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a805301ef362a208ed0ca8ec3950863a09ecad0947710c8ab418a0dd0da93346\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-khtc4" Sep 12 17:52:35.461583 kubelet[2758]: E0912 17:52:35.461499 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-khtc4_kube-system(0ef63deb-d966-4b37-8b2c-5c53ee82839d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7c65d6cfc9-khtc4_kube-system(0ef63deb-d966-4b37-8b2c-5c53ee82839d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a805301ef362a208ed0ca8ec3950863a09ecad0947710c8ab418a0dd0da93346\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-khtc4" podUID="0ef63deb-d966-4b37-8b2c-5c53ee82839d" Sep 12 17:52:35.481737 containerd[1529]: time="2025-09-12T17:52:35.481381192Z" level=error msg="Failed to destroy network for sandbox \"9b6a5f03ed552cd481384590ad04599009b20eed839262ee3ac4ca792395e141\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.486036 containerd[1529]: time="2025-09-12T17:52:35.485942272Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8647c8f9cf-xt247,Uid:93d6a848-48e1-44a6-bc47-99e26ab4c431,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6a5f03ed552cd481384590ad04599009b20eed839262ee3ac4ca792395e141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.487061 kubelet[2758]: E0912 17:52:35.486447 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6a5f03ed552cd481384590ad04599009b20eed839262ee3ac4ca792395e141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.487061 kubelet[2758]: E0912 17:52:35.486526 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6a5f03ed552cd481384590ad04599009b20eed839262ee3ac4ca792395e141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8647c8f9cf-xt247" Sep 12 17:52:35.487061 kubelet[2758]: E0912 17:52:35.486558 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6a5f03ed552cd481384590ad04599009b20eed839262ee3ac4ca792395e141\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8647c8f9cf-xt247" Sep 12 17:52:35.487555 kubelet[2758]: E0912 17:52:35.486630 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-8647c8f9cf-xt247_calico-system(93d6a848-48e1-44a6-bc47-99e26ab4c431)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-8647c8f9cf-xt247_calico-system(93d6a848-48e1-44a6-bc47-99e26ab4c431)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b6a5f03ed552cd481384590ad04599009b20eed839262ee3ac4ca792395e141\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-8647c8f9cf-xt247" podUID="93d6a848-48e1-44a6-bc47-99e26ab4c431" Sep 12 17:52:35.488590 containerd[1529]: time="2025-09-12T17:52:35.488449797Z" level=error msg="Failed to destroy network for sandbox \"dd6de81fb47ae67e85b324bf4c7b364268b87508e2c8eb8c573466dd9662e5b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.491132 containerd[1529]: time="2025-09-12T17:52:35.490268262Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sx6rl,Uid:21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd6de81fb47ae67e85b324bf4c7b364268b87508e2c8eb8c573466dd9662e5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.491305 kubelet[2758]: E0912 17:52:35.490531 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd6de81fb47ae67e85b324bf4c7b364268b87508e2c8eb8c573466dd9662e5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.491305 kubelet[2758]: E0912 17:52:35.490607 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd6de81fb47ae67e85b324bf4c7b364268b87508e2c8eb8c573466dd9662e5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-sx6rl" Sep 12 17:52:35.491305 kubelet[2758]: E0912 17:52:35.490641 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd6de81fb47ae67e85b324bf4c7b364268b87508e2c8eb8c573466dd9662e5b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-sx6rl" Sep 12 17:52:35.491491 kubelet[2758]: E0912 17:52:35.490701 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-sx6rl_kube-system(21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-sx6rl_kube-system(21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd6de81fb47ae67e85b324bf4c7b364268b87508e2c8eb8c573466dd9662e5b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-sx6rl" podUID="21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1" Sep 12 17:52:35.497572 containerd[1529]: time="2025-09-12T17:52:35.497520135Z" level=error msg="Failed to destroy network for sandbox \"e6aa8a012f2a9e1b4e9ee7c45fc03f49f8ffe21c1fc782200f31c0e720fba454\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.499151 containerd[1529]: time="2025-09-12T17:52:35.499018981Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-b95h5,Uid:152e3c21-778c-40cc-99db-2838fd26dcb1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6aa8a012f2a9e1b4e9ee7c45fc03f49f8ffe21c1fc782200f31c0e720fba454\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.499395 kubelet[2758]: E0912 17:52:35.499302 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6aa8a012f2a9e1b4e9ee7c45fc03f49f8ffe21c1fc782200f31c0e720fba454\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.499395 kubelet[2758]: E0912 17:52:35.499372 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6aa8a012f2a9e1b4e9ee7c45fc03f49f8ffe21c1fc782200f31c0e720fba454\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-b95h5" Sep 12 17:52:35.499531 kubelet[2758]: E0912 17:52:35.499402 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6aa8a012f2a9e1b4e9ee7c45fc03f49f8ffe21c1fc782200f31c0e720fba454\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-b95h5" Sep 12 17:52:35.499531 kubelet[2758]: E0912 17:52:35.499477 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-b95h5_calico-system(152e3c21-778c-40cc-99db-2838fd26dcb1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-b95h5_calico-system(152e3c21-778c-40cc-99db-2838fd26dcb1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e6aa8a012f2a9e1b4e9ee7c45fc03f49f8ffe21c1fc782200f31c0e720fba454\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-b95h5" podUID="152e3c21-778c-40cc-99db-2838fd26dcb1" Sep 12 17:52:35.888177 containerd[1529]: time="2025-09-12T17:52:35.888104696Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-wfz7h,Uid:fe59d7fb-6f3b-47e8-b60f-2151a307efdf,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:52:35.957028 containerd[1529]: time="2025-09-12T17:52:35.956953875Z" level=error msg="Failed to destroy network for sandbox \"a51c7abdc4c95b2ec90ecf6776aa5789df36720934627d321c79ef6afb128723\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.958742 containerd[1529]: time="2025-09-12T17:52:35.958678869Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-wfz7h,Uid:fe59d7fb-6f3b-47e8-b60f-2151a307efdf,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a51c7abdc4c95b2ec90ecf6776aa5789df36720934627d321c79ef6afb128723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.959711 kubelet[2758]: E0912 17:52:35.959004 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a51c7abdc4c95b2ec90ecf6776aa5789df36720934627d321c79ef6afb128723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:35.959711 kubelet[2758]: E0912 17:52:35.959076 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a51c7abdc4c95b2ec90ecf6776aa5789df36720934627d321c79ef6afb128723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b55c6c674-wfz7h" Sep 12 17:52:35.959711 kubelet[2758]: E0912 17:52:35.959118 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a51c7abdc4c95b2ec90ecf6776aa5789df36720934627d321c79ef6afb128723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b55c6c674-wfz7h" Sep 12 17:52:35.960372 kubelet[2758]: E0912 17:52:35.959192 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b55c6c674-wfz7h_calico-apiserver(fe59d7fb-6f3b-47e8-b60f-2151a307efdf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b55c6c674-wfz7h_calico-apiserver(fe59d7fb-6f3b-47e8-b60f-2151a307efdf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a51c7abdc4c95b2ec90ecf6776aa5789df36720934627d321c79ef6afb128723\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b55c6c674-wfz7h" podUID="fe59d7fb-6f3b-47e8-b60f-2151a307efdf" Sep 12 17:52:36.021474 containerd[1529]: time="2025-09-12T17:52:36.021413134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-l4q6p,Uid:cf5b145f-168a-4257-97e2-19a035789165,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:52:36.055058 containerd[1529]: time="2025-09-12T17:52:36.054952362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:52:36.127154 containerd[1529]: time="2025-09-12T17:52:36.127092320Z" level=error msg="Failed to destroy network for sandbox 
\"ee3580c2bc9a29481ca59483aa598298d8278b2406335d443867ea8e513e9121\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:36.128868 containerd[1529]: time="2025-09-12T17:52:36.128784897Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-l4q6p,Uid:cf5b145f-168a-4257-97e2-19a035789165,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee3580c2bc9a29481ca59483aa598298d8278b2406335d443867ea8e513e9121\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:36.129190 kubelet[2758]: E0912 17:52:36.129077 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee3580c2bc9a29481ca59483aa598298d8278b2406335d443867ea8e513e9121\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:52:36.129190 kubelet[2758]: E0912 17:52:36.129154 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee3580c2bc9a29481ca59483aa598298d8278b2406335d443867ea8e513e9121\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b55c6c674-l4q6p" Sep 12 17:52:36.129400 kubelet[2758]: E0912 17:52:36.129272 2758 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ee3580c2bc9a29481ca59483aa598298d8278b2406335d443867ea8e513e9121\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b55c6c674-l4q6p" Sep 12 17:52:36.129400 kubelet[2758]: E0912 17:52:36.129350 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b55c6c674-l4q6p_calico-apiserver(cf5b145f-168a-4257-97e2-19a035789165)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b55c6c674-l4q6p_calico-apiserver(cf5b145f-168a-4257-97e2-19a035789165)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ee3580c2bc9a29481ca59483aa598298d8278b2406335d443867ea8e513e9121\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b55c6c674-l4q6p" podUID="cf5b145f-168a-4257-97e2-19a035789165" Sep 12 17:52:41.724881 kubelet[2758]: I0912 17:52:41.724813 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:52:43.087220 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3918842872.mount: Deactivated successfully. 
Sep 12 17:52:43.128151 containerd[1529]: time="2025-09-12T17:52:43.128080390Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:43.129622 containerd[1529]: time="2025-09-12T17:52:43.129397202Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 17:52:43.130695 containerd[1529]: time="2025-09-12T17:52:43.130648215Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:43.134234 containerd[1529]: time="2025-09-12T17:52:43.133247443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:43.134234 containerd[1529]: time="2025-09-12T17:52:43.134074115Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.079063897s" Sep 12 17:52:43.134234 containerd[1529]: time="2025-09-12T17:52:43.134112633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 17:52:43.157429 containerd[1529]: time="2025-09-12T17:52:43.157379489Z" level=info msg="CreateContainer within sandbox \"8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:52:43.174228 containerd[1529]: time="2025-09-12T17:52:43.174168684Z" level=info msg="Container af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:43.187646 containerd[1529]: time="2025-09-12T17:52:43.187570220Z" level=info msg="CreateContainer within sandbox \"8dcdd5bb43e15ea8b9a175b28514616b098b775958082965514bd7835edba8d0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\"" Sep 12 17:52:43.188638 containerd[1529]: time="2025-09-12T17:52:43.188581087Z" level=info msg="StartContainer for \"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\"" Sep 12 17:52:43.191079 containerd[1529]: time="2025-09-12T17:52:43.191020059Z" level=info msg="connecting to shim af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b" address="unix:///run/containerd/s/c29c450e8534bb83d89cf6b73a2851abec3b58b21cd0862fa8a0d0d9691c7786" protocol=ttrpc version=3 Sep 12 17:52:43.224109 systemd[1]: Started cri-containerd-af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b.scope - libcontainer container af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b. Sep 12 17:52:43.290557 containerd[1529]: time="2025-09-12T17:52:43.290471788Z" level=info msg="StartContainer for \"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\" returns successfully" Sep 12 17:52:43.414933 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 17:52:43.415132 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Sep 12 17:52:43.776014 kubelet[2758]: I0912 17:52:43.774656 2758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-backend-key-pair\") pod \"93d6a848-48e1-44a6-bc47-99e26ab4c431\" (UID: \"93d6a848-48e1-44a6-bc47-99e26ab4c431\") " Sep 12 17:52:43.777684 kubelet[2758]: I0912 17:52:43.776676 2758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlz8z\" (UniqueName: \"kubernetes.io/projected/93d6a848-48e1-44a6-bc47-99e26ab4c431-kube-api-access-xlz8z\") pod \"93d6a848-48e1-44a6-bc47-99e26ab4c431\" (UID: \"93d6a848-48e1-44a6-bc47-99e26ab4c431\") " Sep 12 17:52:43.777684 kubelet[2758]: I0912 17:52:43.776734 2758 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-ca-bundle\") pod \"93d6a848-48e1-44a6-bc47-99e26ab4c431\" (UID: \"93d6a848-48e1-44a6-bc47-99e26ab4c431\") " Sep 12 17:52:43.777684 kubelet[2758]: I0912 17:52:43.777299 2758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "93d6a848-48e1-44a6-bc47-99e26ab4c431" (UID: "93d6a848-48e1-44a6-bc47-99e26ab4c431"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 17:52:43.782506 kubelet[2758]: I0912 17:52:43.782457 2758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "93d6a848-48e1-44a6-bc47-99e26ab4c431" (UID: "93d6a848-48e1-44a6-bc47-99e26ab4c431"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 17:52:43.783137 kubelet[2758]: I0912 17:52:43.783103 2758 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d6a848-48e1-44a6-bc47-99e26ab4c431-kube-api-access-xlz8z" (OuterVolumeSpecName: "kube-api-access-xlz8z") pod "93d6a848-48e1-44a6-bc47-99e26ab4c431" (UID: "93d6a848-48e1-44a6-bc47-99e26ab4c431"). InnerVolumeSpecName "kube-api-access-xlz8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 17:52:43.877979 kubelet[2758]: I0912 17:52:43.877930 2758 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-backend-key-pair\") on node \"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" DevicePath \"\"" Sep 12 17:52:43.878275 kubelet[2758]: I0912 17:52:43.878220 2758 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93d6a848-48e1-44a6-bc47-99e26ab4c431-whisker-ca-bundle\") on node \"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" DevicePath \"\"" Sep 12 17:52:43.878275 kubelet[2758]: I0912 17:52:43.878251 2758 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlz8z\" (UniqueName: \"kubernetes.io/projected/93d6a848-48e1-44a6-bc47-99e26ab4c431-kube-api-access-xlz8z\") on node \"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" DevicePath \"\"" Sep 12 17:52:44.094587 systemd[1]: var-lib-kubelet-pods-93d6a848\x2d48e1\x2d44a6\x2dbc47\x2d99e26ab4c431-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dxlz8z.mount: Deactivated successfully. Sep 12 17:52:44.094737 systemd[1]: var-lib-kubelet-pods-93d6a848\x2d48e1\x2d44a6\x2dbc47\x2d99e26ab4c431-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:52:44.109918 systemd[1]: Removed slice kubepods-besteffort-pod93d6a848_48e1_44a6_bc47_99e26ab4c431.slice - libcontainer container kubepods-besteffort-pod93d6a848_48e1_44a6_bc47_99e26ab4c431.slice. Sep 12 17:52:44.123049 kubelet[2758]: I0912 17:52:44.122981 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6tp4v" podStartSLOduration=2.368265351 podStartE2EDuration="20.122956586s" podCreationTimestamp="2025-09-12 17:52:24 +0000 UTC" firstStartedPulling="2025-09-12 17:52:25.380847437 +0000 UTC m=+24.756612847" lastFinishedPulling="2025-09-12 17:52:43.135538662 +0000 UTC m=+42.511304082" observedRunningTime="2025-09-12 17:52:44.120878977 +0000 UTC m=+43.496644423" watchObservedRunningTime="2025-09-12 17:52:44.122956586 +0000 UTC m=+43.498722015" Sep 12 17:52:44.213238 kubelet[2758]: W0912 17:52:44.213190 2758 reflector.go:561] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object Sep 12 17:52:44.213781 kubelet[2758]: E0912 17:52:44.213450 2758 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object" logger="UnhandledError" Sep 12 17:52:44.214219 kubelet[2758]: W0912 17:52:44.214195 2758 reflector.go:561] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User 
"system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object Sep 12 17:52:44.214431 kubelet[2758]: E0912 17:52:44.214392 2758 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' and this object" logger="UnhandledError" Sep 12 17:52:44.218072 systemd[1]: Created slice kubepods-besteffort-pod2f662815_6749_45c5_8bbd_45886fc0d4bf.slice - libcontainer container kubepods-besteffort-pod2f662815_6749_45c5_8bbd_45886fc0d4bf.slice. Sep 12 17:52:44.282486 containerd[1529]: time="2025-09-12T17:52:44.282427296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\" id:\"398600b54d5826dc8fa9a57a8021722963e58425d958900e6bd30ff22895cea5\" pid:3896 exit_status:1 exited_at:{seconds:1757699564 nanos:281833261}" Sep 12 17:52:44.382946 kubelet[2758]: I0912 17:52:44.382652 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2f662815-6749-45c5-8bbd-45886fc0d4bf-whisker-backend-key-pair\") pod \"whisker-5696b6f678-swglv\" (UID: \"2f662815-6749-45c5-8bbd-45886fc0d4bf\") " pod="calico-system/whisker-5696b6f678-swglv" Sep 12 17:52:44.382946 kubelet[2758]: I0912 17:52:44.382726 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s2fp\" (UniqueName: \"kubernetes.io/projected/2f662815-6749-45c5-8bbd-45886fc0d4bf-kube-api-access-8s2fp\") pod \"whisker-5696b6f678-swglv\" (UID: \"2f662815-6749-45c5-8bbd-45886fc0d4bf\") " pod="calico-system/whisker-5696b6f678-swglv" Sep 12 17:52:44.382946 kubelet[2758]: I0912 17:52:44.382780 2758 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f662815-6749-45c5-8bbd-45886fc0d4bf-whisker-ca-bundle\") pod \"whisker-5696b6f678-swglv\" (UID: \"2f662815-6749-45c5-8bbd-45886fc0d4bf\") " pod="calico-system/whisker-5696b6f678-swglv" Sep 12 17:52:44.833127 kubelet[2758]: I0912 17:52:44.832568 2758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d6a848-48e1-44a6-bc47-99e26ab4c431" path="/var/lib/kubelet/pods/93d6a848-48e1-44a6-bc47-99e26ab4c431/volumes" Sep 12 17:52:45.477409 containerd[1529]: time="2025-09-12T17:52:45.477343366Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\" id:\"3fefb61ac8cd5483fe3f091f583c43be5ecde808cb068593461e618f8904c888\" pid:4008 exit_status:1 exited_at:{seconds:1757699565 nanos:476569476}" Sep 12 17:52:45.484316 kubelet[2758]: E0912 17:52:45.484267 2758 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 12 17:52:45.484526 kubelet[2758]: E0912 17:52:45.484396 2758 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/2f662815-6749-45c5-8bbd-45886fc0d4bf-whisker-ca-bundle podName:2f662815-6749-45c5-8bbd-45886fc0d4bf nodeName:}" failed. No retries permitted until 2025-09-12 17:52:45.984358954 +0000 UTC m=+45.360124373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/2f662815-6749-45c5-8bbd-45886fc0d4bf-whisker-ca-bundle") pod "whisker-5696b6f678-swglv" (UID: "2f662815-6749-45c5-8bbd-45886fc0d4bf") : failed to sync configmap cache: timed out waiting for the condition Sep 12 17:52:45.785842 systemd-networkd[1442]: vxlan.calico: Link UP Sep 12 17:52:45.785869 systemd-networkd[1442]: vxlan.calico: Gained carrier Sep 12 17:52:46.024625 containerd[1529]: time="2025-09-12T17:52:46.024548135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5696b6f678-swglv,Uid:2f662815-6749-45c5-8bbd-45886fc0d4bf,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:46.244290 systemd-networkd[1442]: cali9ddd9fb0963: Link UP Sep 12 17:52:46.247373 systemd-networkd[1442]: cali9ddd9fb0963: Gained carrier Sep 12 17:52:46.288798 containerd[1529]: 2025-09-12 17:52:46.095 [INFO][4096] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0 whisker-5696b6f678- calico-system 2f662815-6749-45c5-8bbd-45886fc0d4bf 879 0 2025-09-12 17:52:44 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5696b6f678 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal whisker-5696b6f678-swglv eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9ddd9fb0963 [] [] }} ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-" Sep 12 17:52:46.288798 containerd[1529]: 2025-09-12 17:52:46.096 [INFO][4096] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" Sep 12 17:52:46.288798 containerd[1529]: 2025-09-12 17:52:46.159 [INFO][4108] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" HandleID="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.160 [INFO][4108] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" HandleID="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5710), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"whisker-5696b6f678-swglv", "timestamp":"2025-09-12 17:52:46.159825822 +0000 UTC"}, Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.160 [INFO][4108] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.161 [INFO][4108] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.161 [INFO][4108] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.172 [INFO][4108] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.182 [INFO][4108] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.189 [INFO][4108] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289516 containerd[1529]: 2025-09-12 17:52:46.199 [INFO][4108] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 17:52:46.202 [INFO][4108] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 17:52:46.202 [INFO][4108] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 17:52:46.204 [INFO][4108] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0 Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 17:52:46.212 [INFO][4108] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 handle="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 17:52:46.223 [INFO][4108] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.129/26] block=192.168.84.128/26 handle="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 17:52:46.224 [INFO][4108] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.129/26] handle="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 
17:52:46.224 [INFO][4108] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:52:46.289999 containerd[1529]: 2025-09-12 17:52:46.225 [INFO][4108] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.129/26] IPv6=[] ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" HandleID="k8s-pod-network.f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" Sep 12 17:52:46.290376 containerd[1529]: 2025-09-12 17:52:46.236 [INFO][4096] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0", GenerateName:"whisker-5696b6f678-", Namespace:"calico-system", SelfLink:"", UID:"2f662815-6749-45c5-8bbd-45886fc0d4bf", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5696b6f678", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"whisker-5696b6f678-swglv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.84.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9ddd9fb0963", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:46.290513 containerd[1529]: 2025-09-12 17:52:46.236 [INFO][4096] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.129/32] ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" Sep 12 17:52:46.290513 containerd[1529]: 2025-09-12 17:52:46.236 [INFO][4096] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9ddd9fb0963 ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" Sep 12 17:52:46.290513 containerd[1529]: 2025-09-12 17:52:46.251 [INFO][4096] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" 
WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" Sep 12 17:52:46.290676 containerd[1529]: 2025-09-12 17:52:46.252 [INFO][4096] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0", GenerateName:"whisker-5696b6f678-", Namespace:"calico-system", SelfLink:"", UID:"2f662815-6749-45c5-8bbd-45886fc0d4bf", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5696b6f678", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0", Pod:"whisker-5696b6f678-swglv", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.84.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9ddd9fb0963", MAC:"96:70:5b:13:ec:b5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:46.290799 containerd[1529]: 2025-09-12 17:52:46.280 [INFO][4096] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" Namespace="calico-system" Pod="whisker-5696b6f678-swglv" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-whisker--5696b6f678--swglv-eth0" Sep 12 17:52:46.332492 containerd[1529]: time="2025-09-12T17:52:46.332426105Z" level=info msg="connecting to shim f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0" address="unix:///run/containerd/s/7a3184055ba44494f014fec95c682b6e70f826425d76d8bc0a2e79cf6b3a7189" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:46.383248 systemd[1]: Started cri-containerd-f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0.scope - libcontainer container f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0. 
Sep 12 17:52:46.473452 containerd[1529]: time="2025-09-12T17:52:46.473404652Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5696b6f678-swglv,Uid:2f662815-6749-45c5-8bbd-45886fc0d4bf,Namespace:calico-system,Attempt:0,} returns sandbox id \"f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0\"" Sep 12 17:52:46.475488 containerd[1529]: time="2025-09-12T17:52:46.475434307Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:52:46.826100 containerd[1529]: time="2025-09-12T17:52:46.826035030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gxnn6,Uid:4338b83d-ef4c-4813-8a51-871051a0ada9,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:46.998516 systemd-networkd[1442]: calia49b76c7ac1: Link UP Sep 12 17:52:47.000148 systemd-networkd[1442]: calia49b76c7ac1: Gained carrier Sep 12 17:52:47.038030 containerd[1529]: 2025-09-12 17:52:46.885 [INFO][4201] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0 csi-node-driver- calico-system 4338b83d-ef4c-4813-8a51-871051a0ada9 706 0 2025-09-12 17:52:24 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal csi-node-driver-gxnn6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calia49b76c7ac1 [] [] }} ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-" Sep 12 17:52:47.038030 containerd[1529]: 2025-09-12 17:52:46.885 [INFO][4201] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" Sep 12 17:52:47.038030 containerd[1529]: 2025-09-12 17:52:46.926 [INFO][4215] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" HandleID="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.926 [INFO][4215] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" HandleID="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f060), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"csi-node-driver-gxnn6", "timestamp":"2025-09-12 17:52:46.926082011 +0000 UTC"}, 
Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.926 [INFO][4215] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.926 [INFO][4215] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.926 [INFO][4215] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.936 [INFO][4215] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.942 [INFO][4215] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.947 [INFO][4215] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038402 containerd[1529]: 2025-09-12 17:52:46.951 [INFO][4215] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.954 [INFO][4215] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.954 [INFO][4215] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.955 [INFO][4215] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9 Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.962 [INFO][4215] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 handle="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.982 [INFO][4215] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.130/26] block=192.168.84.128/26 handle="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.984 [INFO][4215] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.130/26] handle="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.984 [INFO][4215] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:52:47.038798 containerd[1529]: 2025-09-12 17:52:46.984 [INFO][4215] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.130/26] IPv6=[] ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" HandleID="k8s-pod-network.87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" Sep 12 17:52:47.040297 containerd[1529]: 2025-09-12 17:52:46.988 [INFO][4201] cni-plugin/k8s.go 418: Populated endpoint ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4338b83d-ef4c-4813-8a51-871051a0ada9", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"csi-node-driver-gxnn6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.84.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia49b76c7ac1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:47.040424 containerd[1529]: 2025-09-12 17:52:46.989 [INFO][4201] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.130/32] ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" Sep 12 17:52:47.040424 containerd[1529]: 2025-09-12 17:52:46.989 [INFO][4201] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia49b76c7ac1 ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" Sep 12 17:52:47.040424 containerd[1529]: 2025-09-12 17:52:46.999 [INFO][4201] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" 
WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" Sep 12 17:52:47.040564 containerd[1529]: 2025-09-12 17:52:47.000 [INFO][4201] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4338b83d-ef4c-4813-8a51-871051a0ada9", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9", Pod:"csi-node-driver-gxnn6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.84.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calia49b76c7ac1", MAC:"1e:28:bc:ac:31:b9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:47.040699 containerd[1529]: 2025-09-12 17:52:47.032 [INFO][4201] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" Namespace="calico-system" Pod="csi-node-driver-gxnn6" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-csi--node--driver--gxnn6-eth0" Sep 12 17:52:47.092982 containerd[1529]: time="2025-09-12T17:52:47.092664886Z" level=info msg="connecting to shim 87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9" address="unix:///run/containerd/s/694455fca1f0ccb7ff44f7ab55fbc06d01c94d070afaa85c96c07e8e97f6f0bf" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:47.179590 systemd[1]: Started cri-containerd-87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9.scope - libcontainer container 87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9. 
Sep 12 17:52:47.242146 containerd[1529]: time="2025-09-12T17:52:47.242084924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-gxnn6,Uid:4338b83d-ef4c-4813-8a51-871051a0ada9,Namespace:calico-system,Attempt:0,} returns sandbox id \"87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9\"" Sep 12 17:52:47.283324 systemd-networkd[1442]: vxlan.calico: Gained IPv6LL Sep 12 17:52:47.460758 containerd[1529]: time="2025-09-12T17:52:47.460605299Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:47.462207 containerd[1529]: time="2025-09-12T17:52:47.462118378Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 17:52:47.463221 containerd[1529]: time="2025-09-12T17:52:47.462920148Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:47.470333 containerd[1529]: time="2025-09-12T17:52:47.470292329Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:47.474235 containerd[1529]: time="2025-09-12T17:52:47.474173758Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 998.69659ms" Sep 12 17:52:47.474235 containerd[1529]: time="2025-09-12T17:52:47.474215791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 17:52:47.477232 containerd[1529]: time="2025-09-12T17:52:47.477127278Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 17:52:47.479574 containerd[1529]: time="2025-09-12T17:52:47.479537544Z" level=info msg="CreateContainer within sandbox \"f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:52:47.491886 containerd[1529]: time="2025-09-12T17:52:47.490022044Z" level=info msg="Container a46e4b823f312058627ea5b37026dbc8e5553a02512bc21ba8db3e9849c0b5ae: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:47.501795 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1620921857.mount: Deactivated successfully. 
Sep 12 17:52:47.505880 containerd[1529]: time="2025-09-12T17:52:47.505783615Z" level=info msg="CreateContainer within sandbox \"f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"a46e4b823f312058627ea5b37026dbc8e5553a02512bc21ba8db3e9849c0b5ae\"" Sep 12 17:52:47.506602 containerd[1529]: time="2025-09-12T17:52:47.506569888Z" level=info msg="StartContainer for \"a46e4b823f312058627ea5b37026dbc8e5553a02512bc21ba8db3e9849c0b5ae\"" Sep 12 17:52:47.508450 containerd[1529]: time="2025-09-12T17:52:47.508410474Z" level=info msg="connecting to shim a46e4b823f312058627ea5b37026dbc8e5553a02512bc21ba8db3e9849c0b5ae" address="unix:///run/containerd/s/7a3184055ba44494f014fec95c682b6e70f826425d76d8bc0a2e79cf6b3a7189" protocol=ttrpc version=3 Sep 12 17:52:47.539494 systemd-networkd[1442]: cali9ddd9fb0963: Gained IPv6LL Sep 12 17:52:47.546253 systemd[1]: Started cri-containerd-a46e4b823f312058627ea5b37026dbc8e5553a02512bc21ba8db3e9849c0b5ae.scope - libcontainer container a46e4b823f312058627ea5b37026dbc8e5553a02512bc21ba8db3e9849c0b5ae. Sep 12 17:52:47.627348 containerd[1529]: time="2025-09-12T17:52:47.626954440Z" level=info msg="StartContainer for \"a46e4b823f312058627ea5b37026dbc8e5553a02512bc21ba8db3e9849c0b5ae\" returns successfully" Sep 12 17:52:47.827216 containerd[1529]: time="2025-09-12T17:52:47.826681534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sx6rl,Uid:21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1,Namespace:kube-system,Attempt:0,}" Sep 12 17:52:47.827216 containerd[1529]: time="2025-09-12T17:52:47.826740484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-wfz7h,Uid:fe59d7fb-6f3b-47e8-b60f-2151a307efdf,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:52:47.827216 containerd[1529]: time="2025-09-12T17:52:47.826689273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c5659d975-2sszf,Uid:4480fbb5-84c1-47c0-b885-79f7e266510e,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:47.827216 containerd[1529]: time="2025-09-12T17:52:47.827078257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-b95h5,Uid:152e3c21-778c-40cc-99db-2838fd26dcb1,Namespace:calico-system,Attempt:0,}" Sep 12 17:52:48.136029 systemd-networkd[1442]: calia48d9eb0429: Link UP Sep 12 17:52:48.138570 systemd-networkd[1442]: calia48d9eb0429: Gained carrier Sep 12 17:52:48.169871 containerd[1529]: 2025-09-12 17:52:47.939 [INFO][4315] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0 calico-apiserver-6b55c6c674- calico-apiserver fe59d7fb-6f3b-47e8-b60f-2151a307efdf 800 0 2025-09-12 17:52:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b55c6c674 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal calico-apiserver-6b55c6c674-wfz7h eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia48d9eb0429 [] [] }} ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" 
WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-" Sep 12 17:52:48.169871 containerd[1529]: 2025-09-12 17:52:47.940 [INFO][4315] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" Sep 12 17:52:48.169871 containerd[1529]: 2025-09-12 17:52:48.019 [INFO][4358] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" HandleID="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.020 [INFO][4358] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" HandleID="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"calico-apiserver-6b55c6c674-wfz7h", "timestamp":"2025-09-12 17:52:48.019846656 +0000 UTC"}, Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.021 [INFO][4358] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.021 [INFO][4358] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.021 [INFO][4358] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.037 [INFO][4358] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.059 [INFO][4358] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.072 [INFO][4358] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170321 containerd[1529]: 2025-09-12 17:52:48.079 [INFO][4358] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.085 [INFO][4358] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.085 [INFO][4358] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.088 [INFO][4358] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7 Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.097 [INFO][4358] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 handle="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.119 [INFO][4358] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.131/26] block=192.168.84.128/26 handle="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.120 [INFO][4358] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.131/26] handle="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.120 [INFO][4358] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:52:48.170895 containerd[1529]: 2025-09-12 17:52:48.120 [INFO][4358] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.131/26] IPv6=[] ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" HandleID="k8s-pod-network.307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" Sep 12 17:52:48.171302 containerd[1529]: 2025-09-12 17:52:48.124 [INFO][4315] cni-plugin/k8s.go 418: Populated endpoint ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0", GenerateName:"calico-apiserver-6b55c6c674-", Namespace:"calico-apiserver", SelfLink:"", UID:"fe59d7fb-6f3b-47e8-b60f-2151a307efdf", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b55c6c674", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"calico-apiserver-6b55c6c674-wfz7h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.84.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia48d9eb0429", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.171430 containerd[1529]: 2025-09-12 17:52:48.124 [INFO][4315] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.131/32] ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" Sep 12 17:52:48.171430 containerd[1529]: 2025-09-12 17:52:48.124 [INFO][4315] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia48d9eb0429 ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" Sep 12 17:52:48.171430 containerd[1529]: 2025-09-12 17:52:48.139 [INFO][4315] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" 
Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" Sep 12 17:52:48.171590 containerd[1529]: 2025-09-12 17:52:48.140 [INFO][4315] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0", GenerateName:"calico-apiserver-6b55c6c674-", Namespace:"calico-apiserver", SelfLink:"", UID:"fe59d7fb-6f3b-47e8-b60f-2151a307efdf", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b55c6c674", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7", Pod:"calico-apiserver-6b55c6c674-wfz7h", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.84.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia48d9eb0429", MAC:"e2:70:9c:39:59:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.171590 containerd[1529]: 2025-09-12 17:52:48.165 [INFO][4315] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-wfz7h" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--wfz7h-eth0" Sep 12 17:52:48.229749 containerd[1529]: time="2025-09-12T17:52:48.229691865Z" level=info msg="connecting to shim 307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7" address="unix:///run/containerd/s/15b1df06340f8f2952bfabfb1095a6df9c78ea56de879aa988262422a829ba0e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:48.308240 systemd[1]: Started cri-containerd-307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7.scope - libcontainer container 307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7. 
Sep 12 17:52:48.337012 systemd-networkd[1442]: cali12cc9bf260b: Link UP Sep 12 17:52:48.354480 systemd-networkd[1442]: cali12cc9bf260b: Gained carrier Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:47.992 [INFO][4311] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0 calico-kube-controllers-6c5659d975- calico-system 4480fbb5-84c1-47c0-b885-79f7e266510e 809 0 2025-09-12 17:52:25 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6c5659d975 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal calico-kube-controllers-6c5659d975-2sszf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali12cc9bf260b [] [] }} ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:47.993 [INFO][4311] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.173 [INFO][4367] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" HandleID="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.175 [INFO][4367] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" HandleID="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031a1b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"calico-kube-controllers-6c5659d975-2sszf", "timestamp":"2025-09-12 17:52:48.1729981 +0000 UTC"}, Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.175 [INFO][4367] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.176 [INFO][4367] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
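The "host-wide IPAM lock" lines above show that concurrent CNI ADD requests on one node are serialized before the address block is touched. A toy Go model of that serialization only (a sync.Mutex stand-in, not Calico's datastore-backed lock; pod names and the next free address are taken from the surrounding log, and the order in which the goroutines win the lock is nondeterministic):

package main

import (
	"fmt"
	"sync"
)

type hostIPAM struct {
	mu   sync.Mutex // stands in for the "host-wide IPAM lock"
	next int        // next free offset inside 192.168.84.128/26 (toy model)
}

func (h *hostIPAM) assign() string {
	h.mu.Lock()         // "Acquired host-wide IPAM lock."
	defer h.mu.Unlock() // "Released host-wide IPAM lock."
	ip := fmt.Sprintf("192.168.84.%d/26", 128+h.next)
	h.next++
	return ip
}

func main() {
	ipam := &hostIPAM{next: 4} // .131 was just assigned above, so .132 is next free
	var wg sync.WaitGroup
	for _, pod := range []string{
		"calico-kube-controllers-6c5659d975-2sszf",
		"goldmane-7988f88666-b95h5",
		"coredns-7c65d6cfc9-sx6rl",
	} {
		wg.Add(1)
		go func(p string) {
			defer wg.Done()
			fmt.Println(p, "->", ipam.assign())
		}(pod)
	}
	wg.Wait()
}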
Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.176 [INFO][4367] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.198 [INFO][4367] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.220 [INFO][4367] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.239 [INFO][4367] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.249 [INFO][4367] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.256 [INFO][4367] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.256 [INFO][4367] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.265 [INFO][4367] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538 Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.273 [INFO][4367] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 handle="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.292 [INFO][4367] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.132/26] block=192.168.84.128/26 handle="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.293 [INFO][4367] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.132/26] handle="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.294 [INFO][4367] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
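The block arithmetic behind the affinity lines above: 192.168.84.128/26 spans 64 addresses (.128 through .191), so every address handed out in this capture (.131 through .135) falls inside the single block this node is affine to. A quick check with Go's net/netip, using values taken from the log:

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.84.128/26")
	for _, s := range []string{"192.168.84.131", "192.168.84.132", "192.168.84.133"} {
		ip := netip.MustParseAddr(s)
		fmt.Printf("%s in %s: %v\n", ip, block, block.Contains(ip))
	}
	fmt.Println("addresses in block:", 1<<(32-block.Bits())) // 64
}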
Sep 12 17:52:48.397913 containerd[1529]: 2025-09-12 17:52:48.298 [INFO][4367] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.132/26] IPv6=[] ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" HandleID="k8s-pod-network.c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" Sep 12 17:52:48.402169 containerd[1529]: 2025-09-12 17:52:48.316 [INFO][4311] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0", GenerateName:"calico-kube-controllers-6c5659d975-", Namespace:"calico-system", SelfLink:"", UID:"4480fbb5-84c1-47c0-b885-79f7e266510e", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c5659d975", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"calico-kube-controllers-6c5659d975-2sszf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.84.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali12cc9bf260b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.402169 containerd[1529]: 2025-09-12 17:52:48.318 [INFO][4311] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.132/32] ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" Sep 12 17:52:48.402169 containerd[1529]: 2025-09-12 17:52:48.318 [INFO][4311] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali12cc9bf260b ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" Sep 12 17:52:48.402169 containerd[1529]: 2025-09-12 17:52:48.357 [INFO][4311] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" Sep 12 17:52:48.402169 containerd[1529]: 2025-09-12 17:52:48.359 [INFO][4311] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0", GenerateName:"calico-kube-controllers-6c5659d975-", Namespace:"calico-system", SelfLink:"", UID:"4480fbb5-84c1-47c0-b885-79f7e266510e", ResourceVersion:"809", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c5659d975", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538", Pod:"calico-kube-controllers-6c5659d975-2sszf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.84.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali12cc9bf260b", MAC:"da:4d:6d:4a:bb:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.402169 containerd[1529]: 2025-09-12 17:52:48.388 [INFO][4311] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" Namespace="calico-system" Pod="calico-kube-controllers-6c5659d975-2sszf" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--kube--controllers--6c5659d975--2sszf-eth0" Sep 12 17:52:48.435326 systemd-networkd[1442]: calia49b76c7ac1: Gained IPv6LL Sep 12 17:52:48.477705 systemd-networkd[1442]: cali9712ff81dbd: Link UP Sep 12 17:52:48.480609 systemd-networkd[1442]: cali9712ff81dbd: Gained carrier Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.060 [INFO][4324] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0 goldmane-7988f88666- calico-system 152e3c21-778c-40cc-99db-2838fd26dcb1 805 0 2025-09-12 17:52:24 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane 
pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal goldmane-7988f88666-b95h5 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali9712ff81dbd [] [] }} ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.062 [INFO][4324] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.281 [INFO][4378] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" HandleID="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.282 [INFO][4378] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" HandleID="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5f60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"goldmane-7988f88666-b95h5", "timestamp":"2025-09-12 17:52:48.28152229 +0000 UTC"}, Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.282 [INFO][4378] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.295 [INFO][4378] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.302 [INFO][4378] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.355 [INFO][4378] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.382 [INFO][4378] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.411 [INFO][4378] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.418 [INFO][4378] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.424 [INFO][4378] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.424 [INFO][4378] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.429 [INFO][4378] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.443 [INFO][4378] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 handle="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.454 [INFO][4378] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.133/26] block=192.168.84.128/26 handle="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.455 [INFO][4378] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.133/26] handle="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.455 [INFO][4378] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
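The HandleID in each ipam entry is visibly the sandbox container ID with a fixed "k8s-pod-network." prefix. A trivial reconstruction of that naming from the values printed above (inferred from the log output, not taken from Calico's source):

package main

import "fmt"

func handleID(containerID string) string {
	// Pattern observed in the HandleID fields of the ipam_plugin.go entries.
	return "k8s-pod-network." + containerID
}

func main() {
	id := "0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d"
	fmt.Println(handleID(id)) // matches the HandleID printed for the goldmane sandbox
}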
Sep 12 17:52:48.536139 containerd[1529]: 2025-09-12 17:52:48.455 [INFO][4378] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.133/26] IPv6=[] ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" HandleID="k8s-pod-network.0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" Sep 12 17:52:48.537392 containerd[1529]: 2025-09-12 17:52:48.462 [INFO][4324] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"152e3c21-778c-40cc-99db-2838fd26dcb1", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"goldmane-7988f88666-b95h5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.84.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9712ff81dbd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.537392 containerd[1529]: 2025-09-12 17:52:48.463 [INFO][4324] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.133/32] ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" Sep 12 17:52:48.537392 containerd[1529]: 2025-09-12 17:52:48.463 [INFO][4324] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9712ff81dbd ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" Sep 12 17:52:48.537392 containerd[1529]: 2025-09-12 17:52:48.482 [INFO][4324] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" Sep 
12 17:52:48.537392 containerd[1529]: 2025-09-12 17:52:48.484 [INFO][4324] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"152e3c21-778c-40cc-99db-2838fd26dcb1", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d", Pod:"goldmane-7988f88666-b95h5", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.84.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9712ff81dbd", MAC:"8e:65:94:c5:d5:41", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.537392 containerd[1529]: 2025-09-12 17:52:48.528 [INFO][4324] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" Namespace="calico-system" Pod="goldmane-7988f88666-b95h5" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-goldmane--7988f88666--b95h5-eth0" Sep 12 17:52:48.544257 containerd[1529]: time="2025-09-12T17:52:48.543036753Z" level=info msg="connecting to shim c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538" address="unix:///run/containerd/s/a89b9cf3b89b8e3258a3f7c58d0e8e6520ff0a8cb0e3f9012c27df93545fd49d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:48.672659 systemd[1]: Started cri-containerd-c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538.scope - libcontainer container c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538. 
Sep 12 17:52:48.678278 containerd[1529]: time="2025-09-12T17:52:48.678174025Z" level=info msg="connecting to shim 0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d" address="unix:///run/containerd/s/ec224cb8ebf1d866e1737a1fae604d88997e4bdefb6c04a5488427662eaea336" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:48.701293 systemd-networkd[1442]: calia0710fed5d2: Link UP Sep 12 17:52:48.703042 systemd-networkd[1442]: calia0710fed5d2: Gained carrier Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.055 [INFO][4328] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0 coredns-7c65d6cfc9- kube-system 21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1 804 0 2025-09-12 17:52:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal coredns-7c65d6cfc9-sx6rl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia0710fed5d2 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.057 [INFO][4328] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.319 [INFO][4375] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" HandleID="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.327 [INFO][4375] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" HandleID="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000344ad0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"coredns-7c65d6cfc9-sx6rl", "timestamp":"2025-09-12 17:52:48.319112865 +0000 UTC"}, Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.330 [INFO][4375] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
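Each cali* name that systemd-networkd reports (calia0710fed5d2 just above; cali12cc9bf260b and cali9712ff81dbd earlier) is the host side of a veth pair and appears as an ordinary network interface. One way to confirm the link state from Go, assuming it runs on this node while the interface still exists:

package main

import (
	"fmt"
	"net"
)

func main() {
	iface, err := net.InterfaceByName("calia0710fed5d2") // name taken from the log
	if err != nil {
		fmt.Println("interface not found:", err)
		return
	}
	fmt.Printf("%s: index=%d mtu=%d up=%v\n",
		iface.Name, iface.Index, iface.MTU, iface.Flags&net.FlagUp != 0)
}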
Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.455 [INFO][4375] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.456 [INFO][4375] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.484 [INFO][4375] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.522 [INFO][4375] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.545 [INFO][4375] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.556 [INFO][4375] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.565 [INFO][4375] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.565 [INFO][4375] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.571 [INFO][4375] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031 Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.605 [INFO][4375] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 handle="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.647 [INFO][4375] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.134/26] block=192.168.84.128/26 handle="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.649 [INFO][4375] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.134/26] handle="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.649 [INFO][4375] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:52:48.789009 containerd[1529]: 2025-09-12 17:52:48.649 [INFO][4375] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.134/26] IPv6=[] ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" HandleID="k8s-pod-network.cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" Sep 12 17:52:48.791768 containerd[1529]: 2025-09-12 17:52:48.673 [INFO][4328] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"coredns-7c65d6cfc9-sx6rl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.84.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia0710fed5d2", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.791768 containerd[1529]: 2025-09-12 17:52:48.683 [INFO][4328] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.134/32] ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" Sep 12 17:52:48.791768 containerd[1529]: 2025-09-12 17:52:48.684 [INFO][4328] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia0710fed5d2 ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" Sep 12 17:52:48.791768 containerd[1529]: 
2025-09-12 17:52:48.706 [INFO][4328] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" Sep 12 17:52:48.791768 containerd[1529]: 2025-09-12 17:52:48.710 [INFO][4328] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031", Pod:"coredns-7c65d6cfc9-sx6rl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.84.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia0710fed5d2", MAC:"9a:10:25:6a:26:22", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:48.791768 containerd[1529]: 2025-09-12 17:52:48.753 [INFO][4328] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" Namespace="kube-system" Pod="coredns-7c65d6cfc9-sx6rl" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--sx6rl-eth0" Sep 12 17:52:48.834884 containerd[1529]: time="2025-09-12T17:52:48.831632733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-l4q6p,Uid:cf5b145f-168a-4257-97e2-19a035789165,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:52:48.937478 systemd[1]: Started cri-containerd-0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d.scope - libcontainer container 
0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d. Sep 12 17:52:48.966661 containerd[1529]: time="2025-09-12T17:52:48.966500346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-wfz7h,Uid:fe59d7fb-6f3b-47e8-b60f-2151a307efdf,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7\"" Sep 12 17:52:49.009191 containerd[1529]: time="2025-09-12T17:52:49.009127566Z" level=info msg="connecting to shim cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031" address="unix:///run/containerd/s/a8ac974c2520a67fca708c475e0cb3d791716a69e260f0633cf8093219f43ecf" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:49.072719 containerd[1529]: time="2025-09-12T17:52:49.072476586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c5659d975-2sszf,Uid:4480fbb5-84c1-47c0-b885-79f7e266510e,Namespace:calico-system,Attempt:0,} returns sandbox id \"c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538\"" Sep 12 17:52:49.157446 systemd[1]: Started cri-containerd-cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031.scope - libcontainer container cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031. Sep 12 17:52:49.203449 systemd-networkd[1442]: calia48d9eb0429: Gained IPv6LL Sep 12 17:52:49.256795 containerd[1529]: time="2025-09-12T17:52:49.256688315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-b95h5,Uid:152e3c21-778c-40cc-99db-2838fd26dcb1,Namespace:calico-system,Attempt:0,} returns sandbox id \"0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d\"" Sep 12 17:52:49.384755 containerd[1529]: time="2025-09-12T17:52:49.384693546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-sx6rl,Uid:21d3d5c8-7c4c-4959-b54f-6c288eb5a5e1,Namespace:kube-system,Attempt:0,} returns sandbox id \"cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031\"" Sep 12 17:52:49.397210 containerd[1529]: time="2025-09-12T17:52:49.397155917Z" level=info msg="CreateContainer within sandbox \"cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:52:49.420156 containerd[1529]: time="2025-09-12T17:52:49.420058856Z" level=info msg="Container 1f0d3d476b52b96f172a1b658a23c243e5eec7e5ab57383c04a154c8fb351e30: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:49.423550 systemd-networkd[1442]: cali3644f752cff: Link UP Sep 12 17:52:49.427036 systemd-networkd[1442]: cali3644f752cff: Gained carrier Sep 12 17:52:49.469658 containerd[1529]: time="2025-09-12T17:52:49.469500858Z" level=info msg="CreateContainer within sandbox \"cf858b3500c55893961664286afb90d3bd946bebe049a334fa32621fa747c031\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1f0d3d476b52b96f172a1b658a23c243e5eec7e5ab57383c04a154c8fb351e30\"" Sep 12 17:52:49.471562 containerd[1529]: time="2025-09-12T17:52:49.471524627Z" level=info msg="StartContainer for \"1f0d3d476b52b96f172a1b658a23c243e5eec7e5ab57383c04a154c8fb351e30\"" Sep 12 17:52:49.475302 containerd[1529]: time="2025-09-12T17:52:49.475240856Z" level=info msg="connecting to shim 1f0d3d476b52b96f172a1b658a23c243e5eec7e5ab57383c04a154c8fb351e30" address="unix:///run/containerd/s/a8ac974c2520a67fca708c475e0cb3d791716a69e260f0633cf8093219f43ecf" protocol=ttrpc version=3 Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.080 [INFO][4544] 
cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0 calico-apiserver-6b55c6c674- calico-apiserver cf5b145f-168a-4257-97e2-19a035789165 808 0 2025-09-12 17:52:18 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b55c6c674 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal calico-apiserver-6b55c6c674-l4q6p eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3644f752cff [] [] }} ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.081 [INFO][4544] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.263 [INFO][4607] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" HandleID="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.265 [INFO][4607] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" HandleID="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0005c6200), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"calico-apiserver-6b55c6c674-l4q6p", "timestamp":"2025-09-12 17:52:49.263526948 +0000 UTC"}, Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.265 [INFO][4607] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.265 [INFO][4607] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.266 [INFO][4607] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.306 [INFO][4607] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.331 [INFO][4607] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.346 [INFO][4607] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.351 [INFO][4607] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.361 [INFO][4607] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.363 [INFO][4607] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.371 [INFO][4607] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.389 [INFO][4607] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 handle="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.414 [INFO][4607] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.135/26] block=192.168.84.128/26 handle="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.415 [INFO][4607] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.135/26] handle="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.415 [INFO][4607] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:52:49.506798 containerd[1529]: 2025-09-12 17:52:49.415 [INFO][4607] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.135/26] IPv6=[] ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" HandleID="k8s-pod-network.19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" Sep 12 17:52:49.519836 containerd[1529]: 2025-09-12 17:52:49.419 [INFO][4544] cni-plugin/k8s.go 418: Populated endpoint ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0", GenerateName:"calico-apiserver-6b55c6c674-", Namespace:"calico-apiserver", SelfLink:"", UID:"cf5b145f-168a-4257-97e2-19a035789165", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b55c6c674", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"calico-apiserver-6b55c6c674-l4q6p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.84.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3644f752cff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:49.519836 containerd[1529]: 2025-09-12 17:52:49.419 [INFO][4544] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.135/32] ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" Sep 12 17:52:49.519836 containerd[1529]: 2025-09-12 17:52:49.419 [INFO][4544] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3644f752cff ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" Sep 12 17:52:49.519836 containerd[1529]: 2025-09-12 17:52:49.425 [INFO][4544] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" 
Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" Sep 12 17:52:49.519836 containerd[1529]: 2025-09-12 17:52:49.428 [INFO][4544] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0", GenerateName:"calico-apiserver-6b55c6c674-", Namespace:"calico-apiserver", SelfLink:"", UID:"cf5b145f-168a-4257-97e2-19a035789165", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b55c6c674", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d", Pod:"calico-apiserver-6b55c6c674-l4q6p", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.84.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3644f752cff", MAC:"ca:10:5a:29:39:f1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:49.519836 containerd[1529]: 2025-09-12 17:52:49.485 [INFO][4544] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" Namespace="calico-apiserver" Pod="calico-apiserver-6b55c6c674-l4q6p" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-calico--apiserver--6b55c6c674--l4q6p-eth0" Sep 12 17:52:49.523248 systemd-networkd[1442]: cali9712ff81dbd: Gained IPv6LL Sep 12 17:52:49.625648 systemd[1]: Started cri-containerd-1f0d3d476b52b96f172a1b658a23c243e5eec7e5ab57383c04a154c8fb351e30.scope - libcontainer container 1f0d3d476b52b96f172a1b658a23c243e5eec7e5ab57383c04a154c8fb351e30. 
Sep 12 17:52:49.668563 containerd[1529]: time="2025-09-12T17:52:49.668451943Z" level=info msg="connecting to shim 19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d" address="unix:///run/containerd/s/a65e9cd57d2ee2c96493939e757628c673eb5beaeb828c8a80c97f2f2eff7c55" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:49.735526 containerd[1529]: time="2025-09-12T17:52:49.735194917Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:49.740264 containerd[1529]: time="2025-09-12T17:52:49.740176073Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 17:52:49.743894 containerd[1529]: time="2025-09-12T17:52:49.743781198Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:49.754879 containerd[1529]: time="2025-09-12T17:52:49.754808010Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:49.762522 containerd[1529]: time="2025-09-12T17:52:49.762445534Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.285253498s" Sep 12 17:52:49.762801 containerd[1529]: time="2025-09-12T17:52:49.762764337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 17:52:49.768124 containerd[1529]: time="2025-09-12T17:52:49.768044779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:52:49.777518 containerd[1529]: time="2025-09-12T17:52:49.777367836Z" level=info msg="CreateContainer within sandbox \"87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 17:52:49.781375 containerd[1529]: time="2025-09-12T17:52:49.779768569Z" level=info msg="StartContainer for \"1f0d3d476b52b96f172a1b658a23c243e5eec7e5ab57383c04a154c8fb351e30\" returns successfully" Sep 12 17:52:49.790252 systemd[1]: Started cri-containerd-19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d.scope - libcontainer container 19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d. 
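Rough arithmetic on the calico/csi pull reported above: the progress entry shows 8,760,527 bytes read and the "Pulled image" entry reports 2.285253498s (the separate "size 10253230" figure is also reported; which of the two corresponds to bytes on the wire is not stated, so treating bytes-read as the transferred amount is an assumption and the rate below is only indicative):

package main

import (
	"fmt"
	"time"
)

func main() {
	const bytesRead = 8760527.0       // "bytes read" from the pull-progress entry
	d := 2285253498 * time.Nanosecond // "in 2.285253498s" from the Pulled entry
	mib := bytesRead / (1 << 20)
	fmt.Printf("%.2f MiB in %s ≈ %.2f MiB/s\n", mib, d, mib/d.Seconds())
}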
Sep 12 17:52:49.810786 containerd[1529]: time="2025-09-12T17:52:49.809057202Z" level=info msg="Container 5e481ea5ec3fd036f2b6afdf087e40f197d4ff80d6d6808c2a00356baa73e0a8: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:49.837979 containerd[1529]: time="2025-09-12T17:52:49.837769346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khtc4,Uid:0ef63deb-d966-4b37-8b2c-5c53ee82839d,Namespace:kube-system,Attempt:0,}" Sep 12 17:52:49.849503 containerd[1529]: time="2025-09-12T17:52:49.849442551Z" level=info msg="CreateContainer within sandbox \"87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5e481ea5ec3fd036f2b6afdf087e40f197d4ff80d6d6808c2a00356baa73e0a8\"" Sep 12 17:52:49.853419 containerd[1529]: time="2025-09-12T17:52:49.853103719Z" level=info msg="StartContainer for \"5e481ea5ec3fd036f2b6afdf087e40f197d4ff80d6d6808c2a00356baa73e0a8\"" Sep 12 17:52:49.866172 containerd[1529]: time="2025-09-12T17:52:49.866114213Z" level=info msg="connecting to shim 5e481ea5ec3fd036f2b6afdf087e40f197d4ff80d6d6808c2a00356baa73e0a8" address="unix:///run/containerd/s/694455fca1f0ccb7ff44f7ab55fbc06d01c94d070afaa85c96c07e8e97f6f0bf" protocol=ttrpc version=3 Sep 12 17:52:49.930540 systemd[1]: Started cri-containerd-5e481ea5ec3fd036f2b6afdf087e40f197d4ff80d6d6808c2a00356baa73e0a8.scope - libcontainer container 5e481ea5ec3fd036f2b6afdf087e40f197d4ff80d6d6808c2a00356baa73e0a8. Sep 12 17:52:50.135675 containerd[1529]: time="2025-09-12T17:52:50.135621878Z" level=info msg="StartContainer for \"5e481ea5ec3fd036f2b6afdf087e40f197d4ff80d6d6808c2a00356baa73e0a8\" returns successfully" Sep 12 17:52:50.163068 systemd-networkd[1442]: cali12cc9bf260b: Gained IPv6LL Sep 12 17:52:50.168715 systemd-networkd[1442]: cali8110abdd33a: Link UP Sep 12 17:52:50.173725 systemd-networkd[1442]: cali8110abdd33a: Gained carrier Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:49.968 [INFO][4719] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0 coredns-7c65d6cfc9- kube-system 0ef63deb-d966-4b37-8b2c-5c53ee82839d 807 0 2025-09-12 17:52:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal coredns-7c65d6cfc9-khtc4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8110abdd33a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:49.969 [INFO][4719] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.054 [INFO][4755] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" HandleID="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.055 [INFO][4755] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" HandleID="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032d4b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", "pod":"coredns-7c65d6cfc9-khtc4", "timestamp":"2025-09-12 17:52:50.054722686 +0000 UTC"}, Hostname:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.055 [INFO][4755] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.055 [INFO][4755] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.055 [INFO][4755] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal' Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.068 [INFO][4755] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.078 [INFO][4755] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.089 [INFO][4755] ipam/ipam.go 511: Trying affinity for 192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.096 [INFO][4755] ipam/ipam.go 158: Attempting to load block cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.102 [INFO][4755] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.84.128/26 host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.102 [INFO][4755] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.84.128/26 handle="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.106 [INFO][4755] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0 Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.123 [INFO][4755] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.84.128/26 
handle="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.138 [INFO][4755] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.84.136/26] block=192.168.84.128/26 handle="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.138 [INFO][4755] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.84.136/26] handle="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" host="ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal" Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.139 [INFO][4755] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:52:50.217897 containerd[1529]: 2025-09-12 17:52:50.139 [INFO][4755] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.84.136/26] IPv6=[] ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" HandleID="k8s-pod-network.b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Workload="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" Sep 12 17:52:50.220873 containerd[1529]: 2025-09-12 17:52:50.148 [INFO][4719] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0ef63deb-d966-4b37-8b2c-5c53ee82839d", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"", Pod:"coredns-7c65d6cfc9-khtc4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.84.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8110abdd33a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, 
AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:50.220873 containerd[1529]: 2025-09-12 17:52:50.148 [INFO][4719] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.84.136/32] ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" Sep 12 17:52:50.220873 containerd[1529]: 2025-09-12 17:52:50.148 [INFO][4719] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8110abdd33a ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" Sep 12 17:52:50.220873 containerd[1529]: 2025-09-12 17:52:50.177 [INFO][4719] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" Sep 12 17:52:50.220873 containerd[1529]: 2025-09-12 17:52:50.180 [INFO][4719] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"0ef63deb-d966-4b37-8b2c-5c53ee82839d", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 52, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-485698b0dd1d62c7d33c.c.flatcar-212911.internal", ContainerID:"b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0", Pod:"coredns-7c65d6cfc9-khtc4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.84.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8110abdd33a", MAC:"96:31:61:9b:a7:ec", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, 
HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:52:50.220873 containerd[1529]: 2025-09-12 17:52:50.206 [INFO][4719] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-khtc4" WorkloadEndpoint="ci--4426--1--0--485698b0dd1d62c7d33c.c.flatcar--212911.internal-k8s-coredns--7c65d6cfc9--khtc4-eth0" Sep 12 17:52:50.223034 kubelet[2758]: I0912 17:52:50.218378 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-sx6rl" podStartSLOduration=44.218353158 podStartE2EDuration="44.218353158s" podCreationTimestamp="2025-09-12 17:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:52:50.21791786 +0000 UTC m=+49.593683295" watchObservedRunningTime="2025-09-12 17:52:50.218353158 +0000 UTC m=+49.594118588" Sep 12 17:52:50.283789 containerd[1529]: time="2025-09-12T17:52:50.281572505Z" level=info msg="connecting to shim b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0" address="unix:///run/containerd/s/99bef1b7207fc1e3e1e5cc031114875224d107f9d8da6c82420c9d98ed148f40" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:52:50.358114 systemd[1]: Started cri-containerd-b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0.scope - libcontainer container b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0. Sep 12 17:52:50.420023 containerd[1529]: time="2025-09-12T17:52:50.419732315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b55c6c674-l4q6p,Uid:cf5b145f-168a-4257-97e2-19a035789165,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d\"" Sep 12 17:52:50.420381 systemd-networkd[1442]: calia0710fed5d2: Gained IPv6LL Sep 12 17:52:50.536525 containerd[1529]: time="2025-09-12T17:52:50.536341602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-khtc4,Uid:0ef63deb-d966-4b37-8b2c-5c53ee82839d,Namespace:kube-system,Attempt:0,} returns sandbox id \"b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0\"" Sep 12 17:52:50.544498 containerd[1529]: time="2025-09-12T17:52:50.544402891Z" level=info msg="CreateContainer within sandbox \"b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:52:50.565479 containerd[1529]: time="2025-09-12T17:52:50.565419891Z" level=info msg="Container 16d5fd30d27862c0f4b6e70699f6f828be08566effc5cc64e327b84b805b2998: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:50.583998 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount401328466.mount: Deactivated successfully. 
Sep 12 17:52:50.588691 containerd[1529]: time="2025-09-12T17:52:50.588526173Z" level=info msg="CreateContainer within sandbox \"b05edd4388672bb3984fab0b3baaf31303b0474eb9de7c333648c8b180bb23f0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"16d5fd30d27862c0f4b6e70699f6f828be08566effc5cc64e327b84b805b2998\"" Sep 12 17:52:50.591659 containerd[1529]: time="2025-09-12T17:52:50.591546886Z" level=info msg="StartContainer for \"16d5fd30d27862c0f4b6e70699f6f828be08566effc5cc64e327b84b805b2998\"" Sep 12 17:52:50.598489 containerd[1529]: time="2025-09-12T17:52:50.598242096Z" level=info msg="connecting to shim 16d5fd30d27862c0f4b6e70699f6f828be08566effc5cc64e327b84b805b2998" address="unix:///run/containerd/s/99bef1b7207fc1e3e1e5cc031114875224d107f9d8da6c82420c9d98ed148f40" protocol=ttrpc version=3 Sep 12 17:52:50.652201 systemd[1]: Started cri-containerd-16d5fd30d27862c0f4b6e70699f6f828be08566effc5cc64e327b84b805b2998.scope - libcontainer container 16d5fd30d27862c0f4b6e70699f6f828be08566effc5cc64e327b84b805b2998. Sep 12 17:52:50.730236 containerd[1529]: time="2025-09-12T17:52:50.729700412Z" level=info msg="StartContainer for \"16d5fd30d27862c0f4b6e70699f6f828be08566effc5cc64e327b84b805b2998\" returns successfully" Sep 12 17:52:50.868703 systemd-networkd[1442]: cali3644f752cff: Gained IPv6LL Sep 12 17:52:51.251321 systemd-networkd[1442]: cali8110abdd33a: Gained IPv6LL Sep 12 17:52:51.257970 kubelet[2758]: I0912 17:52:51.257307 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-khtc4" podStartSLOduration=45.257279022 podStartE2EDuration="45.257279022s" podCreationTimestamp="2025-09-12 17:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:52:51.25348973 +0000 UTC m=+50.629255163" watchObservedRunningTime="2025-09-12 17:52:51.257279022 +0000 UTC m=+50.633044456" Sep 12 17:52:52.138749 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2440298455.mount: Deactivated successfully. 
Sep 12 17:52:52.157167 containerd[1529]: time="2025-09-12T17:52:52.157100866Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:52.158686 containerd[1529]: time="2025-09-12T17:52:52.158479174Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 17:52:52.159790 containerd[1529]: time="2025-09-12T17:52:52.159743691Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:52.163320 containerd[1529]: time="2025-09-12T17:52:52.163250239Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:52.164697 containerd[1529]: time="2025-09-12T17:52:52.164090751Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.39599186s" Sep 12 17:52:52.164697 containerd[1529]: time="2025-09-12T17:52:52.164143499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 17:52:52.165896 containerd[1529]: time="2025-09-12T17:52:52.165835439Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:52:52.168009 containerd[1529]: time="2025-09-12T17:52:52.167955527Z" level=info msg="CreateContainer within sandbox \"f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:52:52.177896 containerd[1529]: time="2025-09-12T17:52:52.177108188Z" level=info msg="Container 9176dd6118154d61331346b2fc90780ad2c751f392722b7c14808e3490e0f3e1: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:52.191888 containerd[1529]: time="2025-09-12T17:52:52.191811671Z" level=info msg="CreateContainer within sandbox \"f19271044f4ef5c7322b12982134eec4a437076330ecbf6102d9e8c9f001c7b0\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"9176dd6118154d61331346b2fc90780ad2c751f392722b7c14808e3490e0f3e1\"" Sep 12 17:52:52.194808 containerd[1529]: time="2025-09-12T17:52:52.194751812Z" level=info msg="StartContainer for \"9176dd6118154d61331346b2fc90780ad2c751f392722b7c14808e3490e0f3e1\"" Sep 12 17:52:52.197065 containerd[1529]: time="2025-09-12T17:52:52.197027091Z" level=info msg="connecting to shim 9176dd6118154d61331346b2fc90780ad2c751f392722b7c14808e3490e0f3e1" address="unix:///run/containerd/s/7a3184055ba44494f014fec95c682b6e70f826425d76d8bc0a2e79cf6b3a7189" protocol=ttrpc version=3 Sep 12 17:52:52.237145 systemd[1]: Started cri-containerd-9176dd6118154d61331346b2fc90780ad2c751f392722b7c14808e3490e0f3e1.scope - libcontainer container 9176dd6118154d61331346b2fc90780ad2c751f392722b7c14808e3490e0f3e1. 
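The whisker-backend pull above is bracketed by two containerd figures: "bytes read=33085545" on the final "stop pulling image" event and "in 2.39599186s" on the "Pulled image" summary. A very rough, hedged estimate of effective pull throughput from just those two logged numbers (they are containerd's own counters, so this is a back-of-the-envelope figure, not a measured network rate):

```go
// Rough throughput estimate from the two figures containerd logs for the
// whisker-backend pull above: "bytes read=33085545" and "in 2.39599186s".
// This is only a back-of-the-envelope check, not a containerd API call.
package main

import "fmt"

func main() {
	const bytesRead = 33085545     // from the "stop pulling image" line
	const pullSeconds = 2.39599186 // from the "Pulled image ... in 2.39599186s" line

	mbPerSec := float64(bytesRead) / pullSeconds / 1e6
	fmt.Printf("approx. %.1f MB/s over the pull\n", mbPerSec) // ~13.8 MB/s
}
```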
Sep 12 17:52:52.324081 containerd[1529]: time="2025-09-12T17:52:52.323443355Z" level=info msg="StartContainer for \"9176dd6118154d61331346b2fc90780ad2c751f392722b7c14808e3490e0f3e1\" returns successfully" Sep 12 17:52:53.264916 kubelet[2758]: I0912 17:52:53.264538 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5696b6f678-swglv" podStartSLOduration=3.573924233 podStartE2EDuration="9.264513641s" podCreationTimestamp="2025-09-12 17:52:44 +0000 UTC" firstStartedPulling="2025-09-12 17:52:46.475094399 +0000 UTC m=+45.850859815" lastFinishedPulling="2025-09-12 17:52:52.165683809 +0000 UTC m=+51.541449223" observedRunningTime="2025-09-12 17:52:53.264512854 +0000 UTC m=+52.640278284" watchObservedRunningTime="2025-09-12 17:52:53.264513641 +0000 UTC m=+52.640279071" Sep 12 17:52:53.778892 ntpd[1482]: Listen normally on 8 vxlan.calico 192.168.84.128:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 8 vxlan.calico 192.168.84.128:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 9 vxlan.calico [fe80::647b:bcff:fe0e:3738%4]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 10 cali9ddd9fb0963 [fe80::ecee:eeff:feee:eeee%7]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 11 calia49b76c7ac1 [fe80::ecee:eeff:feee:eeee%8]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 12 calia48d9eb0429 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 13 cali12cc9bf260b [fe80::ecee:eeff:feee:eeee%10]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 14 cali9712ff81dbd [fe80::ecee:eeff:feee:eeee%11]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 15 calia0710fed5d2 [fe80::ecee:eeff:feee:eeee%12]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 16 cali3644f752cff [fe80::ecee:eeff:feee:eeee%13]:123 Sep 12 17:52:53.780368 ntpd[1482]: 12 Sep 17:52:53 ntpd[1482]: Listen normally on 17 cali8110abdd33a [fe80::ecee:eeff:feee:eeee%14]:123 Sep 12 17:52:53.779537 ntpd[1482]: Listen normally on 9 vxlan.calico [fe80::647b:bcff:fe0e:3738%4]:123 Sep 12 17:52:53.779629 ntpd[1482]: Listen normally on 10 cali9ddd9fb0963 [fe80::ecee:eeff:feee:eeee%7]:123 Sep 12 17:52:53.779672 ntpd[1482]: Listen normally on 11 calia49b76c7ac1 [fe80::ecee:eeff:feee:eeee%8]:123 Sep 12 17:52:53.779713 ntpd[1482]: Listen normally on 12 calia48d9eb0429 [fe80::ecee:eeff:feee:eeee%9]:123 Sep 12 17:52:53.779755 ntpd[1482]: Listen normally on 13 cali12cc9bf260b [fe80::ecee:eeff:feee:eeee%10]:123 Sep 12 17:52:53.779793 ntpd[1482]: Listen normally on 14 cali9712ff81dbd [fe80::ecee:eeff:feee:eeee%11]:123 Sep 12 17:52:53.779828 ntpd[1482]: Listen normally on 15 calia0710fed5d2 [fe80::ecee:eeff:feee:eeee%12]:123 Sep 12 17:52:53.779888 ntpd[1482]: Listen normally on 16 cali3644f752cff [fe80::ecee:eeff:feee:eeee%13]:123 Sep 12 17:52:53.779941 ntpd[1482]: Listen normally on 17 cali8110abdd33a [fe80::ecee:eeff:feee:eeee%14]:123 Sep 12 17:52:54.812341 containerd[1529]: time="2025-09-12T17:52:54.812252287Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:54.814104 containerd[1529]: time="2025-09-12T17:52:54.814051497Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 17:52:54.815264 containerd[1529]: time="2025-09-12T17:52:54.814544088Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:54.817934 containerd[1529]: time="2025-09-12T17:52:54.817892336Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:54.819734 containerd[1529]: time="2025-09-12T17:52:54.819409331Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.653346771s" Sep 12 17:52:54.819734 containerd[1529]: time="2025-09-12T17:52:54.819458433Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:52:54.822808 containerd[1529]: time="2025-09-12T17:52:54.822758418Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 17:52:54.826424 containerd[1529]: time="2025-09-12T17:52:54.826204134Z" level=info msg="CreateContainer within sandbox \"307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:52:54.839884 containerd[1529]: time="2025-09-12T17:52:54.839083609Z" level=info msg="Container 055cde9e4149ba2bc1cc33bdd0e0bbb68f2b6eda333e3eb7da35bb5dc25614aa: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:54.853311 containerd[1529]: time="2025-09-12T17:52:54.853258906Z" level=info msg="CreateContainer within sandbox \"307f7c72fd4269d9ca5e790bd3c8f2cd80094914db6728f359e760602c2eb4a7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"055cde9e4149ba2bc1cc33bdd0e0bbb68f2b6eda333e3eb7da35bb5dc25614aa\"" Sep 12 17:52:54.854665 containerd[1529]: time="2025-09-12T17:52:54.854616321Z" level=info msg="StartContainer for \"055cde9e4149ba2bc1cc33bdd0e0bbb68f2b6eda333e3eb7da35bb5dc25614aa\"" Sep 12 17:52:54.857180 containerd[1529]: time="2025-09-12T17:52:54.857134465Z" level=info msg="connecting to shim 055cde9e4149ba2bc1cc33bdd0e0bbb68f2b6eda333e3eb7da35bb5dc25614aa" address="unix:///run/containerd/s/15b1df06340f8f2952bfabfb1095a6df9c78ea56de879aa988262422a829ba0e" protocol=ttrpc version=3 Sep 12 17:52:54.910247 systemd[1]: Started cri-containerd-055cde9e4149ba2bc1cc33bdd0e0bbb68f2b6eda333e3eb7da35bb5dc25614aa.scope - libcontainer container 055cde9e4149ba2bc1cc33bdd0e0bbb68f2b6eda333e3eb7da35bb5dc25614aa. 
Sep 12 17:52:54.986185 containerd[1529]: time="2025-09-12T17:52:54.986040621Z" level=info msg="StartContainer for \"055cde9e4149ba2bc1cc33bdd0e0bbb68f2b6eda333e3eb7da35bb5dc25614aa\" returns successfully" Sep 12 17:52:57.436175 kubelet[2758]: I0912 17:52:57.435827 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b55c6c674-wfz7h" podStartSLOduration=33.606356321 podStartE2EDuration="39.435797225s" podCreationTimestamp="2025-09-12 17:52:18 +0000 UTC" firstStartedPulling="2025-09-12 17:52:48.992542003 +0000 UTC m=+48.368307414" lastFinishedPulling="2025-09-12 17:52:54.821982891 +0000 UTC m=+54.197748318" observedRunningTime="2025-09-12 17:52:55.281020583 +0000 UTC m=+54.656786039" watchObservedRunningTime="2025-09-12 17:52:57.435797225 +0000 UTC m=+56.811562656" Sep 12 17:52:58.510077 containerd[1529]: time="2025-09-12T17:52:58.509825507Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:58.511054 containerd[1529]: time="2025-09-12T17:52:58.511014648Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 17:52:58.512730 containerd[1529]: time="2025-09-12T17:52:58.512687754Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:58.516747 containerd[1529]: time="2025-09-12T17:52:58.516702562Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:52:58.518971 containerd[1529]: time="2025-09-12T17:52:58.518930077Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.696131827s" Sep 12 17:52:58.519086 containerd[1529]: time="2025-09-12T17:52:58.518977087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 17:52:58.521492 containerd[1529]: time="2025-09-12T17:52:58.520594344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:52:58.556473 containerd[1529]: time="2025-09-12T17:52:58.555994368Z" level=info msg="CreateContainer within sandbox \"c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 17:52:58.571884 containerd[1529]: time="2025-09-12T17:52:58.568589607Z" level=info msg="Container 54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:52:58.582669 containerd[1529]: time="2025-09-12T17:52:58.582595923Z" level=info msg="CreateContainer within sandbox \"c8d3ba23a5e0d9b10ffcf90ae2d3da4fadf424e84c49835eaf0d6368d05a4538\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\"" Sep 12 
17:52:58.584962 containerd[1529]: time="2025-09-12T17:52:58.584285025Z" level=info msg="StartContainer for \"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\"" Sep 12 17:52:58.586281 containerd[1529]: time="2025-09-12T17:52:58.586209533Z" level=info msg="connecting to shim 54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994" address="unix:///run/containerd/s/a89b9cf3b89b8e3258a3f7c58d0e8e6520ff0a8cb0e3f9012c27df93545fd49d" protocol=ttrpc version=3 Sep 12 17:52:58.628155 systemd[1]: Started cri-containerd-54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994.scope - libcontainer container 54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994. Sep 12 17:52:58.707656 containerd[1529]: time="2025-09-12T17:52:58.707505034Z" level=info msg="StartContainer for \"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\" returns successfully" Sep 12 17:52:59.305443 kubelet[2758]: I0912 17:52:59.305322 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6c5659d975-2sszf" podStartSLOduration=24.879181237 podStartE2EDuration="34.305295751s" podCreationTimestamp="2025-09-12 17:52:25 +0000 UTC" firstStartedPulling="2025-09-12 17:52:49.09421943 +0000 UTC m=+48.469984842" lastFinishedPulling="2025-09-12 17:52:58.520333936 +0000 UTC m=+57.896099356" observedRunningTime="2025-09-12 17:52:59.303156707 +0000 UTC m=+58.678922128" watchObservedRunningTime="2025-09-12 17:52:59.305295751 +0000 UTC m=+58.681061182" Sep 12 17:52:59.396275 containerd[1529]: time="2025-09-12T17:52:59.396206681Z" level=info msg="TaskExit event in podsandbox handler container_id:\"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\" id:\"16105487ea075570f58dbe59de6ffabf1c2e9aa7188abe25e5816f6e1d9d8077\" pid:5039 exited_at:{seconds:1757699579 nanos:394221174}" Sep 12 17:53:01.024632 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount198751304.mount: Deactivated successfully. 
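The kubelet pod_startup_latency_tracker entries above report two durations per pod: podStartE2EDuration and podStartSLOduration. From the logged values, the SLO figure is the end-to-end figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling); this is an observation about the printed numbers, not a claim about kubelet internals. A small sketch reproducing the calico-kube-controllers figures from the line above:

```go
// Reproduce podStartSLOduration from the calico-kube-controllers line above.
// Subtracting the image-pull window from the end-to-end duration, using the
// monotonic offsets (the "m=+..." values), reproduces the logged figure exactly.
package main

import "fmt"

func main() {
	const (
		e2e           = 34.305295751 // podStartE2EDuration, seconds
		firstPullMono = 48.469984842 // firstStartedPulling m=+48.469984842
		lastPullMono  = 57.896099356 // lastFinishedPulling m=+57.896099356
	)

	pull := lastPullMono - firstPullMono
	fmt.Printf("image-pull window: %.9fs\n", pull)     // 9.426114514s
	fmt.Printf("E2E - pull:        %.9fs\n", e2e-pull) // 24.879181237s == podStartSLOduration
}
```

Doing the same subtraction with the wall-clock timestamps instead of the m=+ offsets gives 24.879181245s, i.e. the same value to within the rounding of the printed fractions.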
Sep 12 17:53:01.887846 containerd[1529]: time="2025-09-12T17:53:01.887768913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:53:01.890022 containerd[1529]: time="2025-09-12T17:53:01.889913642Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 17:53:01.891727 containerd[1529]: time="2025-09-12T17:53:01.891636016Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:53:01.895242 containerd[1529]: time="2025-09-12T17:53:01.895143976Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:53:01.896449 containerd[1529]: time="2025-09-12T17:53:01.896268583Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.37562888s" Sep 12 17:53:01.896449 containerd[1529]: time="2025-09-12T17:53:01.896323962Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 17:53:01.898486 containerd[1529]: time="2025-09-12T17:53:01.898182363Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 17:53:01.901363 containerd[1529]: time="2025-09-12T17:53:01.901313309Z" level=info msg="CreateContainer within sandbox \"0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 17:53:01.913608 containerd[1529]: time="2025-09-12T17:53:01.913561398Z" level=info msg="Container 919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:53:01.925099 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3486056822.mount: Deactivated successfully. Sep 12 17:53:01.932945 containerd[1529]: time="2025-09-12T17:53:01.932891863Z" level=info msg="CreateContainer within sandbox \"0a9c81a4c27664ae9cd0c987ab8d6fa662598024c297a4daad3ddb35d474353d\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\"" Sep 12 17:53:01.934270 containerd[1529]: time="2025-09-12T17:53:01.933963271Z" level=info msg="StartContainer for \"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\"" Sep 12 17:53:01.935997 containerd[1529]: time="2025-09-12T17:53:01.935938860Z" level=info msg="connecting to shim 919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19" address="unix:///run/containerd/s/ec224cb8ebf1d866e1737a1fae604d88997e4bdefb6c04a5488427662eaea336" protocol=ttrpc version=3 Sep 12 17:53:01.974186 systemd[1]: Started cri-containerd-919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19.scope - libcontainer container 919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19. 
Sep 12 17:53:02.053024 containerd[1529]: time="2025-09-12T17:53:02.052975443Z" level=info msg="StartContainer for \"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\" returns successfully" Sep 12 17:53:03.371899 containerd[1529]: time="2025-09-12T17:53:03.370803067Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\" id:\"9578750f3face52f643d0e6747a88ca98a80c425c70f76aaaadc0dcc8f6e2030\" pid:5119 exited_at:{seconds:1757699583 nanos:361938620}" Sep 12 17:53:03.408995 kubelet[2758]: I0912 17:53:03.408905 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-b95h5" podStartSLOduration=26.776931567 podStartE2EDuration="39.408873053s" podCreationTimestamp="2025-09-12 17:52:24 +0000 UTC" firstStartedPulling="2025-09-12 17:52:49.265893326 +0000 UTC m=+48.641658736" lastFinishedPulling="2025-09-12 17:53:01.89783479 +0000 UTC m=+61.273600222" observedRunningTime="2025-09-12 17:53:02.314261283 +0000 UTC m=+61.690026714" watchObservedRunningTime="2025-09-12 17:53:03.408873053 +0000 UTC m=+62.784638480" Sep 12 17:53:03.432220 containerd[1529]: time="2025-09-12T17:53:03.431194942Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:53:03.433809 containerd[1529]: time="2025-09-12T17:53:03.433764468Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 17:53:03.434918 containerd[1529]: time="2025-09-12T17:53:03.434883180Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:53:03.439982 containerd[1529]: time="2025-09-12T17:53:03.439944537Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:53:03.441669 containerd[1529]: time="2025-09-12T17:53:03.441599437Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.543368298s" Sep 12 17:53:03.441971 containerd[1529]: time="2025-09-12T17:53:03.441843761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 17:53:03.444829 containerd[1529]: time="2025-09-12T17:53:03.444788460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:53:03.451644 containerd[1529]: time="2025-09-12T17:53:03.451575650Z" level=info msg="CreateContainer within sandbox \"87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 17:53:03.468347 containerd[1529]: time="2025-09-12T17:53:03.468283469Z" level=info msg="Container dd1fe6aba2cb02996c131551048cffea28cbdf8b31db6bad4d5f0f84582fa226: CDI devices from CRI Config.CDIDevices: []" 
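The TaskExit events above carry exited_at as raw Unix epoch seconds and nanoseconds. A quick standard-library conversion confirms that seconds:1757699583 is 2025-09-12 17:53:03 UTC, which lines up with the surrounding journal timestamps:

```go
// Convert the exited_at field of the TaskExit event above
// (seconds:1757699583 nanos:361938620) back to a wall-clock time and
// check it against the 2025-09-12T17:53:03Z log timestamp.
package main

import (
	"fmt"
	"time"
)

func main() {
	exitedAt := time.Unix(1757699583, 361938620).UTC()
	fmt.Println(exitedAt.Format(time.RFC3339Nano)) // 2025-09-12T17:53:03.36193862Z
}
```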
Sep 12 17:53:03.492548 containerd[1529]: time="2025-09-12T17:53:03.492487504Z" level=info msg="CreateContainer within sandbox \"87236d89952ebf75d2026ad7ec5a2d59947f5cb78c8fac34015f2e28af9194f9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"dd1fe6aba2cb02996c131551048cffea28cbdf8b31db6bad4d5f0f84582fa226\"" Sep 12 17:53:03.493903 containerd[1529]: time="2025-09-12T17:53:03.493683399Z" level=info msg="StartContainer for \"dd1fe6aba2cb02996c131551048cffea28cbdf8b31db6bad4d5f0f84582fa226\"" Sep 12 17:53:03.508998 containerd[1529]: time="2025-09-12T17:53:03.508915659Z" level=info msg="connecting to shim dd1fe6aba2cb02996c131551048cffea28cbdf8b31db6bad4d5f0f84582fa226" address="unix:///run/containerd/s/694455fca1f0ccb7ff44f7ab55fbc06d01c94d070afaa85c96c07e8e97f6f0bf" protocol=ttrpc version=3 Sep 12 17:53:03.568318 systemd[1]: Started cri-containerd-dd1fe6aba2cb02996c131551048cffea28cbdf8b31db6bad4d5f0f84582fa226.scope - libcontainer container dd1fe6aba2cb02996c131551048cffea28cbdf8b31db6bad4d5f0f84582fa226. Sep 12 17:53:03.573027 containerd[1529]: time="2025-09-12T17:53:03.572945357Z" level=info msg="TaskExit event in podsandbox handler container_id:\"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\" id:\"46a26bc4b1644c644e2e89b2b4dd6bb94c71d01e1ed7c212d978d2ee3ca0f0a3\" pid:5144 exit_status:1 exited_at:{seconds:1757699583 nanos:571063254}" Sep 12 17:53:03.640818 containerd[1529]: time="2025-09-12T17:53:03.640116289Z" level=info msg="StartContainer for \"dd1fe6aba2cb02996c131551048cffea28cbdf8b31db6bad4d5f0f84582fa226\" returns successfully" Sep 12 17:53:03.659483 containerd[1529]: time="2025-09-12T17:53:03.659397029Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:53:03.661243 containerd[1529]: time="2025-09-12T17:53:03.661181387Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 17:53:03.663758 containerd[1529]: time="2025-09-12T17:53:03.663712991Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 218.639559ms" Sep 12 17:53:03.663758 containerd[1529]: time="2025-09-12T17:53:03.663758428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:53:03.668455 containerd[1529]: time="2025-09-12T17:53:03.668401253Z" level=info msg="CreateContainer within sandbox \"19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:53:03.680394 containerd[1529]: time="2025-09-12T17:53:03.677196142Z" level=info msg="Container ced9a970c8a3ec357d6ede100beba09c92d6accccb72b2d13e974d189b127d5d: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:53:03.694479 containerd[1529]: time="2025-09-12T17:53:03.694315244Z" level=info msg="CreateContainer within sandbox \"19cf85de27cb6f655869c6c0e6a456a9bc5c9ff9e1b7e0443250e6d2a811903d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"ced9a970c8a3ec357d6ede100beba09c92d6accccb72b2d13e974d189b127d5d\"" Sep 12 17:53:03.697889 containerd[1529]: time="2025-09-12T17:53:03.697787722Z" level=info msg="StartContainer for \"ced9a970c8a3ec357d6ede100beba09c92d6accccb72b2d13e974d189b127d5d\"" Sep 12 17:53:03.699346 containerd[1529]: time="2025-09-12T17:53:03.699255091Z" level=info msg="connecting to shim ced9a970c8a3ec357d6ede100beba09c92d6accccb72b2d13e974d189b127d5d" address="unix:///run/containerd/s/a65e9cd57d2ee2c96493939e757628c673eb5beaeb828c8a80c97f2f2eff7c55" protocol=ttrpc version=3 Sep 12 17:53:03.731129 systemd[1]: Started cri-containerd-ced9a970c8a3ec357d6ede100beba09c92d6accccb72b2d13e974d189b127d5d.scope - libcontainer container ced9a970c8a3ec357d6ede100beba09c92d6accccb72b2d13e974d189b127d5d. Sep 12 17:53:03.801871 containerd[1529]: time="2025-09-12T17:53:03.801702156Z" level=info msg="StartContainer for \"ced9a970c8a3ec357d6ede100beba09c92d6accccb72b2d13e974d189b127d5d\" returns successfully" Sep 12 17:53:03.998547 kubelet[2758]: I0912 17:53:03.998408 2758 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 17:53:03.998547 kubelet[2758]: I0912 17:53:03.998452 2758 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 17:53:04.338253 kubelet[2758]: I0912 17:53:04.337958 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-gxnn6" podStartSLOduration=24.1374336 podStartE2EDuration="40.337919023s" podCreationTimestamp="2025-09-12 17:52:24 +0000 UTC" firstStartedPulling="2025-09-12 17:52:47.244069507 +0000 UTC m=+46.619834928" lastFinishedPulling="2025-09-12 17:53:03.444554923 +0000 UTC m=+62.820320351" observedRunningTime="2025-09-12 17:53:04.33728332 +0000 UTC m=+63.713048754" watchObservedRunningTime="2025-09-12 17:53:04.337919023 +0000 UTC m=+63.713684452" Sep 12 17:53:04.573421 containerd[1529]: time="2025-09-12T17:53:04.573350553Z" level=info msg="TaskExit event in podsandbox handler container_id:\"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\" id:\"70ec0deee04ddcb8f12203911145dbf98164d79bd9f863df8a94cec87d95705b\" pid:5236 exit_status:1 exited_at:{seconds:1757699584 nanos:572951573}" Sep 12 17:53:05.309710 containerd[1529]: time="2025-09-12T17:53:05.309553425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\" id:\"f3de2d7a65cdc7e409f07298b873496b4a81f3b9fa26e028e287c12cc060d397\" pid:5278 exited_at:{seconds:1757699585 nanos:308956415}" Sep 12 17:53:05.316290 kubelet[2758]: I0912 17:53:05.316247 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:53:05.443236 containerd[1529]: time="2025-09-12T17:53:05.443130336Z" level=info msg="TaskExit event in podsandbox handler container_id:\"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\" id:\"82ec8485e28bec7506e7045f99a2237956f2034f01e782411ca6f8cd3e826065\" pid:5272 exit_status:1 exited_at:{seconds:1757699585 nanos:442346745}" Sep 12 17:53:09.623004 systemd[1]: Started sshd@7-10.128.0.19:22-139.178.68.195:56618.service - OpenSSH per-connection server daemon (139.178.68.195:56618). 
Sep 12 17:53:10.041146 sshd[5307]: Accepted publickey for core from 139.178.68.195 port 56618 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:53:10.044265 sshd-session[5307]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:53:10.060689 systemd-logind[1487]: New session 8 of user core. Sep 12 17:53:10.067116 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 17:53:10.495010 sshd[5310]: Connection closed by 139.178.68.195 port 56618 Sep 12 17:53:10.498250 sshd-session[5307]: pam_unix(sshd:session): session closed for user core Sep 12 17:53:10.508134 systemd-logind[1487]: Session 8 logged out. Waiting for processes to exit. Sep 12 17:53:10.509025 systemd[1]: sshd@7-10.128.0.19:22-139.178.68.195:56618.service: Deactivated successfully. Sep 12 17:53:10.515742 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 17:53:10.519075 systemd-logind[1487]: Removed session 8. Sep 12 17:53:15.563788 systemd[1]: Started sshd@8-10.128.0.19:22-139.178.68.195:58028.service - OpenSSH per-connection server daemon (139.178.68.195:58028). Sep 12 17:53:15.970127 sshd[5325]: Accepted publickey for core from 139.178.68.195 port 58028 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:53:15.974311 sshd-session[5325]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:53:15.991351 systemd-logind[1487]: New session 9 of user core. Sep 12 17:53:15.996159 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 17:53:16.438496 containerd[1529]: time="2025-09-12T17:53:16.438243187Z" level=info msg="TaskExit event in podsandbox handler container_id:\"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\" id:\"2eddbe2026bc1e3a8f873f2ac167517c90ec6afb57c01775ce83e96275c5a918\" pid:5339 exited_at:{seconds:1757699596 nanos:436603238}" Sep 12 17:53:16.443040 sshd[5344]: Connection closed by 139.178.68.195 port 58028 Sep 12 17:53:16.446340 sshd-session[5325]: pam_unix(sshd:session): session closed for user core Sep 12 17:53:16.453689 systemd-logind[1487]: Session 9 logged out. Waiting for processes to exit. Sep 12 17:53:16.454689 systemd[1]: sshd@8-10.128.0.19:22-139.178.68.195:58028.service: Deactivated successfully. Sep 12 17:53:16.459742 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 17:53:16.466956 systemd-logind[1487]: Removed session 9. Sep 12 17:53:21.523229 systemd[1]: Started sshd@9-10.128.0.19:22-139.178.68.195:55234.service - OpenSSH per-connection server daemon (139.178.68.195:55234). Sep 12 17:53:21.931029 sshd[5363]: Accepted publickey for core from 139.178.68.195 port 55234 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:53:21.932979 sshd-session[5363]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:53:21.940046 systemd-logind[1487]: New session 10 of user core. Sep 12 17:53:21.947201 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 17:53:22.336745 sshd[5366]: Connection closed by 139.178.68.195 port 55234 Sep 12 17:53:22.339151 sshd-session[5363]: pam_unix(sshd:session): session closed for user core Sep 12 17:53:22.350288 systemd[1]: sshd@9-10.128.0.19:22-139.178.68.195:55234.service: Deactivated successfully. Sep 12 17:53:22.357236 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 17:53:22.363429 systemd-logind[1487]: Session 10 logged out. Waiting for processes to exit. Sep 12 17:53:22.367200 systemd-logind[1487]: Removed session 10. 
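The SSH entries above (and the later sessions through session 20) all follow the same shape: publickey accepted, PAM session opened, systemd-logind announces "New session N of user core", a session-N.scope starts, then the connection closes and the session is removed. Below is a hedged sketch that pairs the open/close logind lines and prints per-session durations; the sample lines are copied from the excerpt, and the timestamp layout ("Sep 12 17:53:10.060689", no year) is an assumption about this journal's prefix format.

```go
// Hedged sketch: pair "New session N" / "Removed session N" journal lines
// (format as in the excerpt above) and print per-session durations.
// The sample lines are copied from the log; the parsing layout assumes the
// "Sep 12 17:53:10.060689" prefix style shown here, with no year.
package main

import (
	"fmt"
	"regexp"
	"time"
)

func main() {
	lines := []string{
		"Sep 12 17:53:10.060689 systemd-logind[1487]: New session 8 of user core.",
		"Sep 12 17:53:10.519075 systemd-logind[1487]: Removed session 8.",
		"Sep 12 17:53:15.991351 systemd-logind[1487]: New session 9 of user core.",
		"Sep 12 17:53:16.466956 systemd-logind[1487]: Removed session 9.",
	}

	re := regexp.MustCompile(`^(\w+ +\d+ [\d:.]+) systemd-logind\[\d+\]: (New|Removed) session (\d+)`)
	const layout = "Jan 2 15:04:05.999999" // journal prefix, year omitted

	opened := map[string]time.Time{}
	for _, l := range lines {
		m := re.FindStringSubmatch(l)
		if m == nil {
			continue
		}
		t, err := time.Parse(layout, m[1])
		if err != nil {
			continue
		}
		switch m[2] {
		case "New":
			opened[m[3]] = t
		case "Removed":
			if start, ok := opened[m[3]]; ok {
				fmt.Printf("session %s lasted %v\n", m[3], t.Sub(start))
			}
		}
	}
}
```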
Sep 12 17:53:22.412042 systemd[1]: Started sshd@10-10.128.0.19:22-139.178.68.195:55240.service - OpenSSH per-connection server daemon (139.178.68.195:55240). Sep 12 17:53:22.835150 sshd[5379]: Accepted publickey for core from 139.178.68.195 port 55240 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:53:22.839786 sshd-session[5379]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:53:22.855977 systemd-logind[1487]: New session 11 of user core. Sep 12 17:53:22.859081 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 17:53:23.357104 sshd[5383]: Connection closed by 139.178.68.195 port 55240 Sep 12 17:53:23.358505 sshd-session[5379]: pam_unix(sshd:session): session closed for user core Sep 12 17:53:23.369286 systemd[1]: sshd@10-10.128.0.19:22-139.178.68.195:55240.service: Deactivated successfully. Sep 12 17:53:23.369599 systemd-logind[1487]: Session 11 logged out. Waiting for processes to exit. Sep 12 17:53:23.376655 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 17:53:23.386891 systemd-logind[1487]: Removed session 11. Sep 12 17:53:23.431109 systemd[1]: Started sshd@11-10.128.0.19:22-139.178.68.195:55242.service - OpenSSH per-connection server daemon (139.178.68.195:55242). Sep 12 17:53:23.860820 sshd[5393]: Accepted publickey for core from 139.178.68.195 port 55242 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:53:23.864780 sshd-session[5393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:53:23.877673 systemd-logind[1487]: New session 12 of user core. Sep 12 17:53:23.885114 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 17:53:24.243344 sshd[5396]: Connection closed by 139.178.68.195 port 55242 Sep 12 17:53:24.244224 sshd-session[5393]: pam_unix(sshd:session): session closed for user core Sep 12 17:53:24.252699 systemd-logind[1487]: Session 12 logged out. Waiting for processes to exit. Sep 12 17:53:24.253628 systemd[1]: sshd@11-10.128.0.19:22-139.178.68.195:55242.service: Deactivated successfully. Sep 12 17:53:24.256812 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 17:53:24.260328 systemd-logind[1487]: Removed session 12. Sep 12 17:53:27.575747 containerd[1529]: time="2025-09-12T17:53:27.575691308Z" level=info msg="TaskExit event in podsandbox handler container_id:\"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\" id:\"7306442272e6c3e869911384608195d059df301ca098799097b2570d08a9c93a\" pid:5425 exited_at:{seconds:1757699607 nanos:575134927}" Sep 12 17:53:29.319647 systemd[1]: Started sshd@12-10.128.0.19:22-139.178.68.195:55258.service - OpenSSH per-connection server daemon (139.178.68.195:55258). Sep 12 17:53:29.739038 sshd[5438]: Accepted publickey for core from 139.178.68.195 port 55258 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:53:29.741728 sshd-session[5438]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:53:29.757392 systemd-logind[1487]: New session 13 of user core. Sep 12 17:53:29.767092 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 17:53:30.158841 sshd[5441]: Connection closed by 139.178.68.195 port 55258 Sep 12 17:53:30.161095 sshd-session[5438]: pam_unix(sshd:session): session closed for user core Sep 12 17:53:30.170807 systemd-logind[1487]: Session 13 logged out. Waiting for processes to exit. 
Sep 12 17:53:30.172220 systemd[1]: sshd@12-10.128.0.19:22-139.178.68.195:55258.service: Deactivated successfully. Sep 12 17:53:30.177488 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 17:53:30.182778 systemd-logind[1487]: Removed session 13. Sep 12 17:53:32.824168 kubelet[2758]: I0912 17:53:32.824117 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:53:33.012723 kubelet[2758]: I0912 17:53:33.012640 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b55c6c674-l4q6p" podStartSLOduration=61.775686765 podStartE2EDuration="1m15.012615469s" podCreationTimestamp="2025-09-12 17:52:18 +0000 UTC" firstStartedPulling="2025-09-12 17:52:50.427697641 +0000 UTC m=+49.803463060" lastFinishedPulling="2025-09-12 17:53:03.664626337 +0000 UTC m=+63.040391764" observedRunningTime="2025-09-12 17:53:04.364839596 +0000 UTC m=+63.740605027" watchObservedRunningTime="2025-09-12 17:53:33.012615469 +0000 UTC m=+92.388380898" Sep 12 17:53:33.375972 containerd[1529]: time="2025-09-12T17:53:33.375656837Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\" id:\"0ce8f4c85b873b341c4804789c6d4cdae20c12e245e847c6862a5f95e1493023\" pid:5472 exit_status:1 exited_at:{seconds:1757699613 nanos:375247710}" Sep 12 17:53:35.239710 systemd[1]: Started sshd@13-10.128.0.19:22-139.178.68.195:52980.service - OpenSSH per-connection server daemon (139.178.68.195:52980). Sep 12 17:53:35.246706 containerd[1529]: time="2025-09-12T17:53:35.246246004Z" level=info msg="TaskExit event in podsandbox handler container_id:\"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\" id:\"92fc4f69f5aa48cf9589f5abe172233ba3d26b1e4ebd928e587b5f38d7532d95\" pid:5501 exited_at:{seconds:1757699615 nanos:245475198}" Sep 12 17:53:35.410733 containerd[1529]: time="2025-09-12T17:53:35.410641425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\" id:\"056ed285b251f6bec79bafa11bf8ddcba02abb296a50e038eb6baa6d37dcd4ab\" pid:5514 exited_at:{seconds:1757699615 nanos:408956740}" Sep 12 17:53:35.683013 sshd[5523]: Accepted publickey for core from 139.178.68.195 port 52980 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4 Sep 12 17:53:35.686055 sshd-session[5523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:53:35.698302 systemd-logind[1487]: New session 14 of user core. Sep 12 17:53:35.706185 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 17:53:36.106885 sshd[5533]: Connection closed by 139.178.68.195 port 52980 Sep 12 17:53:36.107762 sshd-session[5523]: pam_unix(sshd:session): session closed for user core Sep 12 17:53:36.116205 systemd-logind[1487]: Session 14 logged out. Waiting for processes to exit. Sep 12 17:53:36.117118 systemd[1]: sshd@13-10.128.0.19:22-139.178.68.195:52980.service: Deactivated successfully. Sep 12 17:53:36.123961 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 17:53:36.129389 systemd-logind[1487]: Removed session 14. Sep 12 17:53:41.184291 systemd[1]: Started sshd@14-10.128.0.19:22-139.178.68.195:54280.service - OpenSSH per-connection server daemon (139.178.68.195:54280). 
Sep 12 17:53:41.603925 sshd[5547]: Accepted publickey for core from 139.178.68.195 port 54280 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:53:41.605933 sshd-session[5547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:53:41.613943 systemd-logind[1487]: New session 15 of user core.
Sep 12 17:53:41.621130 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 12 17:53:42.012014 sshd[5550]: Connection closed by 139.178.68.195 port 54280
Sep 12 17:53:42.013246 sshd-session[5547]: pam_unix(sshd:session): session closed for user core
Sep 12 17:53:42.024932 systemd-logind[1487]: Session 15 logged out. Waiting for processes to exit.
Sep 12 17:53:42.025672 systemd[1]: sshd@14-10.128.0.19:22-139.178.68.195:54280.service: Deactivated successfully.
Sep 12 17:53:42.031564 systemd[1]: session-15.scope: Deactivated successfully.
Sep 12 17:53:42.037454 systemd-logind[1487]: Removed session 15.
Sep 12 17:53:42.083057 systemd[1]: Started sshd@15-10.128.0.19:22-139.178.68.195:54296.service - OpenSSH per-connection server daemon (139.178.68.195:54296).
Sep 12 17:53:42.482811 sshd[5562]: Accepted publickey for core from 139.178.68.195 port 54296 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:53:42.487437 sshd-session[5562]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:53:42.498163 systemd-logind[1487]: New session 16 of user core.
Sep 12 17:53:42.506997 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 12 17:53:42.963723 sshd[5565]: Connection closed by 139.178.68.195 port 54296
Sep 12 17:53:42.966282 sshd-session[5562]: pam_unix(sshd:session): session closed for user core
Sep 12 17:53:42.977537 systemd[1]: sshd@15-10.128.0.19:22-139.178.68.195:54296.service: Deactivated successfully.
Sep 12 17:53:42.982769 systemd[1]: session-16.scope: Deactivated successfully.
Sep 12 17:53:42.985098 systemd-logind[1487]: Session 16 logged out. Waiting for processes to exit.
Sep 12 17:53:42.987845 systemd-logind[1487]: Removed session 16.
Sep 12 17:53:43.039260 systemd[1]: Started sshd@16-10.128.0.19:22-139.178.68.195:54300.service - OpenSSH per-connection server daemon (139.178.68.195:54300).
Sep 12 17:53:43.466687 sshd[5575]: Accepted publickey for core from 139.178.68.195 port 54300 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:53:43.469609 sshd-session[5575]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:53:43.487166 systemd-logind[1487]: New session 17 of user core.
Sep 12 17:53:43.492154 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 12 17:53:47.021966 sshd[5578]: Connection closed by 139.178.68.195 port 54300
Sep 12 17:53:47.023143 sshd-session[5575]: pam_unix(sshd:session): session closed for user core
Sep 12 17:53:47.038662 systemd[1]: sshd@16-10.128.0.19:22-139.178.68.195:54300.service: Deactivated successfully.
Sep 12 17:53:47.039349 systemd-logind[1487]: Session 17 logged out. Waiting for processes to exit.
Sep 12 17:53:47.043532 systemd[1]: session-17.scope: Deactivated successfully.
Sep 12 17:53:47.044680 systemd[1]: session-17.scope: Consumed 928ms CPU time, 84.7M memory peak.
Sep 12 17:53:47.049368 systemd-logind[1487]: Removed session 17.
Sep 12 17:53:47.090918 systemd[1]: Started sshd@17-10.128.0.19:22-139.178.68.195:54302.service - OpenSSH per-connection server daemon (139.178.68.195:54302).
Sep 12 17:53:47.489366 sshd[5595]: Accepted publickey for core from 139.178.68.195 port 54302 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:53:47.493909 sshd-session[5595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:53:47.507139 systemd-logind[1487]: New session 18 of user core.
Sep 12 17:53:47.514256 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 12 17:53:48.137919 sshd[5598]: Connection closed by 139.178.68.195 port 54302
Sep 12 17:53:48.138230 sshd-session[5595]: pam_unix(sshd:session): session closed for user core
Sep 12 17:53:48.149418 systemd-logind[1487]: Session 18 logged out. Waiting for processes to exit.
Sep 12 17:53:48.150127 systemd[1]: sshd@17-10.128.0.19:22-139.178.68.195:54302.service: Deactivated successfully.
Sep 12 17:53:48.156195 systemd[1]: session-18.scope: Deactivated successfully.
Sep 12 17:53:48.164799 systemd-logind[1487]: Removed session 18.
Sep 12 17:53:48.212251 systemd[1]: Started sshd@18-10.128.0.19:22-139.178.68.195:54304.service - OpenSSH per-connection server daemon (139.178.68.195:54304).
Sep 12 17:53:48.639735 sshd[5608]: Accepted publickey for core from 139.178.68.195 port 54304 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:53:48.642805 sshd-session[5608]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:53:48.655777 systemd-logind[1487]: New session 19 of user core.
Sep 12 17:53:48.661221 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 12 17:53:49.090439 sshd[5611]: Connection closed by 139.178.68.195 port 54304
Sep 12 17:53:49.092153 sshd-session[5608]: pam_unix(sshd:session): session closed for user core
Sep 12 17:53:49.100266 systemd-logind[1487]: Session 19 logged out. Waiting for processes to exit.
Sep 12 17:53:49.101219 systemd[1]: sshd@18-10.128.0.19:22-139.178.68.195:54304.service: Deactivated successfully.
Sep 12 17:53:49.108369 systemd[1]: session-19.scope: Deactivated successfully.
Sep 12 17:53:49.114224 systemd-logind[1487]: Removed session 19.
Sep 12 17:53:54.168789 systemd[1]: Started sshd@19-10.128.0.19:22-139.178.68.195:35422.service - OpenSSH per-connection server daemon (139.178.68.195:35422).
Sep 12 17:53:54.586155 sshd[5623]: Accepted publickey for core from 139.178.68.195 port 35422 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:53:54.589193 sshd-session[5623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:53:54.601943 systemd-logind[1487]: New session 20 of user core.
Sep 12 17:53:54.607490 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 12 17:53:55.020134 sshd[5627]: Connection closed by 139.178.68.195 port 35422
Sep 12 17:53:55.022231 sshd-session[5623]: pam_unix(sshd:session): session closed for user core
Sep 12 17:53:55.033922 systemd[1]: sshd@19-10.128.0.19:22-139.178.68.195:35422.service: Deactivated successfully.
Sep 12 17:53:55.037389 systemd[1]: session-20.scope: Deactivated successfully.
Sep 12 17:53:55.040488 systemd-logind[1487]: Session 20 logged out. Waiting for processes to exit.
Sep 12 17:53:55.044091 systemd-logind[1487]: Removed session 20.
Sep 12 17:54:00.089762 systemd[1]: Started sshd@20-10.128.0.19:22-139.178.68.195:55854.service - OpenSSH per-connection server daemon (139.178.68.195:55854).
Sep 12 17:54:00.490409 sshd[5641]: Accepted publickey for core from 139.178.68.195 port 55854 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:54:00.493785 sshd-session[5641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:54:00.507024 systemd-logind[1487]: New session 21 of user core.
Sep 12 17:54:00.514095 systemd[1]: Started session-21.scope - Session 21 of User core.
Sep 12 17:54:00.896606 sshd[5644]: Connection closed by 139.178.68.195 port 55854
Sep 12 17:54:00.898225 sshd-session[5641]: pam_unix(sshd:session): session closed for user core
Sep 12 17:54:00.909424 systemd-logind[1487]: Session 21 logged out. Waiting for processes to exit.
Sep 12 17:54:00.909820 systemd[1]: sshd@20-10.128.0.19:22-139.178.68.195:55854.service: Deactivated successfully.
Sep 12 17:54:00.916593 systemd[1]: session-21.scope: Deactivated successfully.
Sep 12 17:54:00.923429 systemd-logind[1487]: Removed session 21.
Sep 12 17:54:03.341279 containerd[1529]: time="2025-09-12T17:54:03.341206872Z" level=info msg="TaskExit event in podsandbox handler container_id:\"af37ede9b2bdb4ef039235365d645d6e326f69504ee17ba2b4f1072114f0d12b\" id:\"7a2b62b986b4c7c0dc3dc20f854a95f6a933900e9fa9e9de16f73950b435e953\" pid:5669 exited_at:{seconds:1757699643 nanos:339420848}"
Sep 12 17:54:05.252637 containerd[1529]: time="2025-09-12T17:54:05.252232031Z" level=info msg="TaskExit event in podsandbox handler container_id:\"54da67e6e59b466e1126912279496688235a463570daa921be6ec5663393f994\" id:\"ce07c2622d146a70249a6dc79c6b77a0db92ed190eb31f254cb196f68a610318\" pid:5703 exited_at:{seconds:1757699645 nanos:250400933}"
Sep 12 17:54:05.405395 containerd[1529]: time="2025-09-12T17:54:05.405335001Z" level=info msg="TaskExit event in podsandbox handler container_id:\"919661de885b2d716ff02ce49c9d7a2b2f1c527ed9a1e7c701afdc15fed67f19\" id:\"14614edc5779f7038acf3a72e1f1d1313e1365f1c7e93946bf38099ab2de608b\" pid:5712 exited_at:{seconds:1757699645 nanos:404970216}"
Sep 12 17:54:05.975282 systemd[1]: Started sshd@21-10.128.0.19:22-139.178.68.195:55868.service - OpenSSH per-connection server daemon (139.178.68.195:55868).
Sep 12 17:54:06.400576 sshd[5726]: Accepted publickey for core from 139.178.68.195 port 55868 ssh2: RSA SHA256:nWHxAgnTwjfxyndBbNSgynLNsaSUYjfzuT8jkGwZQK4
Sep 12 17:54:06.404451 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 17:54:06.414930 systemd-logind[1487]: New session 22 of user core.
Sep 12 17:54:06.422095 systemd[1]: Started session-22.scope - Session 22 of User core.
Sep 12 17:54:06.824004 sshd[5735]: Connection closed by 139.178.68.195 port 55868
Sep 12 17:54:06.828208 sshd-session[5726]: pam_unix(sshd:session): session closed for user core
Sep 12 17:54:06.846202 systemd-logind[1487]: Session 22 logged out. Waiting for processes to exit.
Sep 12 17:54:06.847124 systemd[1]: sshd@21-10.128.0.19:22-139.178.68.195:55868.service: Deactivated successfully.
Sep 12 17:54:06.853715 systemd[1]: session-22.scope: Deactivated successfully.
Sep 12 17:54:06.858439 systemd-logind[1487]: Removed session 22.
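[Note] The exited_at fields in the containerd TaskExit events above are protobuf-style timestamps, i.e. Unix-epoch seconds plus nanoseconds. A quick Go sanity check, assuming UTC, shows the last such event resolving to a moment a few hundred microseconds before containerd wrote the corresponding log line:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// exited_at from the 17:54:05 TaskExit event above: seconds + nanos since the Unix epoch.
    	exitedAt := time.Unix(1757699645, 404970216).UTC()
    	fmt.Println(exitedAt.Format(time.RFC3339Nano))
    	// Prints 2025-09-12T17:54:05.404970216Z, just before the log line's own
    	// time="2025-09-12T17:54:05.405335001Z" field.
    }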