Sep 12 17:42:34.797844 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 15:34:39 -00 2025
Sep 12 17:42:34.797868 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:42:34.797876 kernel: BIOS-provided physical RAM map:
Sep 12 17:42:34.797881 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 12 17:42:34.797886 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 12 17:42:34.797891 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 12 17:42:34.797898 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Sep 12 17:42:34.797903 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Sep 12 17:42:34.797908 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 12 17:42:34.797913 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 12 17:42:34.797918 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 12 17:42:34.797923 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 12 17:42:34.797928 kernel: NX (Execute Disable) protection: active
Sep 12 17:42:34.797933 kernel: APIC: Static calls initialized
Sep 12 17:42:34.797940 kernel: SMBIOS 2.8 present.
Sep 12 17:42:34.797946 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Sep 12 17:42:34.797951 kernel: DMI: Memory slots populated: 1/1
Sep 12 17:42:34.797956 kernel: Hypervisor detected: KVM
Sep 12 17:42:34.797961 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 12 17:42:34.797967 kernel: kvm-clock: using sched offset of 4069768832 cycles
Sep 12 17:42:34.797973 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 17:42:34.797978 kernel: tsc: Detected 2445.406 MHz processor
Sep 12 17:42:34.797985 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 17:42:34.797991 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 17:42:34.797996 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Sep 12 17:42:34.798002 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 12 17:42:34.798007 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 17:42:34.798013 kernel: Using GB pages for direct mapping
Sep 12 17:42:34.798018 kernel: ACPI: Early table checksum verification disabled
Sep 12 17:42:34.798023 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Sep 12 17:42:34.798029 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:42:34.798035 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:42:34.798041 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:42:34.798046 kernel: ACPI: FACS 0x000000007CFE0000 000040
Sep 12 17:42:34.798051 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:42:34.798057 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:42:34.798062 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:42:34.798068 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 17:42:34.798073 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Sep 12 17:42:34.798080 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Sep 12 17:42:34.798087 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Sep 12 17:42:34.798093 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Sep 12 17:42:34.798099 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Sep 12 17:42:34.798105 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Sep 12 17:42:34.798110 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Sep 12 17:42:34.798117 kernel: No NUMA configuration found
Sep 12 17:42:34.798123 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Sep 12 17:42:34.798128 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Sep 12 17:42:34.798134 kernel: Zone ranges:
Sep 12 17:42:34.798139 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 17:42:34.798145 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Sep 12 17:42:34.798151 kernel: Normal empty
Sep 12 17:42:34.798156 kernel: Device empty
Sep 12 17:42:34.798161 kernel: Movable zone start for each node
Sep 12 17:42:34.798168 kernel: Early memory node ranges
Sep 12 17:42:34.798174 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 12 17:42:34.798179 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Sep 12 17:42:34.798185 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Sep 12 17:42:34.798191 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 17:42:34.798196 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 12 17:42:34.798202 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 12 17:42:34.798207 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 12 17:42:34.798213 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 12 17:42:34.798219 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 12 17:42:34.798225 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 12 17:42:34.798231 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 12 17:42:34.798237 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 17:42:34.798242 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 12 17:42:34.798248 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 12 17:42:34.798253 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 17:42:34.798259 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 12 17:42:34.798265 kernel: CPU topo: Max. logical packages: 1
Sep 12 17:42:34.798270 kernel: CPU topo: Max. logical dies: 1
Sep 12 17:42:34.798277 kernel: CPU topo: Max. dies per package: 1
Sep 12 17:42:34.798283 kernel: CPU topo: Max. threads per core: 1
Sep 12 17:42:34.798288 kernel: CPU topo: Num. cores per package: 2
Sep 12 17:42:34.798294 kernel: CPU topo: Num. threads per package: 2
Sep 12 17:42:34.798299 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 12 17:42:34.798305 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 12 17:42:34.798310 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 12 17:42:34.798316 kernel: Booting paravirtualized kernel on KVM
Sep 12 17:42:34.798322 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 17:42:34.798329 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 12 17:42:34.798334 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 12 17:42:34.798340 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 12 17:42:34.798346 kernel: pcpu-alloc: [0] 0 1
Sep 12 17:42:34.798351 kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 12 17:42:34.798358 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:42:34.798364 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 17:42:34.798369 kernel: random: crng init done
Sep 12 17:42:34.798375 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 17:42:34.798382 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 12 17:42:34.798387 kernel: Fallback order for Node 0: 0
Sep 12 17:42:34.798393 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Sep 12 17:42:34.798398 kernel: Policy zone: DMA32
Sep 12 17:42:34.798404 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 17:42:34.798410 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 17:42:34.798416 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 12 17:42:34.798421 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 17:42:34.798427 kernel: Dynamic Preempt: voluntary
Sep 12 17:42:34.798434 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 17:42:34.798440 kernel: rcu: RCU event tracing is enabled.
Sep 12 17:42:34.798446 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 17:42:34.798451 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 17:42:34.798457 kernel: Rude variant of Tasks RCU enabled.
Sep 12 17:42:34.798463 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 17:42:34.798468 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 17:42:34.798474 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 17:42:34.798479 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:42:34.798486 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:42:34.798492 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 17:42:34.798498 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 12 17:42:34.798503 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 17:42:34.798509 kernel: Console: colour VGA+ 80x25
Sep 12 17:42:34.798515 kernel: printk: legacy console [tty0] enabled
Sep 12 17:42:34.798520 kernel: printk: legacy console [ttyS0] enabled
Sep 12 17:42:34.798526 kernel: ACPI: Core revision 20240827
Sep 12 17:42:34.798532 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 12 17:42:34.798543 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 17:42:34.798549 kernel: x2apic enabled
Sep 12 17:42:34.798555 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 17:42:34.798562 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 17:42:34.798568 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Sep 12 17:42:34.798574 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Sep 12 17:42:34.798580 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 12 17:42:34.798586 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 12 17:42:34.798592 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 12 17:42:34.798600 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 17:42:34.798605 kernel: Spectre V2 : Mitigation: Retpolines
Sep 12 17:42:34.798611 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 12 17:42:34.798617 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 12 17:42:34.798623 kernel: active return thunk: retbleed_return_thunk
Sep 12 17:42:34.798629 kernel: RETBleed: Mitigation: untrained return thunk
Sep 12 17:42:34.798635 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 17:42:34.798642 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 17:42:34.798648 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 17:42:34.798654 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 17:42:34.798670 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 17:42:34.798676 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 17:42:34.798682 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 12 17:42:34.798688 kernel: Freeing SMP alternatives memory: 32K
Sep 12 17:42:34.798694 kernel: pid_max: default: 32768 minimum: 301
Sep 12 17:42:34.798700 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 17:42:34.798707 kernel: landlock: Up and running.
Sep 12 17:42:34.798713 kernel: SELinux: Initializing.
Sep 12 17:42:34.798719 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 17:42:34.798725 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 17:42:34.798731 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 12 17:42:34.798737 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 12 17:42:34.798743 kernel: ... version: 0
Sep 12 17:42:34.798749 kernel: ... bit width: 48
Sep 12 17:42:34.798755 kernel: ... generic registers: 6
Sep 12 17:42:34.798762 kernel: ... value mask: 0000ffffffffffff
Sep 12 17:42:34.798767 kernel: ... max period: 00007fffffffffff
Sep 12 17:42:34.798773 kernel: ... fixed-purpose events: 0
Sep 12 17:42:34.798779 kernel: ... event mask: 000000000000003f
Sep 12 17:42:34.798785 kernel: signal: max sigframe size: 1776
Sep 12 17:42:34.798791 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 17:42:34.798797 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 17:42:34.798803 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 17:42:34.799074 kernel: smp: Bringing up secondary CPUs ...
Sep 12 17:42:34.799086 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 17:42:34.799092 kernel: .... node #0, CPUs: #1
Sep 12 17:42:34.799098 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 17:42:34.799104 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Sep 12 17:42:34.799111 kernel: Memory: 1917788K/2047464K available (14336K kernel code, 2432K rwdata, 9960K rodata, 54040K init, 2924K bss, 125140K reserved, 0K cma-reserved)
Sep 12 17:42:34.799117 kernel: devtmpfs: initialized
Sep 12 17:42:34.799123 kernel: x86/mm: Memory block size: 128MB
Sep 12 17:42:34.799130 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 17:42:34.799135 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 17:42:34.799143 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 17:42:34.799149 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 17:42:34.799155 kernel: audit: initializing netlink subsys (disabled)
Sep 12 17:42:34.799161 kernel: audit: type=2000 audit(1757698951.775:1): state=initialized audit_enabled=0 res=1
Sep 12 17:42:34.799166 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 17:42:34.799172 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 17:42:34.799178 kernel: cpuidle: using governor menu
Sep 12 17:42:34.799184 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 17:42:34.799190 kernel: dca service started, version 1.12.1
Sep 12 17:42:34.799197 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 12 17:42:34.799203 kernel: PCI: Using configuration type 1 for base access
Sep 12 17:42:34.799209 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 17:42:34.799215 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 17:42:34.799221 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 17:42:34.799227 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 17:42:34.799233 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 17:42:34.799239 kernel: ACPI: Added _OSI(Module Device)
Sep 12 17:42:34.799244 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 17:42:34.799252 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 17:42:34.799258 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 17:42:34.799263 kernel: ACPI: Interpreter enabled
Sep 12 17:42:34.799269 kernel: ACPI: PM: (supports S0 S5)
Sep 12 17:42:34.799275 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 17:42:34.799281 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 17:42:34.799287 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 17:42:34.799293 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 12 17:42:34.799299 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 17:42:34.799404 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 17:42:34.799468 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 12 17:42:34.799525 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 12 17:42:34.799533 kernel: PCI host bridge to bus 0000:00
Sep 12 17:42:34.799597 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 17:42:34.799697 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 12 17:42:34.799885 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 17:42:34.799953 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Sep 12 17:42:34.800048 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 12 17:42:34.800103 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 12 17:42:34.800160 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 17:42:34.800233 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 12 17:42:34.800498 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 12 17:42:34.800572 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Sep 12 17:42:34.800632 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Sep 12 17:42:34.800706 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Sep 12 17:42:34.800764 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Sep 12 17:42:34.800844 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 12 17:42:34.801149 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.801224 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Sep 12 17:42:34.801291 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 17:42:34.801349 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 12 17:42:34.801407 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 17:42:34.801477 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.801537 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Sep 12 17:42:34.801594 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 17:42:34.801666 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 12 17:42:34.801730 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 17:42:34.801794 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.801877 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Sep 12 17:42:34.801937 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 17:42:34.801993 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 12 17:42:34.802049 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 17:42:34.802113 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.802175 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Sep 12 17:42:34.802271 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 17:42:34.802330 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 12 17:42:34.802386 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 17:42:34.802448 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.802507 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Sep 12 17:42:34.802564 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 17:42:34.802625 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 12 17:42:34.802697 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 17:42:34.802764 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.802967 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Sep 12 17:42:34.803038 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 17:42:34.803097 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 12 17:42:34.803156 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 17:42:34.803228 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.803287 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Sep 12 17:42:34.803344 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 17:42:34.803401 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 12 17:42:34.803458 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 17:42:34.803525 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.803588 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Sep 12 17:42:34.803645 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 17:42:34.803716 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 12 17:42:34.803774 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 17:42:34.803881 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 17:42:34.803947 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Sep 12 17:42:34.804005 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 17:42:34.804067 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 12 17:42:34.804124 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 17:42:34.804188 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 12 17:42:34.804246 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 12 17:42:34.804311 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 12 17:42:34.804369 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Sep 12 17:42:34.804425 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Sep 12 17:42:34.804490 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 12 17:42:34.804548 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 12 17:42:34.804613 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 12 17:42:34.804691 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Sep 12 17:42:34.804753 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 12 17:42:34.804831 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Sep 12 17:42:34.804894 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 17:42:34.804964 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 12 17:42:34.805024 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Sep 12 17:42:34.805081 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 17:42:34.805149 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 12 17:42:34.805209 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Sep 12 17:42:34.805268 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Sep 12 17:42:34.805329 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 17:42:34.805393 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 12 17:42:34.805480 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 12 17:42:34.805548 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 17:42:34.805617 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 12 17:42:34.805692 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Sep 12 17:42:34.805751 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 17:42:34.805871 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 12 17:42:34.805938 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Sep 12 17:42:34.805998 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Sep 12 17:42:34.806055 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 17:42:34.806063 kernel: acpiphp: Slot [0] registered
Sep 12 17:42:34.806130 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 12 17:42:34.806196 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Sep 12 17:42:34.806256 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Sep 12 17:42:34.806316 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Sep 12 17:42:34.806374 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 17:42:34.806383 kernel: acpiphp: Slot [0-2] registered
Sep 12 17:42:34.806440 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 17:42:34.806448 kernel: acpiphp: Slot [0-3] registered
Sep 12 17:42:34.806503 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 17:42:34.806514 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 12 17:42:34.806520 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 12 17:42:34.806526 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 12 17:42:34.807033 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 12 17:42:34.807041 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 12 17:42:34.807048 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 12 17:42:34.807054 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 12 17:42:34.807060 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 12 17:42:34.807065 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 12 17:42:34.807074 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 12 17:42:34.807080 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 12 17:42:34.807086 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 12 17:42:34.807092 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 12 17:42:34.807098 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 12 17:42:34.807104 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 12 17:42:34.807110 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 12 17:42:34.807116 kernel: iommu: Default domain type: Translated
Sep 12 17:42:34.807122 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 12 17:42:34.807130 kernel: PCI: Using ACPI for IRQ routing
Sep 12 17:42:34.807136 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 12 17:42:34.807142 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 12 17:42:34.807148 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Sep 12 17:42:34.807226 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 12 17:42:34.807288 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 12 17:42:34.807348 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 12 17:42:34.807357 kernel: vgaarb: loaded
Sep 12 17:42:34.807363 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 12 17:42:34.807372 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 12 17:42:34.807378 kernel: clocksource: Switched to clocksource kvm-clock
Sep 12 17:42:34.807384 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 17:42:34.807390 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 17:42:34.807396 kernel: pnp: PnP ACPI init
Sep 12 17:42:34.807460 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 12 17:42:34.807470 kernel: pnp: PnP ACPI: found 5 devices
Sep 12 17:42:34.807477 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 12 17:42:34.807485 kernel: NET: Registered PF_INET protocol family
Sep 12 17:42:34.807491 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 17:42:34.807497 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 12 17:42:34.807503 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 17:42:34.807509 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 12 17:42:34.807515 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 12 17:42:34.807522 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 12 17:42:34.807528 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 12 17:42:34.807534 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 12 17:42:34.807541 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 17:42:34.807547 kernel: NET: Registered PF_XDP protocol family
Sep 12 17:42:34.807607 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 12 17:42:34.807681 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 12 17:42:34.807743 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 12 17:42:34.807802 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Sep 12 17:42:34.807899 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Sep 12 17:42:34.807960 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Sep 12 17:42:34.808022 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 17:42:34.808180 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 12 17:42:34.808242 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 17:42:34.808299 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 17:42:34.808356 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 12 17:42:34.808413 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 17:42:34.808470 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 17:42:34.809071 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 12 17:42:34.809137 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 17:42:34.809201 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 17:42:34.809369 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 12 17:42:34.809427 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 17:42:34.809485 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 17:42:34.809543 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 12 17:42:34.809604 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 17:42:34.809673 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 17:42:34.809737 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 12 17:42:34.809795 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 17:42:34.810891 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 17:42:34.810954 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Sep 12 17:42:34.811013 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 12 17:42:34.811077 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 17:42:34.811135 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 17:42:34.811193 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Sep 12 17:42:34.811251 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 12 17:42:34.811308 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 17:42:34.811365 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 17:42:34.811422 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Sep 12 17:42:34.811479 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 12 17:42:34.811536 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 17:42:34.811593 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 12 17:42:34.811646 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 12 17:42:34.811737 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 12 17:42:34.811799 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Sep 12 17:42:34.812050 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 12 17:42:34.812107 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 12 17:42:34.812168 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 12 17:42:34.812223 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 17:42:34.812289 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 12 17:42:34.812344 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 17:42:34.812403 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 12 17:42:34.812456 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 17:42:34.812515 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 12 17:42:34.812568 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 17:42:34.812630 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 12 17:42:34.812698 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 17:42:34.812758 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 12 17:42:34.812828 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 17:42:34.812894 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Sep 12 17:42:34.812948 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 12 17:42:34.813005 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 17:42:34.813063 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Sep 12 17:42:34.813117 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Sep 12 17:42:34.813168 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 17:42:34.813225 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Sep 12 17:42:34.813277 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 12 17:42:34.813329 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 17:42:34.813341 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 12 17:42:34.813348 kernel: PCI: CLS 0 bytes, default 64
Sep 12 17:42:34.813355 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Sep 12 17:42:34.813361 kernel: Initialise system trusted keyrings
Sep 12 17:42:34.813367 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Sep 12 17:42:34.813374 kernel: Key type asymmetric registered
Sep 12 17:42:34.813380 kernel: Asymmetric key parser 'x509' registered
Sep 12 17:42:34.813386 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 12 17:42:34.813393 kernel: io scheduler mq-deadline registered
Sep 12 17:42:34.813401 kernel: io scheduler kyber registered
Sep 12 17:42:34.813407 kernel: io scheduler bfq registered
Sep 12 17:42:34.813467 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Sep 12 17:42:34.813529 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Sep 12 17:42:34.813588 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Sep 12 17:42:34.813646 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Sep 12 17:42:34.813720 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Sep 12 17:42:34.813780 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Sep 12 17:42:34.814879 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Sep 12 17:42:34.814946 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Sep 12 17:42:34.815007 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Sep 12 17:42:34.815065 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Sep 12 17:42:34.815122 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Sep 12 17:42:34.815180 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Sep 12 17:42:34.815238 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Sep 12 17:42:34.815295 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Sep 12 17:42:34.815357 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Sep 12 17:42:34.815414 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Sep 12 17:42:34.815424 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 12 17:42:34.815479 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Sep 12 17:42:34.815536 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Sep 12 17:42:34.815545 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Sep 12 17:42:34.815555 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Sep 12 17:42:34.815561 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 12 17:42:34.815568 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 12 17:42:34.815576 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 12 17:42:34.815582 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 12 17:42:34.815588 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 12 17:42:34.815649 kernel: rtc_cmos 00:03: RTC can wake from S4
Sep 12 17:42:34.815671 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Sep 12 17:42:34.815729 kernel: rtc_cmos 00:03: registered as rtc0
Sep 12 17:42:34.815783 kernel: rtc_cmos 00:03: setting system clock to 2025-09-12T17:42:34 UTC (1757698954)
Sep 12 17:42:34.815861 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
Sep 12 17:42:34.815871 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 12 17:42:34.815878 kernel: NET: Registered PF_INET6 protocol family
Sep 12 17:42:34.815885 kernel: Segment Routing with IPv6
Sep 12 17:42:34.815891 kernel: In-situ OAM (IOAM) with IPv6
Sep 12 17:42:34.815898 kernel: NET: Registered PF_PACKET protocol family
Sep 12 17:42:34.815904 kernel: Key type dns_resolver registered
Sep 12 17:42:34.815913 kernel: IPI shorthand broadcast: enabled
Sep 12 17:42:34.815920 kernel: sched_clock: Marking stable (2959007938, 142725240)->(3106624197, -4891019)
Sep 12 17:42:34.815926 kernel: registered taskstats version 1
Sep 12 17:42:34.815932 kernel: Loading compiled-in X.509 certificates
Sep 12 17:42:34.815938 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: f1ae8d6e9bfae84d90f4136cf098b0465b2a5bd7'
Sep 12 17:42:34.815945 kernel: Demotion targets for Node 0: null
Sep 12 17:42:34.815951 kernel: Key type .fscrypt registered
Sep 12 17:42:34.815957 kernel: Key type fscrypt-provisioning registered
Sep 12 17:42:34.815963 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 12 17:42:34.815971 kernel: ima: Allocated hash algorithm: sha1
Sep 12 17:42:34.815977 kernel: ima: No architecture policies found
Sep 12 17:42:34.815983 kernel: clk: Disabling unused clocks
Sep 12 17:42:34.815989 kernel: Warning: unable to open an initial console.
Sep 12 17:42:34.815996 kernel: Freeing unused kernel image (initmem) memory: 54040K
Sep 12 17:42:34.816003 kernel: Write protecting the kernel read-only data: 24576k
Sep 12 17:42:34.816009 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K
Sep 12 17:42:34.816015 kernel: Run /init as init process
Sep 12 17:42:34.816023 kernel: with arguments:
Sep 12 17:42:34.816029 kernel: /init
Sep 12 17:42:34.816035 kernel: with environment:
Sep 12 17:42:34.816041 kernel: HOME=/
Sep 12 17:42:34.816047 kernel: TERM=linux
Sep 12 17:42:34.816054 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 12 17:42:34.816061 systemd[1]: Successfully made /usr/ read-only.
Sep 12 17:42:34.816070 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 17:42:34.816079 systemd[1]: Detected virtualization kvm.
Sep 12 17:42:34.816085 systemd[1]: Detected architecture x86-64.
Sep 12 17:42:34.816092 systemd[1]: Running in initrd.
Sep 12 17:42:34.816098 systemd[1]: No hostname configured, using default hostname.
Sep 12 17:42:34.816105 systemd[1]: Hostname set to <localhost>.
Sep 12 17:42:34.816112 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 17:42:34.816119 systemd[1]: Queued start job for default target initrd.target.
Sep 12 17:42:34.816125 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 17:42:34.816134 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 17:42:34.816141 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 17:42:34.816148 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 17:42:34.816154 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 17:42:34.816162 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 17:42:34.816170 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 17:42:34.816177 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 17:42:34.816186 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 17:42:34.816192 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 17:42:34.816199 systemd[1]: Reached target paths.target - Path Units.
Sep 12 17:42:34.816206 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 17:42:34.816212 systemd[1]: Reached target swap.target - Swaps.
Sep 12 17:42:34.816219 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 17:42:34.816226 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 17:42:34.816233 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 17:42:34.816240 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 17:42:34.816248 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 12 17:42:34.816254 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 17:42:34.816261 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 17:42:34.816267 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 17:42:34.816275 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 17:42:34.816282 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 17:42:34.816288 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 17:42:34.816295 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 17:42:34.816304 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 12 17:42:34.816310 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 17:42:34.816317 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 17:42:34.816324 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 17:42:34.816331 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:42:34.816337 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 17:42:34.816347 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 17:42:34.816353 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 17:42:34.816360 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 17:42:34.816383 systemd-journald[217]: Collecting audit messages is disabled.
Sep 12 17:42:34.816404 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 17:42:34.816412 systemd-journald[217]: Journal started
Sep 12 17:42:34.816428 systemd-journald[217]: Runtime Journal (/run/log/journal/d954c4229211415ba8467298b62a36bf) is 4.8M, max 38.6M, 33.7M free.
Sep 12 17:42:34.801176 systemd-modules-load[218]: Inserted module 'overlay'
Sep 12 17:42:34.856033 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 17:42:34.856055 kernel: Bridge firewalling registered
Sep 12 17:42:34.856064 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 17:42:34.825593 systemd-modules-load[218]: Inserted module 'br_netfilter'
Sep 12 17:42:34.856755 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 17:42:34.857641 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:42:34.860090 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 17:42:34.862107 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 17:42:34.869886 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 17:42:34.877346 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 17:42:34.883280 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 17:42:34.885102 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 17:42:34.886865 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 17:42:34.889115 systemd-tmpfiles[237]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 12 17:42:34.890900 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 17:42:34.893694 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 17:42:34.901891 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 17:42:34.909043 dracut-cmdline[253]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=271a44cc8ea1639cfb6fdf777202a5f025fda0b3ce9b293cc4e0e7047aecb858
Sep 12 17:42:34.937879 systemd-resolved[257]: Positive Trust Anchors:
Sep 12 17:42:34.937891 systemd-resolved[257]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 17:42:34.937920 systemd-resolved[257]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 17:42:34.942968 systemd-resolved[257]: Defaulting to hostname 'linux'.
Sep 12 17:42:34.943674 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 17:42:34.944337 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 17:42:34.968841 kernel: SCSI subsystem initialized
Sep 12 17:42:34.976856 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 17:42:34.984836 kernel: iscsi: registered transport (tcp)
Sep 12 17:42:34.999949 kernel: iscsi: registered transport (qla4xxx)
Sep 12 17:42:34.999984 kernel: QLogic iSCSI HBA Driver
Sep 12 17:42:35.012985 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 17:42:35.024120 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 17:42:35.026413 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 17:42:35.056867 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 17:42:35.059029 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 17:42:35.105849 kernel: raid6: avx2x4 gen() 27131 MB/s
Sep 12 17:42:35.122859 kernel: raid6: avx2x2 gen() 30125 MB/s
Sep 12 17:42:35.139986 kernel: raid6: avx2x1 gen() 21225 MB/s
Sep 12 17:42:35.140037 kernel: raid6: using algorithm avx2x2 gen() 30125 MB/s
Sep 12 17:42:35.158068 kernel: raid6: .... xor() 31798 MB/s, rmw enabled
Sep 12 17:42:35.158123 kernel: raid6: using avx2x2 recovery algorithm
Sep 12 17:42:35.175845 kernel: xor: automatically using best checksumming function avx
Sep 12 17:42:35.299865 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 17:42:35.305055 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 17:42:35.307050 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 17:42:35.326231 systemd-udevd[466]: Using default interface naming scheme 'v255'.
Sep 12 17:42:35.330026 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 17:42:35.333517 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 17:42:35.351518 dracut-pre-trigger[480]: rd.md=0: removing MD RAID activation
Sep 12 17:42:35.371111 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 17:42:35.373901 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 17:42:35.417630 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 17:42:35.422269 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 17:42:35.485832 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Sep 12 17:42:35.493827 kernel: scsi host0: Virtio SCSI HBA
Sep 12 17:42:35.514923 kernel: libata version 3.00 loaded.
Sep 12 17:42:35.516862 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 17:42:35.523121 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 12 17:42:35.523525 kernel: cryptd: max_cpu_qlen set to 1000
Sep 12 17:42:35.516973 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:42:35.526273 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:42:35.553844 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
Sep 12 17:42:35.552537 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 17:42:35.559838 kernel: ACPI: bus type USB registered
Sep 12 17:42:35.563074 kernel: usbcore: registered new interface driver usbfs
Sep 12 17:42:35.563098 kernel: usbcore: registered new interface driver hub
Sep 12 17:42:35.565805 kernel: usbcore: registered new device driver usb
Sep 12 17:42:35.581834 kernel: AES CTR mode by8 optimization enabled
Sep 12 17:42:35.591611 kernel: ahci 0000:00:1f.2: version 3.0
Sep 12 17:42:35.591776 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Sep 12 17:42:35.591788 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Sep 12 17:42:35.591891 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Sep 12 17:42:35.591969 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Sep 12 17:42:35.596861 kernel: sd 0:0:0:0: Power-on or device reset occurred
Sep 12 17:42:35.596988 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 12 17:42:35.597069 kernel: scsi host1: ahci
Sep 12 17:42:35.597145 kernel: sd 0:0:0:0: [sda] Write Protect is off
Sep 12 17:42:35.597221 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Sep 12 17:42:35.597978 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 17:42:35.599466 kernel: scsi host2: ahci
Sep 12 17:42:35.602858 kernel: scsi host3: ahci
Sep 12 17:42:35.604328 kernel: scsi host4: ahci
Sep 12 17:42:35.608845 kernel: scsi host5: ahci
Sep 12 17:42:35.609236 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 17:42:35.609265 kernel: GPT:17805311 != 80003071
Sep 12 17:42:35.609304 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 17:42:35.609332 kernel: GPT:17805311 != 80003071
Sep 12 17:42:35.609357 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 17:42:35.609381 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 17:42:35.609408 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Sep 12 17:42:35.614162 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:42:35.614287 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 12 17:42:35.614371 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 12 17:42:35.614925 kernel: scsi host6: ahci
Sep 12 17:42:35.615034 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46 lpm-pol 1
Sep 12 17:42:35.615045 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46 lpm-pol 1
Sep 12 17:42:35.615053 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 46 lpm-pol 1
Sep 12 17:42:35.615060 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46 lpm-pol 1
Sep 12 17:42:35.615068 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46 lpm-pol 1
Sep 12 17:42:35.615075 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46 lpm-pol 1
Sep 12 17:42:35.616889 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 17:42:35.617016 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 12 17:42:35.617100 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 12 17:42:35.617176 kernel: hub 1-0:1.0: USB hub found
Sep 12 17:42:35.617289 kernel: hub 1-0:1.0: 4 ports detected
Sep 12 17:42:35.618850 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 12 17:42:35.618969 kernel: hub 2-0:1.0: USB hub found
Sep 12 17:42:35.619057 kernel: hub 2-0:1.0: 4 ports detected
Sep 12 17:42:35.687726 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 12 17:42:35.713338 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 17:42:35.721019 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 12 17:42:35.728396 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 12 17:42:35.734423 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 12 17:42:35.734955 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 12 17:42:35.738992 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 17:42:35.759447 disk-uuid[620]: Primary Header is updated.
Sep 12 17:42:35.759447 disk-uuid[620]: Secondary Entries is updated.
Sep 12 17:42:35.759447 disk-uuid[620]: Secondary Header is updated.
Sep 12 17:42:35.769848 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:42:35.854843 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 12 17:42:35.924828 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 12 17:42:35.924898 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 12 17:42:35.927190 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 12 17:42:35.927820 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 12 17:42:35.931028 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 12 17:42:35.931833 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 12 17:42:35.934384 kernel: ata1.00: LPM support broken, forcing max_power Sep 12 17:42:35.938111 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 12 17:42:35.938137 kernel: ata1.00: applying bridge limits Sep 12 17:42:35.942214 kernel: ata1.00: LPM support broken, forcing max_power Sep 12 17:42:35.942237 kernel: ata1.00: configured for UDMA/100 Sep 12 17:42:35.946837 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 12 17:42:35.987828 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 12 17:42:35.988040 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 17:42:35.992837 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 17:42:35.998228 kernel: usbcore: registered new interface driver usbhid Sep 12 17:42:35.998273 kernel: usbhid: USB HID core driver Sep 12 17:42:36.000837 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Sep 12 17:42:36.001349 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Sep 12 17:42:36.004945 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Sep 12 17:42:36.293068 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 17:42:36.294115 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:42:36.295194 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:42:36.296535 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:42:36.298521 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 17:42:36.328410 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:42:36.785843 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 17:42:36.787141 disk-uuid[621]: The operation has completed successfully. Sep 12 17:42:36.839022 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 17:42:36.839124 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 17:42:36.874715 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 17:42:36.888345 sh[666]: Success Sep 12 17:42:36.902831 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 17:42:36.902876 kernel: device-mapper: uevent: version 1.0.3 Sep 12 17:42:36.904844 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 17:42:36.912835 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 12 17:42:36.947520 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 17:42:36.950880 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Sep 12 17:42:36.965093 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 12 17:42:36.976850 kernel: BTRFS: device fsid 74707491-1b86-4926-8bdb-c533ce2a0c32 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (678) Sep 12 17:42:36.976891 kernel: BTRFS info (device dm-0): first mount of filesystem 74707491-1b86-4926-8bdb-c533ce2a0c32 Sep 12 17:42:36.978855 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:42:36.989829 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 17:42:36.989861 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 17:42:36.989873 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 17:42:36.993021 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 17:42:36.994410 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:42:36.995768 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 17:42:36.996423 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 17:42:36.999378 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 17:42:37.031487 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (715) Sep 12 17:42:37.031547 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:42:37.033766 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:42:37.041642 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:42:37.041696 kernel: BTRFS info (device sda6): turning on async discard Sep 12 17:42:37.041714 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:42:37.048839 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:42:37.049893 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 12 17:42:37.052972 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 17:42:37.072540 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:42:37.076934 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:42:37.137727 systemd-networkd[847]: lo: Link UP Sep 12 17:42:37.137735 systemd-networkd[847]: lo: Gained carrier Sep 12 17:42:37.140841 systemd-networkd[847]: Enumeration completed Sep 12 17:42:37.141005 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:42:37.141623 systemd-networkd[847]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:37.141626 systemd-networkd[847]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:42:37.142399 systemd[1]: Reached target network.target - Network. Sep 12 17:42:37.143102 systemd-networkd[847]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:37.143104 systemd-networkd[847]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 12 17:42:37.147981 ignition[814]: Ignition 2.21.0 Sep 12 17:42:37.143326 systemd-networkd[847]: eth0: Link UP Sep 12 17:42:37.147987 ignition[814]: Stage: fetch-offline Sep 12 17:42:37.143444 systemd-networkd[847]: eth1: Link UP Sep 12 17:42:37.148007 ignition[814]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:42:37.143550 systemd-networkd[847]: eth0: Gained carrier Sep 12 17:42:37.148013 ignition[814]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:42:37.143556 systemd-networkd[847]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:37.148069 ignition[814]: parsed url from cmdline: "" Sep 12 17:42:37.148323 systemd-networkd[847]: eth1: Gained carrier Sep 12 17:42:37.148071 ignition[814]: no config URL provided Sep 12 17:42:37.148332 systemd-networkd[847]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:37.148074 ignition[814]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:42:37.149412 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:42:37.148079 ignition[814]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:42:37.153002 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 12 17:42:37.148083 ignition[814]: failed to fetch config: resource requires networking Sep 12 17:42:37.148189 ignition[814]: Ignition finished successfully Sep 12 17:42:37.167244 ignition[856]: Ignition 2.21.0 Sep 12 17:42:37.167252 ignition[856]: Stage: fetch Sep 12 17:42:37.167362 ignition[856]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:42:37.167370 ignition[856]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:42:37.167428 ignition[856]: parsed url from cmdline: "" Sep 12 17:42:37.167431 ignition[856]: no config URL provided Sep 12 17:42:37.167434 ignition[856]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 17:42:37.167439 ignition[856]: no config at "/usr/lib/ignition/user.ign" Sep 12 17:42:37.167460 ignition[856]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Sep 12 17:42:37.167568 ignition[856]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Sep 12 17:42:37.175873 systemd-networkd[847]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 17:42:37.212862 systemd-networkd[847]: eth0: DHCPv4 address 95.216.139.29/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 12 17:42:37.367986 ignition[856]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Sep 12 17:42:37.373617 ignition[856]: GET result: OK Sep 12 17:42:37.373707 ignition[856]: parsing config with SHA512: 436cb81bd85d4f12316938470e4a4c664cdb982000a5d369eae05a797c9d6086da195450ff6c1677a43858bc54a5a60cde5af9c11ee0501447198a463aeef349 Sep 12 17:42:37.377071 unknown[856]: fetched base config from "system" Sep 12 17:42:37.377083 unknown[856]: fetched base config from "system" Sep 12 17:42:37.377370 ignition[856]: fetch: fetch complete Sep 12 17:42:37.377087 unknown[856]: fetched user config from "hetzner" Sep 12 17:42:37.377374 ignition[856]: fetch: fetch passed Sep 12 17:42:37.379711 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 17:42:37.377409 ignition[856]: Ignition finished successfully Sep 12 17:42:37.382509 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Sep 12 17:42:37.409869 ignition[864]: Ignition 2.21.0 Sep 12 17:42:37.409882 ignition[864]: Stage: kargs Sep 12 17:42:37.410003 ignition[864]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:42:37.410012 ignition[864]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:42:37.410923 ignition[864]: kargs: kargs passed Sep 12 17:42:37.410965 ignition[864]: Ignition finished successfully Sep 12 17:42:37.414419 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 17:42:37.416715 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 17:42:37.444402 ignition[871]: Ignition 2.21.0 Sep 12 17:42:37.444415 ignition[871]: Stage: disks Sep 12 17:42:37.444540 ignition[871]: no configs at "/usr/lib/ignition/base.d" Sep 12 17:42:37.447014 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 12 17:42:37.444548 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:42:37.445574 ignition[871]: disks: disks passed Sep 12 17:42:37.449033 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 17:42:37.445614 ignition[871]: Ignition finished successfully Sep 12 17:42:37.450266 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 17:42:37.451355 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:42:37.452582 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:42:37.453751 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:42:37.455697 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 17:42:37.488768 systemd-fsck[880]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 12 17:42:37.491432 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 17:42:37.493738 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 17:42:37.596884 kernel: EXT4-fs (sda9): mounted filesystem 26739aba-b0be-4ce3-bfbd-ca4dbcbe2426 r/w with ordered data mode. Quota mode: none. Sep 12 17:42:37.596412 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 17:42:37.597686 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 17:42:37.599962 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 17:42:37.603940 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 17:42:37.611963 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 12 17:42:37.614131 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 17:42:37.615913 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:42:37.620000 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Sep 12 17:42:37.636602 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (888) Sep 12 17:42:37.636627 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:42:37.636639 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:42:37.636648 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:42:37.636658 kernel: BTRFS info (device sda6): turning on async discard Sep 12 17:42:37.636686 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:42:37.638578 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:42:37.641922 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 17:42:37.677742 coreos-metadata[890]: Sep 12 17:42:37.677 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Sep 12 17:42:37.679238 coreos-metadata[890]: Sep 12 17:42:37.679 INFO Fetch successful Sep 12 17:42:37.680281 coreos-metadata[890]: Sep 12 17:42:37.680 INFO wrote hostname ci-4426-1-0-d-1f6ac31256 to /sysroot/etc/hostname Sep 12 17:42:37.682454 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 17:42:37.690140 initrd-setup-root[916]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 17:42:37.694833 initrd-setup-root[923]: cut: /sysroot/etc/group: No such file or directory Sep 12 17:42:37.699545 initrd-setup-root[930]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 17:42:37.703440 initrd-setup-root[937]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 17:42:37.776237 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 17:42:37.778265 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 17:42:37.780034 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 17:42:37.791837 kernel: BTRFS info (device sda6): last unmount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:42:37.806226 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 17:42:37.813899 ignition[1005]: INFO : Ignition 2.21.0 Sep 12 17:42:37.813899 ignition[1005]: INFO : Stage: mount Sep 12 17:42:37.816462 ignition[1005]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:42:37.816462 ignition[1005]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:42:37.816462 ignition[1005]: INFO : mount: mount passed Sep 12 17:42:37.816462 ignition[1005]: INFO : Ignition finished successfully Sep 12 17:42:37.816760 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 17:42:37.818314 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 17:42:37.974804 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 12 17:42:37.976317 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
Sep 12 17:42:37.990852 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1017) Sep 12 17:42:37.995056 kernel: BTRFS info (device sda6): first mount of filesystem 5410dae6-8d31-4ea4-a4b4-868064445761 Sep 12 17:42:37.995082 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 17:42:37.999157 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 17:42:37.999182 kernel: BTRFS info (device sda6): turning on async discard Sep 12 17:42:38.001461 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 17:42:38.003211 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 17:42:38.023945 ignition[1034]: INFO : Ignition 2.21.0 Sep 12 17:42:38.023945 ignition[1034]: INFO : Stage: files Sep 12 17:42:38.025214 ignition[1034]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:42:38.025214 ignition[1034]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:42:38.025214 ignition[1034]: DEBUG : files: compiled without relabeling support, skipping Sep 12 17:42:38.027917 ignition[1034]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 17:42:38.027917 ignition[1034]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 17:42:38.029971 ignition[1034]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 17:42:38.030857 ignition[1034]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 17:42:38.031978 unknown[1034]: wrote ssh authorized keys file for user: core Sep 12 17:42:38.032919 ignition[1034]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 17:42:38.034387 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 17:42:38.039641 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Sep 12 17:42:38.292110 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 17:42:38.840608 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:42:38.842085 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Sep 12 17:42:38.849364 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:42:38.849364 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 17:42:38.849364 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:42:38.849364 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:42:38.849364 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:42:38.849364 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Sep 12 17:42:39.035978 systemd-networkd[847]: eth0: Gained IPv6LL Sep 12 17:42:39.036826 systemd-networkd[847]: eth1: Gained IPv6LL Sep 12 17:42:39.228637 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 17:42:39.706061 ignition[1034]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Sep 12 17:42:39.706061 ignition[1034]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 17:42:39.709503 ignition[1034]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:42:39.712012 ignition[1034]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 17:42:39.712012 ignition[1034]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 17:42:39.712012 ignition[1034]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 17:42:39.717913 ignition[1034]: INFO : files: files passed Sep 12 17:42:39.717913 ignition[1034]: INFO : Ignition finished successfully Sep 12 17:42:39.715425 systemd[1]: Finished 
ignition-files.service - Ignition (files). Sep 12 17:42:39.720083 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 17:42:39.727008 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 17:42:39.737283 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 17:42:39.737450 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 17:42:39.752254 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:42:39.754615 initrd-setup-root-after-ignition[1068]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:42:39.756144 initrd-setup-root-after-ignition[1064]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 17:42:39.755230 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:42:39.757603 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 17:42:39.759484 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 17:42:39.819615 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 17:42:39.820318 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 17:42:39.821005 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 12 17:42:39.822266 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 17:42:39.823552 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 17:42:39.824198 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 17:42:39.857454 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:42:39.861500 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 17:42:39.893769 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:42:39.896110 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:42:39.897409 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 17:42:39.899879 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 17:42:39.900122 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 17:42:39.902603 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 17:42:39.904107 systemd[1]: Stopped target basic.target - Basic System. Sep 12 17:42:39.906608 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 17:42:39.909107 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 17:42:39.911246 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 17:42:39.913606 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 17:42:39.916082 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 17:42:39.918380 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 17:42:39.921050 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 17:42:39.923640 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 12 17:42:39.926344 systemd[1]: Stopped target swap.target - Swaps. 
Sep 12 17:42:39.928447 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 17:42:39.928643 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 17:42:39.931249 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:42:39.932805 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:42:39.934867 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 17:42:39.936009 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 17:42:39.937413 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 17:42:39.937709 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 17:42:39.941152 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 17:42:39.941335 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 17:42:39.943087 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 17:42:39.943320 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 17:42:39.952312 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 12 17:42:39.952541 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 17:42:39.957035 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 17:42:39.958490 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 17:42:39.959984 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:42:39.966849 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 17:42:39.967986 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 17:42:39.968382 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:42:39.971265 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 17:42:39.971632 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 17:42:39.996869 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 17:42:39.997017 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 17:42:40.004856 ignition[1088]: INFO : Ignition 2.21.0 Sep 12 17:42:40.004856 ignition[1088]: INFO : Stage: umount Sep 12 17:42:40.004856 ignition[1088]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 17:42:40.004856 ignition[1088]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 17:42:40.015997 ignition[1088]: INFO : umount: umount passed Sep 12 17:42:40.015997 ignition[1088]: INFO : Ignition finished successfully Sep 12 17:42:40.006954 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 17:42:40.007083 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 17:42:40.009441 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 17:42:40.009493 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 17:42:40.013166 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 17:42:40.013218 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 17:42:40.015418 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 17:42:40.015501 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 17:42:40.019034 systemd[1]: Stopped target network.target - Network. 
Sep 12 17:42:40.023347 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 17:42:40.023433 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 17:42:40.025743 systemd[1]: Stopped target paths.target - Path Units. Sep 12 17:42:40.033589 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 12 17:42:40.033698 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:42:40.042634 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 17:42:40.043547 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 17:42:40.047993 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 17:42:40.048058 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 17:42:40.051098 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 17:42:40.051181 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 17:42:40.054503 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 17:42:40.054590 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 17:42:40.056975 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 17:42:40.057039 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 17:42:40.059922 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 17:42:40.062162 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 17:42:40.067052 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 17:42:40.070027 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 17:42:40.070208 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 17:42:40.072675 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 17:42:40.073015 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 17:42:40.079096 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 17:42:40.079453 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 17:42:40.079590 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 17:42:40.082428 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 17:42:40.083736 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 17:42:40.085582 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 17:42:40.085647 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:42:40.088377 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 17:42:40.088494 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 17:42:40.092973 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 17:42:40.094918 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 17:42:40.094996 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 17:42:40.098892 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 17:42:40.098966 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:42:40.103231 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 17:42:40.103338 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. 
Sep 12 17:42:40.105185 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 12 17:42:40.105290 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:42:40.109523 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:42:40.118408 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 17:42:40.118539 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:42:40.128758 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 17:42:40.129022 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:42:40.131092 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 12 17:42:40.131147 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 17:42:40.134736 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 17:42:40.134807 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:42:40.137165 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 17:42:40.137261 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 17:42:40.140852 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 17:42:40.140931 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 17:42:40.143776 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 17:42:40.143881 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 17:42:40.147953 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 17:42:40.151332 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 17:42:40.151464 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:42:40.157719 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 17:42:40.157797 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:42:40.160632 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:42:40.160768 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:42:40.167150 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 12 17:42:40.167233 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 12 17:42:40.167295 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:42:40.168024 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 17:42:40.168148 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 17:42:40.170297 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 17:42:40.170413 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 12 17:42:40.173137 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 17:42:40.175577 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 17:42:40.195614 systemd[1]: Switching root. Sep 12 17:42:40.253799 systemd-journald[217]: Journal stopped Sep 12 17:42:41.337793 systemd-journald[217]: Received SIGTERM from PID 1 (systemd). 
Sep 12 17:42:41.339867 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 17:42:41.339885 kernel: SELinux: policy capability open_perms=1 Sep 12 17:42:41.339894 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 17:42:41.339902 kernel: SELinux: policy capability always_check_network=0 Sep 12 17:42:41.339909 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 17:42:41.339917 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 17:42:41.339927 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 17:42:41.339935 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 17:42:41.339944 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 17:42:41.339954 kernel: audit: type=1403 audit(1757698960.417:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 17:42:41.339964 systemd[1]: Successfully loaded SELinux policy in 82.078ms. Sep 12 17:42:41.339979 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.018ms. Sep 12 17:42:41.339993 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 17:42:41.340004 systemd[1]: Detected virtualization kvm. Sep 12 17:42:41.340014 systemd[1]: Detected architecture x86-64. Sep 12 17:42:41.340023 systemd[1]: Detected first boot. Sep 12 17:42:41.340031 systemd[1]: Hostname set to <ci-4426-1-0-d-1f6ac31256>. Sep 12 17:42:41.340039 systemd[1]: Initializing machine ID from VM UUID. Sep 12 17:42:41.340047 zram_generator::config[1133]: No configuration found. Sep 12 17:42:41.340057 kernel: Guest personality initialized and is inactive Sep 12 17:42:41.340065 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 17:42:41.340073 kernel: Initialized host personality Sep 12 17:42:41.340087 kernel: NET: Registered PF_VSOCK protocol family Sep 12 17:42:41.340104 systemd[1]: Populated /etc with preset unit settings. Sep 12 17:42:41.340115 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 17:42:41.340124 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 17:42:41.340132 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 17:42:41.340144 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 17:42:41.340153 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 17:42:41.340161 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 17:42:41.340170 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 17:42:41.340182 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 17:42:41.340193 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 17:42:41.340202 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 17:42:41.340211 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 17:42:41.340219 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 17:42:41.340227 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 12 17:42:41.340236 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 17:42:41.340245 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 12 17:42:41.340254 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 12 17:42:41.340263 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 12 17:42:41.340271 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 17:42:41.340279 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Sep 12 17:42:41.340289 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 17:42:41.340297 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 17:42:41.340305 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 12 17:42:41.340314 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 12 17:42:41.340322 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 12 17:42:41.340330 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 12 17:42:41.340339 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 17:42:41.340347 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 17:42:41.340355 systemd[1]: Reached target slices.target - Slice Units. Sep 12 17:42:41.340363 systemd[1]: Reached target swap.target - Swaps. Sep 12 17:42:41.340372 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 12 17:42:41.340381 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 12 17:42:41.340389 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 12 17:42:41.340397 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 17:42:41.340406 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 17:42:41.340414 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 17:42:41.340422 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 12 17:42:41.340430 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 12 17:42:41.340438 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 12 17:42:41.340448 systemd[1]: Mounting media.mount - External Media Directory... Sep 12 17:42:41.340456 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:41.340466 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 12 17:42:41.340474 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 12 17:42:41.340482 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 12 17:42:41.340491 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 12 17:42:41.340499 systemd[1]: Reached target machines.target - Containers. Sep 12 17:42:41.340508 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Sep 12 17:42:41.340517 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:42:41.340526 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 17:42:41.340534 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 12 17:42:41.340542 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 12 17:42:41.340551 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:42:41.340559 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:42:41.340567 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 12 17:42:41.340576 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:42:41.340588 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 12 17:42:41.340598 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 12 17:42:41.340606 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 12 17:42:41.340614 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 12 17:42:41.340623 systemd[1]: Stopped systemd-fsck-usr.service. Sep 12 17:42:41.340632 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:42:41.340640 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 17:42:41.340648 kernel: loop: module loaded Sep 12 17:42:41.340656 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 17:42:41.340665 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 17:42:41.340674 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 12 17:42:41.340697 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 12 17:42:41.340706 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 17:42:41.340716 systemd[1]: verity-setup.service: Deactivated successfully. Sep 12 17:42:41.340724 systemd[1]: Stopped verity-setup.service. Sep 12 17:42:41.340733 kernel: fuse: init (API version 7.41) Sep 12 17:42:41.340742 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:41.340751 kernel: ACPI: bus type drm_connector registered Sep 12 17:42:41.340761 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 12 17:42:41.340769 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 12 17:42:41.340778 systemd[1]: Mounted media.mount - External Media Directory. Sep 12 17:42:41.340786 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 12 17:42:41.340796 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 12 17:42:41.340804 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 12 17:42:41.340827 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 12 17:42:41.340853 systemd-journald[1217]: Collecting audit messages is disabled. 
Sep 12 17:42:41.340876 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 17:42:41.340890 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 12 17:42:41.340900 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 12 17:42:41.340908 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:42:41.340917 systemd-journald[1217]: Journal started Sep 12 17:42:41.340934 systemd-journald[1217]: Runtime Journal (/run/log/journal/d954c4229211415ba8467298b62a36bf) is 4.8M, max 38.6M, 33.7M free. Sep 12 17:42:41.084360 systemd[1]: Queued start job for default target multi-user.target. Sep 12 17:42:41.095523 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 12 17:42:41.096036 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 12 17:42:41.343846 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:42:41.343868 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 17:42:41.346048 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:42:41.346238 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:42:41.346998 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:42:41.347112 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:42:41.347847 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 12 17:42:41.348016 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 12 17:42:41.348651 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:42:41.348860 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 12 17:42:41.349587 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 17:42:41.350451 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 17:42:41.351316 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 12 17:42:41.352289 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 12 17:42:41.358785 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 17:42:41.360932 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 12 17:42:41.363868 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 12 17:42:41.364337 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 12 17:42:41.364359 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 17:42:41.365494 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Sep 12 17:42:41.368792 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 12 17:42:41.369324 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:42:41.372948 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 12 17:42:41.375923 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 12 17:42:41.376408 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Sep 12 17:42:41.377366 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 12 17:42:41.377897 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:42:41.381868 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 17:42:41.386305 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 12 17:42:41.390958 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 12 17:42:41.397624 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 12 17:42:41.398323 systemd-journald[1217]: Time spent on flushing to /var/log/journal/d954c4229211415ba8467298b62a36bf is 51.925ms for 1167 entries. Sep 12 17:42:41.398323 systemd-journald[1217]: System Journal (/var/log/journal/d954c4229211415ba8467298b62a36bf) is 8M, max 584.8M, 576.8M free. Sep 12 17:42:41.462643 systemd-journald[1217]: Received client request to flush runtime journal. Sep 12 17:42:41.462732 kernel: loop0: detected capacity change from 0 to 111000 Sep 12 17:42:41.462751 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 12 17:42:41.402403 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 17:42:41.404576 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 12 17:42:41.410480 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 12 17:42:41.413678 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 12 17:42:41.417582 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 12 17:42:41.432094 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 17:42:41.465860 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 12 17:42:41.467838 kernel: loop1: detected capacity change from 0 to 224512 Sep 12 17:42:41.476924 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Sep 12 17:42:41.481368 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 12 17:42:41.485929 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 17:42:41.505824 kernel: loop2: detected capacity change from 0 to 128016 Sep 12 17:42:41.508195 systemd-tmpfiles[1275]: ACLs are not supported, ignoring. Sep 12 17:42:41.508211 systemd-tmpfiles[1275]: ACLs are not supported, ignoring. Sep 12 17:42:41.511529 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 17:42:41.534901 kernel: loop3: detected capacity change from 0 to 8 Sep 12 17:42:41.553466 kernel: loop4: detected capacity change from 0 to 111000 Sep 12 17:42:41.569912 kernel: loop5: detected capacity change from 0 to 224512 Sep 12 17:42:41.592851 kernel: loop6: detected capacity change from 0 to 128016 Sep 12 17:42:41.612870 kernel: loop7: detected capacity change from 0 to 8 Sep 12 17:42:41.613103 (sd-merge)[1281]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Sep 12 17:42:41.613484 (sd-merge)[1281]: Merged extensions into '/usr'. Sep 12 17:42:41.619787 systemd[1]: Reload requested from client PID 1258 ('systemd-sysext') (unit systemd-sysext.service)... Sep 12 17:42:41.619923 systemd[1]: Reloading... Sep 12 17:42:41.701839 zram_generator::config[1316]: No configuration found. 
Sep 12 17:42:41.848839 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 12 17:42:41.849270 systemd[1]: Reloading finished in 229 ms. Sep 12 17:42:41.864251 ldconfig[1253]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 12 17:42:41.865153 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 12 17:42:41.866078 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 12 17:42:41.873641 systemd[1]: Starting ensure-sysext.service... Sep 12 17:42:41.877304 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 17:42:41.892702 systemd[1]: Reload requested from client PID 1350 ('systemctl') (unit ensure-sysext.service)... Sep 12 17:42:41.892716 systemd[1]: Reloading... Sep 12 17:42:41.896398 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 12 17:42:41.896867 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 12 17:42:41.897104 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 12 17:42:41.897582 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 12 17:42:41.899838 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 12 17:42:41.900030 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Sep 12 17:42:41.900067 systemd-tmpfiles[1351]: ACLs are not supported, ignoring. Sep 12 17:42:41.908619 systemd-tmpfiles[1351]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:42:41.909830 systemd-tmpfiles[1351]: Skipping /boot Sep 12 17:42:41.920297 systemd-tmpfiles[1351]: Detected autofs mount point /boot during canonicalization of boot. Sep 12 17:42:41.920859 systemd-tmpfiles[1351]: Skipping /boot Sep 12 17:42:41.939842 zram_generator::config[1374]: No configuration found. Sep 12 17:42:42.097382 systemd[1]: Reloading finished in 203 ms. Sep 12 17:42:42.119935 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 12 17:42:42.124084 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 17:42:42.129898 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:42:42.132721 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 12 17:42:42.135907 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 12 17:42:42.140535 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 17:42:42.143623 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 17:42:42.147993 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 12 17:42:42.157353 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:42.158488 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:42:42.161927 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Sep 12 17:42:42.170272 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 12 17:42:42.174169 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 12 17:42:42.174904 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:42:42.174991 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:42:42.175063 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:42.176870 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 12 17:42:42.182172 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:42.182305 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:42:42.182419 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:42:42.182487 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:42:42.184760 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 12 17:42:42.190140 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 12 17:42:42.191903 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:42.197526 systemd-udevd[1433]: Using default interface naming scheme 'v255'. Sep 12 17:42:42.206201 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:42.206554 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 12 17:42:42.213022 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 12 17:42:42.213969 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 12 17:42:42.214970 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 12 17:42:42.215149 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Sep 12 17:42:42.216582 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 12 17:42:42.222770 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 12 17:42:42.225113 systemd[1]: Finished ensure-sysext.service. Sep 12 17:42:42.229309 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 12 17:42:42.229896 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Sep 12 17:42:42.244942 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 12 17:42:42.245683 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 12 17:42:42.246143 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 12 17:42:42.247271 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 12 17:42:42.247563 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 12 17:42:42.249130 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 12 17:42:42.249280 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 12 17:42:42.252148 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 12 17:42:42.252196 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 12 17:42:42.264171 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 17:42:42.268338 augenrules[1463]: No rules Sep 12 17:42:42.268677 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 17:42:42.277981 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:42:42.278622 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:42:42.279848 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 12 17:42:42.288569 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 12 17:42:42.294558 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 12 17:42:42.358534 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Sep 12 17:42:42.452861 kernel: mousedev: PS/2 mouse device common for all mice Sep 12 17:42:42.452146 systemd-networkd[1466]: lo: Link UP Sep 12 17:42:42.452149 systemd-networkd[1466]: lo: Gained carrier Sep 12 17:42:42.455493 systemd-networkd[1466]: Enumeration completed Sep 12 17:42:42.457908 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 17:42:42.459712 systemd-networkd[1466]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:42.461053 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 12 17:42:42.464557 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 12 17:42:42.466910 systemd-networkd[1466]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:42:42.469131 systemd-networkd[1466]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:42.469138 systemd-networkd[1466]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 17:42:42.469448 systemd-networkd[1466]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 12 17:42:42.469468 systemd-networkd[1466]: eth0: Link UP Sep 12 17:42:42.469571 systemd-networkd[1466]: eth0: Gained carrier Sep 12 17:42:42.469581 systemd-networkd[1466]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:42.473291 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 12 17:42:42.474637 systemd[1]: Reached target time-set.target - System Time Set. Sep 12 17:42:42.475412 systemd-networkd[1466]: eth1: Link UP Sep 12 17:42:42.477505 systemd-networkd[1466]: eth1: Gained carrier Sep 12 17:42:42.477572 systemd-networkd[1466]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 17:42:42.479099 systemd-resolved[1430]: Positive Trust Anchors: Sep 12 17:42:42.479115 systemd-resolved[1430]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 17:42:42.479139 systemd-resolved[1430]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 17:42:42.482370 systemd-resolved[1430]: Using system hostname 'ci-4426-1-0-d-1f6ac31256'. Sep 12 17:42:42.489449 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 12 17:42:42.490370 systemd[1]: Reached target network.target - Network. Sep 12 17:42:42.491858 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 17:42:42.492313 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 17:42:42.493264 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 12 17:42:42.494538 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 12 17:42:42.495201 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Sep 12 17:42:42.496391 systemd[1]: Started logrotate.timer - Daily rotation of log files. Sep 12 17:42:42.497099 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 12 17:42:42.497942 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 12 17:42:42.499868 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 12 17:42:42.499897 systemd[1]: Reached target paths.target - Path Units. Sep 12 17:42:42.500994 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 12 17:42:42.500459 systemd[1]: Reached target timers.target - Timer Units. Sep 12 17:42:42.501462 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 12 17:42:42.504236 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 12 17:42:42.507476 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). 
Sep 12 17:42:42.508959 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 12 17:42:42.509684 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 12 17:42:42.509871 kernel: ACPI: button: Power Button [PWRF] Sep 12 17:42:42.510890 systemd-networkd[1466]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 17:42:42.511621 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 12 17:42:42.511758 systemd-timesyncd[1459]: Network configuration changed, trying to establish connection. Sep 12 17:42:42.512265 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 12 17:42:42.513545 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 12 17:42:42.515089 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 12 17:42:42.517373 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 12 17:42:42.519762 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 17:42:42.520210 systemd[1]: Reached target basic.target - Basic System. Sep 12 17:42:42.520656 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:42:42.520683 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 12 17:42:42.521449 systemd[1]: Starting containerd.service - containerd container runtime... Sep 12 17:42:42.522663 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 12 17:42:42.523959 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 12 17:42:42.527003 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 12 17:42:42.531310 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 12 17:42:42.533356 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 12 17:42:42.534189 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 12 17:42:42.536883 systemd-networkd[1466]: eth0: DHCPv4 address 95.216.139.29/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 12 17:42:42.537620 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Sep 12 17:42:42.539408 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 12 17:42:42.540778 systemd-timesyncd[1459]: Network configuration changed, trying to establish connection. Sep 12 17:42:42.543377 jq[1529]: false Sep 12 17:42:42.550417 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 12 17:42:42.552165 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Sep 12 17:42:42.554024 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 12 17:42:42.558350 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 12 17:42:42.561619 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 12 17:42:42.562784 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Sep 12 17:42:42.563143 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. 
See cgroup-compat debug messages for details. Sep 12 17:42:42.564468 google_oslogin_nss_cache[1531]: oslogin_cache_refresh[1531]: Refreshing passwd entry cache Sep 12 17:42:42.564661 oslogin_cache_refresh[1531]: Refreshing passwd entry cache Sep 12 17:42:42.567974 systemd[1]: Starting update-engine.service - Update Engine... Sep 12 17:42:42.570913 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 12 17:42:42.574769 google_oslogin_nss_cache[1531]: oslogin_cache_refresh[1531]: Failure getting users, quitting Sep 12 17:42:42.575083 oslogin_cache_refresh[1531]: Failure getting users, quitting Sep 12 17:42:42.575154 google_oslogin_nss_cache[1531]: oslogin_cache_refresh[1531]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:42:42.575181 oslogin_cache_refresh[1531]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Sep 12 17:42:42.575259 google_oslogin_nss_cache[1531]: oslogin_cache_refresh[1531]: Refreshing group entry cache Sep 12 17:42:42.577119 oslogin_cache_refresh[1531]: Refreshing group entry cache Sep 12 17:42:42.579368 google_oslogin_nss_cache[1531]: oslogin_cache_refresh[1531]: Failure getting groups, quitting Sep 12 17:42:42.579368 google_oslogin_nss_cache[1531]: oslogin_cache_refresh[1531]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:42:42.579095 oslogin_cache_refresh[1531]: Failure getting groups, quitting Sep 12 17:42:42.579103 oslogin_cache_refresh[1531]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Sep 12 17:42:42.580761 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 12 17:42:42.581455 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 12 17:42:42.581602 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 17:42:42.581939 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 17:42:42.582857 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 17:42:42.603270 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 17:42:42.604940 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 17:42:42.608655 extend-filesystems[1530]: Found /dev/sda6 Sep 12 17:42:42.614520 extend-filesystems[1530]: Found /dev/sda9 Sep 12 17:42:42.617530 extend-filesystems[1530]: Checking size of /dev/sda9 Sep 12 17:42:42.624827 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Sep 12 17:42:42.624880 jq[1543]: true Sep 12 17:42:42.628359 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 17:42:42.630009 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 17:42:42.634922 extend-filesystems[1530]: Resized partition /dev/sda9 Sep 12 17:42:42.638971 tar[1554]: linux-amd64/LICENSE Sep 12 17:42:42.639606 tar[1554]: linux-amd64/helm Sep 12 17:42:42.640359 extend-filesystems[1576]: resize2fs 1.47.2 (1-Jan-2025) Sep 12 17:42:42.641620 update_engine[1541]: I20250912 17:42:42.636794 1541 main.cc:92] Flatcar Update Engine starting Sep 12 17:42:42.642718 (ntainerd)[1566]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 17:42:42.650441 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 12 17:42:42.647350 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Sep 12 17:42:42.647229 dbus-daemon[1527]: [system] SELinux support is enabled Sep 12 17:42:42.650931 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 17:42:42.651005 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 17:42:42.651738 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 17:42:42.651835 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 17:42:42.652802 coreos-metadata[1526]: Sep 12 17:42:42.652 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 12 17:42:42.657248 coreos-metadata[1526]: Sep 12 17:42:42.656 INFO Fetch successful Sep 12 17:42:42.657248 coreos-metadata[1526]: Sep 12 17:42:42.656 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 12 17:42:42.658946 coreos-metadata[1526]: Sep 12 17:42:42.657 INFO Fetch successful Sep 12 17:42:42.661633 systemd[1]: Started update-engine.service - Update Engine. Sep 12 17:42:42.663882 update_engine[1541]: I20250912 17:42:42.663838 1541 update_check_scheduler.cc:74] Next update check in 7m33s Sep 12 17:42:42.666236 jq[1571]: true Sep 12 17:42:42.666728 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 17:42:42.705699 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 12 17:42:42.706068 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 12 17:42:42.715828 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Sep 12 17:42:42.729013 systemd-logind[1540]: New seat seat0. Sep 12 17:42:42.731225 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 17:42:42.747834 kernel: Console: switching to colour dummy device 80x25 Sep 12 17:42:42.751566 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 12 17:42:42.751600 kernel: [drm] features: -context_init Sep 12 17:42:42.756827 kernel: [drm] number of scanouts: 1 Sep 12 17:42:42.756863 kernel: [drm] number of cap sets: 0 Sep 12 17:42:42.758822 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Sep 12 17:42:42.781157 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Sep 12 17:42:42.784271 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 17:42:42.790852 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 12 17:42:42.800409 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 12 17:42:42.803926 kernel: EDAC MC: Ver: 3.0.0 Sep 12 17:42:42.805411 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 17:42:42.805572 bash[1603]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:42:42.807866 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 17:42:42.809966 systemd[1]: Starting sshkeys.service... Sep 12 17:42:42.839832 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 12 17:42:42.871793 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. 
Sep 12 17:42:42.876578 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 17:42:42.880634 extend-filesystems[1576]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 17:42:42.880634 extend-filesystems[1576]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 12 17:42:42.880634 extend-filesystems[1576]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 12 17:42:42.880397 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 17:42:42.889013 extend-filesystems[1530]: Resized filesystem in /dev/sda9 Sep 12 17:42:42.880552 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 17:42:42.932017 coreos-metadata[1624]: Sep 12 17:42:42.930 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 12 17:42:42.932303 coreos-metadata[1624]: Sep 12 17:42:42.932 INFO Fetch successful Sep 12 17:42:42.935880 unknown[1624]: wrote ssh authorized keys file for user: core Sep 12 17:42:42.958597 containerd[1566]: time="2025-09-12T17:42:42Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 17:42:42.961292 containerd[1566]: time="2025-09-12T17:42:42.961261327Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 17:42:42.987739 containerd[1566]: time="2025-09-12T17:42:42.987626915Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.01µs" Sep 12 17:42:42.987739 containerd[1566]: time="2025-09-12T17:42:42.987665197Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 17:42:42.987739 containerd[1566]: time="2025-09-12T17:42:42.987684413Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 17:42:42.992050 containerd[1566]: time="2025-09-12T17:42:42.992015274Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 17:42:42.992050 containerd[1566]: time="2025-09-12T17:42:42.992052283Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 17:42:42.992141 containerd[1566]: time="2025-09-12T17:42:42.992076409Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992141 containerd[1566]: time="2025-09-12T17:42:42.992128927Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992141 containerd[1566]: time="2025-09-12T17:42:42.992138625Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992308 containerd[1566]: time="2025-09-12T17:42:42.992284659Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992308 containerd[1566]: time="2025-09-12T17:42:42.992305919Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992343 
containerd[1566]: time="2025-09-12T17:42:42.992315226Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992343 containerd[1566]: time="2025-09-12T17:42:42.992322860Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992424 containerd[1566]: time="2025-09-12T17:42:42.992403241Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992825 containerd[1566]: time="2025-09-12T17:42:42.992637541Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992825 containerd[1566]: time="2025-09-12T17:42:42.992681753Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 17:42:42.992825 containerd[1566]: time="2025-09-12T17:42:42.992714635Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 17:42:42.992825 containerd[1566]: time="2025-09-12T17:42:42.992759839Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 17:42:42.994843 containerd[1566]: time="2025-09-12T17:42:42.993056035Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 17:42:42.994900 containerd[1566]: time="2025-09-12T17:42:42.994885545Z" level=info msg="metadata content store policy set" policy=shared Sep 12 17:42:42.996238 locksmithd[1577]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 17:42:42.999718 containerd[1566]: time="2025-09-12T17:42:42.999677221Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 12 17:42:42.999778 containerd[1566]: time="2025-09-12T17:42:42.999737554Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 17:42:42.999778 containerd[1566]: time="2025-09-12T17:42:42.999751420Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 17:42:42.999778 containerd[1566]: time="2025-09-12T17:42:42.999761919Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 17:42:42.999778 containerd[1566]: time="2025-09-12T17:42:42.999772680Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 17:42:42.999861 containerd[1566]: time="2025-09-12T17:42:42.999781186Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 17:42:42.999861 containerd[1566]: time="2025-09-12T17:42:42.999790243Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 17:42:42.999861 containerd[1566]: time="2025-09-12T17:42:42.999799821Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 17:42:42.999861 containerd[1566]: time="2025-09-12T17:42:42.999835747Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 17:42:42.999861 containerd[1566]: 
time="2025-09-12T17:42:42.999846949Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 17:42:42.999861 containerd[1566]: time="2025-09-12T17:42:42.999854393Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 17:42:42.999939 containerd[1566]: time="2025-09-12T17:42:42.999864742Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 12 17:42:43.000037 containerd[1566]: time="2025-09-12T17:42:43.000003392Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 17:42:43.000037 containerd[1566]: time="2025-09-12T17:42:43.000026545Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 17:42:43.000085 containerd[1566]: time="2025-09-12T17:42:43.000040612Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 17:42:43.000085 containerd[1566]: time="2025-09-12T17:42:43.000049298Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 17:42:43.000085 containerd[1566]: time="2025-09-12T17:42:43.000057353Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 17:42:43.000085 containerd[1566]: time="2025-09-12T17:42:43.000065218Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 17:42:43.000085 containerd[1566]: time="2025-09-12T17:42:43.000074285Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 17:42:43.000085 containerd[1566]: time="2025-09-12T17:42:43.000082170Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 17:42:43.000171 containerd[1566]: time="2025-09-12T17:42:43.000090556Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 17:42:43.000171 containerd[1566]: time="2025-09-12T17:42:43.000098250Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 17:42:43.000171 containerd[1566]: time="2025-09-12T17:42:43.000106325Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 17:42:43.000171 containerd[1566]: time="2025-09-12T17:42:43.000160287Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 17:42:43.000226 containerd[1566]: time="2025-09-12T17:42:43.000171377Z" level=info msg="Start snapshots syncer" Sep 12 17:42:43.000226 containerd[1566]: time="2025-09-12T17:42:43.000199600Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 17:42:43.000462 containerd[1566]: time="2025-09-12T17:42:43.000431294Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 17:42:43.000587 containerd[1566]: time="2025-09-12T17:42:43.000475598Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 17:42:43.000587 containerd[1566]: time="2025-09-12T17:42:43.000538856Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000649925Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000673449Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000683567Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000708464Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000718152Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000726037Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000734644Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000751886Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 17:42:43.000871 containerd[1566]: 
time="2025-09-12T17:42:43.000760481Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 17:42:43.000871 containerd[1566]: time="2025-09-12T17:42:43.000768216Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001841168Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001864702Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001872978Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001881093Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001929143Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001940234Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001950784Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001963397Z" level=info msg="runtime interface created" Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001968136Z" level=info msg="created NRI interface" Sep 12 17:42:43.001973 containerd[1566]: time="2025-09-12T17:42:43.001974388Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 17:42:43.002229 containerd[1566]: time="2025-09-12T17:42:43.001983294Z" level=info msg="Connect containerd service" Sep 12 17:42:43.002229 containerd[1566]: time="2025-09-12T17:42:43.002004174Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 17:42:43.004491 containerd[1566]: time="2025-09-12T17:42:43.004409203Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 17:42:43.039623 update-ssh-keys[1632]: Updated "/home/core/.ssh/authorized_keys" Sep 12 17:42:43.040851 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 17:42:43.044932 systemd[1]: Finished sshkeys.service. Sep 12 17:42:43.163305 sshd_keygen[1561]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 17:42:43.188308 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 17:42:43.192553 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.201887155Z" level=info msg="Start subscribing containerd event" Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.201944843Z" level=info msg="Start recovering state" Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.202062183Z" level=info msg="Start event monitor" Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.202079134Z" level=info msg="Start cni network conf syncer for default" Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.202090556Z" level=info msg="Start streaming server" Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.202101777Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.202109431Z" level=info msg="runtime interface starting up..." Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.202116535Z" level=info msg="starting plugins..." Sep 12 17:42:43.204401 containerd[1566]: time="2025-09-12T17:42:43.202131533Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 17:42:43.205339 containerd[1566]: time="2025-09-12T17:42:43.204670303Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 17:42:43.205339 containerd[1566]: time="2025-09-12T17:42:43.204723523Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 17:42:43.204872 systemd[1]: Started containerd.service - containerd container runtime. Sep 12 17:42:43.205661 containerd[1566]: time="2025-09-12T17:42:43.205536297Z" level=info msg="containerd successfully booted in 0.247259s" Sep 12 17:42:43.212896 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 17:42:43.225454 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 17:42:43.233882 systemd-logind[1540]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 17:42:43.240679 systemd-logind[1540]: Watching system buttons on /dev/input/event3 (Power Button) Sep 12 17:42:43.243067 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:42:43.247461 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 17:42:43.251054 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 17:42:43.253397 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 17:42:43.273030 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 17:42:43.275459 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 17:42:43.278273 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 12 17:42:43.283904 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 17:42:43.284513 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 17:42:43.297906 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 17:42:43.298056 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 17:42:43.301663 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 17:42:43.309069 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 17:42:43.378256 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Sep 12 17:42:43.385789 tar[1554]: linux-amd64/README.md Sep 12 17:42:43.399476 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 17:42:43.644086 systemd-networkd[1466]: eth1: Gained IPv6LL Sep 12 17:42:43.644911 systemd-timesyncd[1459]: Network configuration changed, trying to establish connection. Sep 12 17:42:43.647531 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 17:42:43.648685 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 17:42:43.658623 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:43.661227 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 17:42:43.691544 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 17:42:43.836032 systemd-networkd[1466]: eth0: Gained IPv6LL Sep 12 17:42:43.836677 systemd-timesyncd[1459]: Network configuration changed, trying to establish connection. Sep 12 17:42:44.735154 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:44.736630 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 17:42:44.740990 systemd[1]: Startup finished in 3.035s (kernel) + 5.748s (initrd) + 4.403s (userspace) = 13.187s. Sep 12 17:42:44.744385 (kubelet)[1709]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:42:45.545237 kubelet[1709]: E0912 17:42:45.545150 1709 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:42:45.548537 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:42:45.548793 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:42:45.549402 systemd[1]: kubelet.service: Consumed 1.260s CPU time, 265.6M memory peak. Sep 12 17:42:51.139599 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 17:42:51.141615 systemd[1]: Started sshd@0-95.216.139.29:22-139.178.68.195:44414.service - OpenSSH per-connection server daemon (139.178.68.195:44414). Sep 12 17:42:52.252855 sshd[1721]: Accepted publickey for core from 139.178.68.195 port 44414 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:42:52.254444 sshd-session[1721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:52.260208 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 17:42:52.261065 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 17:42:52.268843 systemd-logind[1540]: New session 1 of user core. Sep 12 17:42:52.274366 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 17:42:52.276650 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 12 17:42:52.285157 (systemd)[1726]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 17:42:52.287354 systemd-logind[1540]: New session c1 of user core. Sep 12 17:42:52.404664 systemd[1726]: Queued start job for default target default.target. Sep 12 17:42:52.414523 systemd[1726]: Created slice app.slice - User Application Slice. 
Sep 12 17:42:52.414544 systemd[1726]: Reached target paths.target - Paths. Sep 12 17:42:52.414576 systemd[1726]: Reached target timers.target - Timers. Sep 12 17:42:52.415508 systemd[1726]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 17:42:52.424227 systemd[1726]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 17:42:52.424260 systemd[1726]: Reached target sockets.target - Sockets. Sep 12 17:42:52.424294 systemd[1726]: Reached target basic.target - Basic System. Sep 12 17:42:52.424319 systemd[1726]: Reached target default.target - Main User Target. Sep 12 17:42:52.424338 systemd[1726]: Startup finished in 131ms. Sep 12 17:42:52.424403 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 17:42:52.429937 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 17:42:53.151244 systemd[1]: Started sshd@1-95.216.139.29:22-139.178.68.195:44418.service - OpenSSH per-connection server daemon (139.178.68.195:44418). Sep 12 17:42:54.150617 sshd[1737]: Accepted publickey for core from 139.178.68.195 port 44418 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:42:54.152189 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:54.157800 systemd-logind[1540]: New session 2 of user core. Sep 12 17:42:54.163947 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 12 17:42:54.825288 sshd[1740]: Connection closed by 139.178.68.195 port 44418 Sep 12 17:42:54.825881 sshd-session[1737]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:54.829499 systemd-logind[1540]: Session 2 logged out. Waiting for processes to exit. Sep 12 17:42:54.830309 systemd[1]: sshd@1-95.216.139.29:22-139.178.68.195:44418.service: Deactivated successfully. Sep 12 17:42:54.832305 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 17:42:54.834243 systemd-logind[1540]: Removed session 2. Sep 12 17:42:54.998374 systemd[1]: Started sshd@2-95.216.139.29:22-139.178.68.195:44422.service - OpenSSH per-connection server daemon (139.178.68.195:44422). Sep 12 17:42:55.549373 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 12 17:42:55.551274 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:42:55.685324 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:42:55.702235 (kubelet)[1757]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:42:55.749356 kubelet[1757]: E0912 17:42:55.749267 1757 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:42:55.753056 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:42:55.753194 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:42:55.753461 systemd[1]: kubelet.service: Consumed 136ms CPU time, 109.1M memory peak. 
Sep 12 17:42:55.980244 sshd[1746]: Accepted publickey for core from 139.178.68.195 port 44422 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:42:55.981590 sshd-session[1746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:55.986730 systemd-logind[1540]: New session 3 of user core. Sep 12 17:42:55.991970 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 17:42:56.649555 sshd[1765]: Connection closed by 139.178.68.195 port 44422 Sep 12 17:42:56.650097 sshd-session[1746]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:56.653927 systemd[1]: sshd@2-95.216.139.29:22-139.178.68.195:44422.service: Deactivated successfully. Sep 12 17:42:56.655687 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 17:42:56.656493 systemd-logind[1540]: Session 3 logged out. Waiting for processes to exit. Sep 12 17:42:56.657786 systemd-logind[1540]: Removed session 3. Sep 12 17:42:56.816102 systemd[1]: Started sshd@3-95.216.139.29:22-139.178.68.195:44430.service - OpenSSH per-connection server daemon (139.178.68.195:44430). Sep 12 17:42:57.793142 sshd[1771]: Accepted publickey for core from 139.178.68.195 port 44430 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:42:57.794651 sshd-session[1771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:57.800415 systemd-logind[1540]: New session 4 of user core. Sep 12 17:42:57.810999 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 12 17:42:58.465859 sshd[1774]: Connection closed by 139.178.68.195 port 44430 Sep 12 17:42:58.466519 sshd-session[1771]: pam_unix(sshd:session): session closed for user core Sep 12 17:42:58.470109 systemd-logind[1540]: Session 4 logged out. Waiting for processes to exit. Sep 12 17:42:58.470686 systemd[1]: sshd@3-95.216.139.29:22-139.178.68.195:44430.service: Deactivated successfully. Sep 12 17:42:58.472478 systemd[1]: session-4.scope: Deactivated successfully. Sep 12 17:42:58.474118 systemd-logind[1540]: Removed session 4. Sep 12 17:42:58.671785 systemd[1]: Started sshd@4-95.216.139.29:22-139.178.68.195:44440.service - OpenSSH per-connection server daemon (139.178.68.195:44440). Sep 12 17:42:59.766467 sshd[1780]: Accepted publickey for core from 139.178.68.195 port 44440 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:42:59.768117 sshd-session[1780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:42:59.773114 systemd-logind[1540]: New session 5 of user core. Sep 12 17:42:59.780989 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 12 17:43:00.348358 sudo[1784]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 12 17:43:00.348577 sudo[1784]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:43:00.363977 sudo[1784]: pam_unix(sudo:session): session closed for user root Sep 12 17:43:00.540397 sshd[1783]: Connection closed by 139.178.68.195 port 44440 Sep 12 17:43:00.541376 sshd-session[1780]: pam_unix(sshd:session): session closed for user core Sep 12 17:43:00.545451 systemd[1]: sshd@4-95.216.139.29:22-139.178.68.195:44440.service: Deactivated successfully. Sep 12 17:43:00.548083 systemd[1]: session-5.scope: Deactivated successfully. Sep 12 17:43:00.550114 systemd-logind[1540]: Session 5 logged out. Waiting for processes to exit. Sep 12 17:43:00.552356 systemd-logind[1540]: Removed session 5. 
Sep 12 17:43:00.724983 systemd[1]: Started sshd@5-95.216.139.29:22-139.178.68.195:41372.service - OpenSSH per-connection server daemon (139.178.68.195:41372). Sep 12 17:43:01.808971 sshd[1790]: Accepted publickey for core from 139.178.68.195 port 41372 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:43:01.810198 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:43:01.814656 systemd-logind[1540]: New session 6 of user core. Sep 12 17:43:01.821944 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 12 17:43:02.379299 sudo[1795]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 12 17:43:02.379617 sudo[1795]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:43:02.385022 sudo[1795]: pam_unix(sudo:session): session closed for user root Sep 12 17:43:02.391015 sudo[1794]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 12 17:43:02.391333 sudo[1794]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:43:02.402310 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 12 17:43:02.438356 augenrules[1817]: No rules Sep 12 17:43:02.439343 systemd[1]: audit-rules.service: Deactivated successfully. Sep 12 17:43:02.439526 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 12 17:43:02.440829 sudo[1794]: pam_unix(sudo:session): session closed for user root Sep 12 17:43:02.615694 sshd[1793]: Connection closed by 139.178.68.195 port 41372 Sep 12 17:43:02.616369 sshd-session[1790]: pam_unix(sshd:session): session closed for user core Sep 12 17:43:02.619522 systemd[1]: sshd@5-95.216.139.29:22-139.178.68.195:41372.service: Deactivated successfully. Sep 12 17:43:02.621362 systemd[1]: session-6.scope: Deactivated successfully. Sep 12 17:43:02.622650 systemd-logind[1540]: Session 6 logged out. Waiting for processes to exit. Sep 12 17:43:02.624154 systemd-logind[1540]: Removed session 6. Sep 12 17:43:02.801248 systemd[1]: Started sshd@6-95.216.139.29:22-139.178.68.195:41382.service - OpenSSH per-connection server daemon (139.178.68.195:41382). Sep 12 17:43:03.892494 sshd[1826]: Accepted publickey for core from 139.178.68.195 port 41382 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:43:03.893981 sshd-session[1826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:43:03.899082 systemd-logind[1540]: New session 7 of user core. Sep 12 17:43:03.903934 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 12 17:43:04.461257 sudo[1830]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 12 17:43:04.461582 sudo[1830]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 12 17:43:04.746033 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Sep 12 17:43:04.772190 (dockerd)[1848]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 12 17:43:04.988824 dockerd[1848]: time="2025-09-12T17:43:04.988742377Z" level=info msg="Starting up" Sep 12 17:43:04.989655 dockerd[1848]: time="2025-09-12T17:43:04.989612639Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 12 17:43:05.001273 dockerd[1848]: time="2025-09-12T17:43:05.000940940Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 12 17:43:05.015363 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2381401734-merged.mount: Deactivated successfully. Sep 12 17:43:05.027092 systemd[1]: var-lib-docker-metacopy\x2dcheck1208901009-merged.mount: Deactivated successfully. Sep 12 17:43:05.044118 dockerd[1848]: time="2025-09-12T17:43:05.044096832Z" level=info msg="Loading containers: start." Sep 12 17:43:05.055283 kernel: Initializing XFRM netlink socket Sep 12 17:43:05.211070 systemd-timesyncd[1459]: Network configuration changed, trying to establish connection. Sep 12 17:43:05.246735 systemd-networkd[1466]: docker0: Link UP Sep 12 17:43:05.250561 dockerd[1848]: time="2025-09-12T17:43:05.250522561Z" level=info msg="Loading containers: done." Sep 12 17:43:05.261013 systemd-timesyncd[1459]: Contacted time server 194.36.144.87:123 (2.flatcar.pool.ntp.org). Sep 12 17:43:05.261093 systemd-timesyncd[1459]: Initial clock synchronization to Fri 2025-09-12 17:43:05.411307 UTC. Sep 12 17:43:05.265658 dockerd[1848]: time="2025-09-12T17:43:05.265619599Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 12 17:43:05.265755 dockerd[1848]: time="2025-09-12T17:43:05.265696013Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 12 17:43:05.265755 dockerd[1848]: time="2025-09-12T17:43:05.265750585Z" level=info msg="Initializing buildkit" Sep 12 17:43:05.283068 dockerd[1848]: time="2025-09-12T17:43:05.283040396Z" level=info msg="Completed buildkit initialization" Sep 12 17:43:05.291361 dockerd[1848]: time="2025-09-12T17:43:05.291321484Z" level=info msg="Daemon has completed initialization" Sep 12 17:43:05.291523 dockerd[1848]: time="2025-09-12T17:43:05.291496181Z" level=info msg="API listen on /run/docker.sock" Sep 12 17:43:05.291560 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 12 17:43:05.799310 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 12 17:43:05.800900 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:43:05.917355 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 17:43:05.927999 (kubelet)[2066]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:43:05.965919 kubelet[2066]: E0912 17:43:05.965870 2066 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:43:05.968158 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:43:05.968270 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:43:05.968666 systemd[1]: kubelet.service: Consumed 126ms CPU time, 110.6M memory peak. Sep 12 17:43:06.011738 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3464010937-merged.mount: Deactivated successfully. Sep 12 17:43:06.377766 containerd[1566]: time="2025-09-12T17:43:06.377692713Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\"" Sep 12 17:43:06.900489 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount185702474.mount: Deactivated successfully. Sep 12 17:43:07.916175 containerd[1566]: time="2025-09-12T17:43:07.915264782Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:07.916175 containerd[1566]: time="2025-09-12T17:43:07.916139966Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.9: active requests=0, bytes read=28838016" Sep 12 17:43:07.916883 containerd[1566]: time="2025-09-12T17:43:07.916828564Z" level=info msg="ImageCreate event name:\"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:07.918987 containerd[1566]: time="2025-09-12T17:43:07.918967734Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:07.919698 containerd[1566]: time="2025-09-12T17:43:07.919655793Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.9\" with image id \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.9\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6df11cc2ad9679b1117be34d3a0230add88bc0a08fd7a3ebc26b680575e8de97\", size \"28834515\" in 1.541920958s" Sep 12 17:43:07.919741 containerd[1566]: time="2025-09-12T17:43:07.919701150Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.9\" returns image reference \"sha256:abd2b525baf428ffb8b8b7d1e09761dc5cdb7ed0c7896a9427e29e84f8eafc59\"" Sep 12 17:43:07.920231 containerd[1566]: time="2025-09-12T17:43:07.920201850Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\"" Sep 12 17:43:09.033945 containerd[1566]: time="2025-09-12T17:43:09.033894441Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:09.034851 containerd[1566]: time="2025-09-12T17:43:09.034762064Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.9: active requests=0, bytes read=24787049" Sep 12 17:43:09.035755 containerd[1566]: 
time="2025-09-12T17:43:09.035722553Z" level=info msg="ImageCreate event name:\"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:09.037754 containerd[1566]: time="2025-09-12T17:43:09.037728112Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:09.038409 containerd[1566]: time="2025-09-12T17:43:09.038390225Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.9\" with image id \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.9\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:243c4b8e3bce271fcb1b78008ab996ab6976b1a20096deac08338fcd17979922\", size \"26421706\" in 1.118010368s" Sep 12 17:43:09.038477 containerd[1566]: time="2025-09-12T17:43:09.038465106Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.9\" returns image reference \"sha256:0debe32fbb7223500fcf8c312f2a568a5abd3ed9274d8ec6780cfb30b8861e91\"" Sep 12 17:43:09.039018 containerd[1566]: time="2025-09-12T17:43:09.038997912Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\"" Sep 12 17:43:09.988646 containerd[1566]: time="2025-09-12T17:43:09.988578961Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:09.989443 containerd[1566]: time="2025-09-12T17:43:09.989357094Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.9: active requests=0, bytes read=19176311" Sep 12 17:43:09.990190 containerd[1566]: time="2025-09-12T17:43:09.990168330Z" level=info msg="ImageCreate event name:\"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:09.992271 containerd[1566]: time="2025-09-12T17:43:09.992237670Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:09.993021 containerd[1566]: time="2025-09-12T17:43:09.992903762Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.9\" with image id \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.9\", repo digest \"registry.k8s.io/kube-scheduler@sha256:50c49520dbd0e8b4076b6a5c77d8014df09ea3d59a73e8bafd2678d51ebb92d5\", size \"20810986\" in 953.88352ms" Sep 12 17:43:09.993021 containerd[1566]: time="2025-09-12T17:43:09.992927948Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.9\" returns image reference \"sha256:6934c23b154fcb9bf54ed5913782de746735a49f4daa4732285915050cd44ad5\"" Sep 12 17:43:09.993627 containerd[1566]: time="2025-09-12T17:43:09.993597427Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\"" Sep 12 17:43:10.957215 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3213807094.mount: Deactivated successfully. 
Sep 12 17:43:11.280319 containerd[1566]: time="2025-09-12T17:43:11.280191446Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:11.281264 containerd[1566]: time="2025-09-12T17:43:11.281158683Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.9: active requests=0, bytes read=30924234" Sep 12 17:43:11.281967 containerd[1566]: time="2025-09-12T17:43:11.281931968Z" level=info msg="ImageCreate event name:\"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:11.283553 containerd[1566]: time="2025-09-12T17:43:11.283515447Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:11.284241 containerd[1566]: time="2025-09-12T17:43:11.283963187Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.9\" with image id \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\", repo tag \"registry.k8s.io/kube-proxy:v1.32.9\", repo digest \"registry.k8s.io/kube-proxy@sha256:886af02535dc34886e4618b902f8c140d89af57233a245621d29642224516064\", size \"30923225\" in 1.290283821s" Sep 12 17:43:11.284241 containerd[1566]: time="2025-09-12T17:43:11.283996153Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.9\" returns image reference \"sha256:fa3fdca615a501743d8deb39729a96e731312aac8d96accec061d5265360332f\"" Sep 12 17:43:11.284479 containerd[1566]: time="2025-09-12T17:43:11.284423742Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 12 17:43:11.768609 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount771516830.mount: Deactivated successfully. 
Sep 12 17:43:12.518530 containerd[1566]: time="2025-09-12T17:43:12.518479660Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:12.519446 containerd[1566]: time="2025-09-12T17:43:12.519407895Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335" Sep 12 17:43:12.520426 containerd[1566]: time="2025-09-12T17:43:12.520039970Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:12.523144 containerd[1566]: time="2025-09-12T17:43:12.523112935Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:12.525421 containerd[1566]: time="2025-09-12T17:43:12.525385077Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.240931413s" Sep 12 17:43:12.525470 containerd[1566]: time="2025-09-12T17:43:12.525424945Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 12 17:43:12.526823 containerd[1566]: time="2025-09-12T17:43:12.526778996Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 12 17:43:12.964417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4003098515.mount: Deactivated successfully. 
Sep 12 17:43:12.971089 containerd[1566]: time="2025-09-12T17:43:12.971036126Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:43:12.972050 containerd[1566]: time="2025-09-12T17:43:12.972000511Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Sep 12 17:43:12.972893 containerd[1566]: time="2025-09-12T17:43:12.972792028Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:43:12.974762 containerd[1566]: time="2025-09-12T17:43:12.974698783Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 12 17:43:12.976012 containerd[1566]: time="2025-09-12T17:43:12.975593414Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 448.783987ms" Sep 12 17:43:12.976012 containerd[1566]: time="2025-09-12T17:43:12.975623663Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 12 17:43:12.976477 containerd[1566]: time="2025-09-12T17:43:12.976431214Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 12 17:43:13.429493 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1346462684.mount: Deactivated successfully. 
Sep 12 17:43:14.763752 containerd[1566]: time="2025-09-12T17:43:14.763693492Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:14.764743 containerd[1566]: time="2025-09-12T17:43:14.764636382Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682132" Sep 12 17:43:14.765563 containerd[1566]: time="2025-09-12T17:43:14.765526626Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:14.768238 containerd[1566]: time="2025-09-12T17:43:14.768188164Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:14.769110 containerd[1566]: time="2025-09-12T17:43:14.769083027Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.792624053s" Sep 12 17:43:14.769187 containerd[1566]: time="2025-09-12T17:43:14.769173416Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 12 17:43:16.049744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 12 17:43:16.056112 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:43:16.175179 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:43:16.181002 (kubelet)[2285]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 17:43:16.215599 kubelet[2285]: E0912 17:43:16.215556 2285 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 17:43:16.217154 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 17:43:16.217267 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 17:43:16.217663 systemd[1]: kubelet.service: Consumed 123ms CPU time, 107.9M memory peak. Sep 12 17:43:17.150866 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:43:17.151219 systemd[1]: kubelet.service: Consumed 123ms CPU time, 107.9M memory peak. Sep 12 17:43:17.154309 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:43:17.192754 systemd[1]: Reload requested from client PID 2299 ('systemctl') (unit session-7.scope)... Sep 12 17:43:17.192785 systemd[1]: Reloading... Sep 12 17:43:17.280883 zram_generator::config[2344]: No configuration found. Sep 12 17:43:17.444592 systemd[1]: Reloading finished in 251 ms. Sep 12 17:43:17.499857 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 12 17:43:17.499944 systemd[1]: kubelet.service: Failed with result 'signal'. 
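The repeated kubelet.service failures above are the expected pre-bootstrap behaviour: the unit keeps restarting until kubeadm writes /var/lib/kubelet/config.yaml during init/join, at which point the ENOENT error disappears. A minimal Go sketch of the same file check, purely illustrative (the path comes from the log; the helper logic is hypothetical):

package main

import (
	"errors"
	"fmt"
	"io/fs"
	"os"
)

// kubeadm writes this file during "kubeadm init" / "kubeadm join";
// until then the kubelet exits with the ENOENT error seen in the log.
const kubeletConfig = "/var/lib/kubelet/config.yaml"

func main() {
	_, err := os.Stat(kubeletConfig)
	switch {
	case err == nil:
		fmt.Println("kubelet config present; the restart loop should stop")
	case errors.Is(err, fs.ErrNotExist):
		fmt.Println("kubelet config missing; kubelet.service will keep restarting until kubeadm writes it")
	default:
		fmt.Fprintln(os.Stderr, "stat failed:", err)
		os.Exit(1)
	}
}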
Sep 12 17:43:17.500184 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:43:17.500235 systemd[1]: kubelet.service: Consumed 81ms CPU time, 98.3M memory peak. Sep 12 17:43:17.502519 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:43:17.612525 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:43:17.621095 (kubelet)[2398]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:43:17.653165 kubelet[2398]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:43:17.653165 kubelet[2398]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 17:43:17.653165 kubelet[2398]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:43:17.653594 kubelet[2398]: I0912 17:43:17.653201 2398 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:43:17.889590 kubelet[2398]: I0912 17:43:17.889524 2398 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 17:43:17.889590 kubelet[2398]: I0912 17:43:17.889560 2398 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:43:17.889924 kubelet[2398]: I0912 17:43:17.889896 2398 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 17:43:17.929870 kubelet[2398]: E0912 17:43:17.928779 2398 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://95.216.139.29:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:43:17.931883 kubelet[2398]: I0912 17:43:17.931608 2398 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:43:17.954178 kubelet[2398]: I0912 17:43:17.952941 2398 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:43:17.963652 kubelet[2398]: I0912 17:43:17.963621 2398 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 17:43:17.971465 kubelet[2398]: I0912 17:43:17.971368 2398 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:43:17.972123 kubelet[2398]: I0912 17:43:17.971733 2398 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-1-0-d-1f6ac31256","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:43:17.974841 kubelet[2398]: I0912 17:43:17.974791 2398 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:43:17.974934 kubelet[2398]: I0912 17:43:17.974853 2398 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 17:43:17.976882 kubelet[2398]: I0912 17:43:17.976830 2398 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:43:17.983122 kubelet[2398]: I0912 17:43:17.982962 2398 kubelet.go:446] "Attempting to sync node with API server" Sep 12 17:43:17.983122 kubelet[2398]: I0912 17:43:17.983064 2398 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:43:17.983122 kubelet[2398]: I0912 17:43:17.983101 2398 kubelet.go:352] "Adding apiserver pod source" Sep 12 17:43:17.983122 kubelet[2398]: I0912 17:43:17.983116 2398 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:43:17.991454 kubelet[2398]: W0912 17:43:17.991401 2398 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://95.216.139.29:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-1-0-d-1f6ac31256&limit=500&resourceVersion=0": dial tcp 95.216.139.29:6443: connect: connection refused Sep 12 17:43:17.991648 kubelet[2398]: E0912 17:43:17.991606 2398 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://95.216.139.29:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-1-0-d-1f6ac31256&limit=500&resourceVersion=0\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:43:17.992037 
kubelet[2398]: I0912 17:43:17.991867 2398 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:43:17.996730 kubelet[2398]: I0912 17:43:17.996707 2398 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:43:17.997888 kubelet[2398]: W0912 17:43:17.997865 2398 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 12 17:43:18.003307 kubelet[2398]: I0912 17:43:18.003254 2398 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 17:43:18.003707 kubelet[2398]: I0912 17:43:18.003461 2398 server.go:1287] "Started kubelet" Sep 12 17:43:18.009649 kubelet[2398]: I0912 17:43:18.009626 2398 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:43:18.016776 kubelet[2398]: W0912 17:43:18.016702 2398 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://95.216.139.29:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 95.216.139.29:6443: connect: connection refused Sep 12 17:43:18.016885 kubelet[2398]: E0912 17:43:18.016773 2398 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://95.216.139.29:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:43:18.016989 kubelet[2398]: I0912 17:43:18.016928 2398 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:43:18.018036 kubelet[2398]: I0912 17:43:18.017906 2398 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:43:18.021145 kubelet[2398]: I0912 17:43:18.021071 2398 server.go:479] "Adding debug handlers to kubelet server" Sep 12 17:43:18.021965 kubelet[2398]: I0912 17:43:18.021950 2398 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:43:18.024965 kubelet[2398]: I0912 17:43:18.024933 2398 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:43:18.027268 kubelet[2398]: E0912 17:43:18.025894 2398 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://95.216.139.29:6443/api/v1/namespaces/default/events\": dial tcp 95.216.139.29:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426-1-0-d-1f6ac31256.186499ed0e0b8e9a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426-1-0-d-1f6ac31256,UID:ci-4426-1-0-d-1f6ac31256,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426-1-0-d-1f6ac31256,},FirstTimestamp:2025-09-12 17:43:18.00327337 +0000 UTC m=+0.378820828,LastTimestamp:2025-09-12 17:43:18.00327337 +0000 UTC m=+0.378820828,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426-1-0-d-1f6ac31256,}" Sep 12 17:43:18.027491 kubelet[2398]: E0912 17:43:18.027452 2398 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:43:18.027609 kubelet[2398]: I0912 17:43:18.027598 2398 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 17:43:18.027881 kubelet[2398]: E0912 17:43:18.027867 2398 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" Sep 12 17:43:18.028442 kubelet[2398]: E0912 17:43:18.028394 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.216.139.29:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-1-0-d-1f6ac31256?timeout=10s\": dial tcp 95.216.139.29:6443: connect: connection refused" interval="200ms" Sep 12 17:43:18.028711 kubelet[2398]: I0912 17:43:18.028696 2398 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:43:18.028845 kubelet[2398]: I0912 17:43:18.028801 2398 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:43:18.029138 kubelet[2398]: I0912 17:43:18.029114 2398 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:43:18.029326 kubelet[2398]: W0912 17:43:18.029299 2398 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://95.216.139.29:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 95.216.139.29:6443: connect: connection refused Sep 12 17:43:18.029441 kubelet[2398]: E0912 17:43:18.029380 2398 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://95.216.139.29:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:43:18.032839 kubelet[2398]: I0912 17:43:18.031841 2398 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:43:18.032839 kubelet[2398]: I0912 17:43:18.031854 2398 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:43:18.044437 kubelet[2398]: I0912 17:43:18.044410 2398 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:43:18.044437 kubelet[2398]: I0912 17:43:18.044427 2398 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:43:18.044437 kubelet[2398]: I0912 17:43:18.044441 2398 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:43:18.046795 kubelet[2398]: I0912 17:43:18.046774 2398 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:43:18.048196 kubelet[2398]: I0912 17:43:18.048172 2398 policy_none.go:49] "None policy: Start" Sep 12 17:43:18.048196 kubelet[2398]: I0912 17:43:18.048194 2398 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:43:18.048268 kubelet[2398]: I0912 17:43:18.048206 2398 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:43:18.049106 kubelet[2398]: I0912 17:43:18.049091 2398 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 17:43:18.049208 kubelet[2398]: I0912 17:43:18.049198 2398 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 17:43:18.049290 kubelet[2398]: I0912 17:43:18.049280 2398 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 12 17:43:18.049881 kubelet[2398]: I0912 17:43:18.049870 2398 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 17:43:18.050101 kubelet[2398]: E0912 17:43:18.050070 2398 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:43:18.050506 kubelet[2398]: W0912 17:43:18.050453 2398 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://95.216.139.29:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 95.216.139.29:6443: connect: connection refused Sep 12 17:43:18.050506 kubelet[2398]: E0912 17:43:18.050502 2398 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://95.216.139.29:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:43:18.055751 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 17:43:18.066272 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 17:43:18.070140 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 17:43:18.076440 kubelet[2398]: I0912 17:43:18.076411 2398 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:43:18.076594 kubelet[2398]: I0912 17:43:18.076573 2398 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:43:18.076630 kubelet[2398]: I0912 17:43:18.076589 2398 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:43:18.076829 kubelet[2398]: I0912 17:43:18.076758 2398 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:43:18.081718 kubelet[2398]: E0912 17:43:18.081689 2398 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 12 17:43:18.081791 kubelet[2398]: E0912 17:43:18.081763 2398 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426-1-0-d-1f6ac31256\" not found" Sep 12 17:43:18.161498 systemd[1]: Created slice kubepods-burstable-pod87a2cd2967fee7bb49b24c9454693116.slice - libcontainer container kubepods-burstable-pod87a2cd2967fee7bb49b24c9454693116.slice. 
Sep 12 17:43:18.178187 kubelet[2398]: I0912 17:43:18.178117 2398 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.178575 kubelet[2398]: E0912 17:43:18.178537 2398 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.216.139.29:6443/api/v1/nodes\": dial tcp 95.216.139.29:6443: connect: connection refused" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.180380 kubelet[2398]: E0912 17:43:18.180209 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.183738 systemd[1]: Created slice kubepods-burstable-pod40343e56910b922452cd39e383691eaf.slice - libcontainer container kubepods-burstable-pod40343e56910b922452cd39e383691eaf.slice. Sep 12 17:43:18.185840 kubelet[2398]: E0912 17:43:18.185714 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.188172 systemd[1]: Created slice kubepods-burstable-pod13902fa3928ab2b3ade0357f1fa5d06c.slice - libcontainer container kubepods-burstable-pod13902fa3928ab2b3ade0357f1fa5d06c.slice. Sep 12 17:43:18.189762 kubelet[2398]: E0912 17:43:18.189734 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229489 kubelet[2398]: I0912 17:43:18.229306 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/40343e56910b922452cd39e383691eaf-ca-certs\") pod \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" (UID: \"40343e56910b922452cd39e383691eaf\") " pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229489 kubelet[2398]: I0912 17:43:18.229358 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/40343e56910b922452cd39e383691eaf-k8s-certs\") pod \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" (UID: \"40343e56910b922452cd39e383691eaf\") " pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229489 kubelet[2398]: E0912 17:43:18.229360 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.216.139.29:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-1-0-d-1f6ac31256?timeout=10s\": dial tcp 95.216.139.29:6443: connect: connection refused" interval="400ms" Sep 12 17:43:18.229489 kubelet[2398]: I0912 17:43:18.229401 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229489 kubelet[2398]: I0912 17:43:18.229457 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-k8s-certs\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " 
pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229731 kubelet[2398]: I0912 17:43:18.229503 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-kubeconfig\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229731 kubelet[2398]: I0912 17:43:18.229530 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229731 kubelet[2398]: I0912 17:43:18.229583 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/87a2cd2967fee7bb49b24c9454693116-kubeconfig\") pod \"kube-scheduler-ci-4426-1-0-d-1f6ac31256\" (UID: \"87a2cd2967fee7bb49b24c9454693116\") " pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229731 kubelet[2398]: I0912 17:43:18.229608 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/40343e56910b922452cd39e383691eaf-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" (UID: \"40343e56910b922452cd39e383691eaf\") " pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.229731 kubelet[2398]: I0912 17:43:18.229661 2398 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-ca-certs\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.381536 kubelet[2398]: I0912 17:43:18.381431 2398 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.381818 kubelet[2398]: E0912 17:43:18.381776 2398 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.216.139.29:6443/api/v1/nodes\": dial tcp 95.216.139.29:6443: connect: connection refused" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.481446 containerd[1566]: time="2025-09-12T17:43:18.481311170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-1-0-d-1f6ac31256,Uid:87a2cd2967fee7bb49b24c9454693116,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:18.486300 containerd[1566]: time="2025-09-12T17:43:18.486272722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426-1-0-d-1f6ac31256,Uid:40343e56910b922452cd39e383691eaf,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:18.491977 containerd[1566]: time="2025-09-12T17:43:18.491947871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-1-0-d-1f6ac31256,Uid:13902fa3928ab2b3ade0357f1fa5d06c,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:18.564892 containerd[1566]: time="2025-09-12T17:43:18.564756583Z" level=info msg="connecting to shim 
a6313224f49dacb79e45b38180b09d94e738d70a6bc28334ba882dee171cb546" address="unix:///run/containerd/s/dda3b94f36fdbff6b2b8722621956f017290987fc545cf008075073ed573eb0e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:18.570455 containerd[1566]: time="2025-09-12T17:43:18.570052804Z" level=info msg="connecting to shim 71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab" address="unix:///run/containerd/s/e9100432818e646424806e69d4eabec5126b8f058f2d7c1c14c44433cf4ef6d1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:18.576924 containerd[1566]: time="2025-09-12T17:43:18.576883109Z" level=info msg="connecting to shim d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5" address="unix:///run/containerd/s/3b94ea257c106b0d94ec53d6deff60afb7805199cf5c97401c8578283753c782" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:18.629989 kubelet[2398]: E0912 17:43:18.629957 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.216.139.29:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-1-0-d-1f6ac31256?timeout=10s\": dial tcp 95.216.139.29:6443: connect: connection refused" interval="800ms" Sep 12 17:43:18.647975 systemd[1]: Started cri-containerd-a6313224f49dacb79e45b38180b09d94e738d70a6bc28334ba882dee171cb546.scope - libcontainer container a6313224f49dacb79e45b38180b09d94e738d70a6bc28334ba882dee171cb546. Sep 12 17:43:18.653587 systemd[1]: Started cri-containerd-71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab.scope - libcontainer container 71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab. Sep 12 17:43:18.655684 systemd[1]: Started cri-containerd-d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5.scope - libcontainer container d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5. 
Sep 12 17:43:18.744966 containerd[1566]: time="2025-09-12T17:43:18.744786284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426-1-0-d-1f6ac31256,Uid:40343e56910b922452cd39e383691eaf,Namespace:kube-system,Attempt:0,} returns sandbox id \"a6313224f49dacb79e45b38180b09d94e738d70a6bc28334ba882dee171cb546\"" Sep 12 17:43:18.746614 containerd[1566]: time="2025-09-12T17:43:18.746578646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-1-0-d-1f6ac31256,Uid:13902fa3928ab2b3ade0357f1fa5d06c,Namespace:kube-system,Attempt:0,} returns sandbox id \"71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab\"" Sep 12 17:43:18.754017 containerd[1566]: time="2025-09-12T17:43:18.753799072Z" level=info msg="CreateContainer within sandbox \"a6313224f49dacb79e45b38180b09d94e738d70a6bc28334ba882dee171cb546\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 17:43:18.754629 containerd[1566]: time="2025-09-12T17:43:18.754602374Z" level=info msg="CreateContainer within sandbox \"71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 17:43:18.767681 containerd[1566]: time="2025-09-12T17:43:18.767392211Z" level=info msg="Container 225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:18.780392 containerd[1566]: time="2025-09-12T17:43:18.780341016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-1-0-d-1f6ac31256,Uid:87a2cd2967fee7bb49b24c9454693116,Namespace:kube-system,Attempt:0,} returns sandbox id \"d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5\"" Sep 12 17:43:18.782628 containerd[1566]: time="2025-09-12T17:43:18.782452650Z" level=info msg="CreateContainer within sandbox \"d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 17:43:18.785133 kubelet[2398]: I0912 17:43:18.784879 2398 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.785735 kubelet[2398]: E0912 17:43:18.785502 2398 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.216.139.29:6443/api/v1/nodes\": dial tcp 95.216.139.29:6443: connect: connection refused" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:18.786377 containerd[1566]: time="2025-09-12T17:43:18.786346558Z" level=info msg="Container 7e1fc083637676a9b587d52ef80cb4c859de49e72d50660af0f995d3f774bc15: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:18.789312 containerd[1566]: time="2025-09-12T17:43:18.789265265Z" level=info msg="CreateContainer within sandbox \"71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2\"" Sep 12 17:43:18.790836 containerd[1566]: time="2025-09-12T17:43:18.790008187Z" level=info msg="StartContainer for \"225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2\"" Sep 12 17:43:18.791108 containerd[1566]: time="2025-09-12T17:43:18.791085162Z" level=info msg="connecting to shim 225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2" address="unix:///run/containerd/s/e9100432818e646424806e69d4eabec5126b8f058f2d7c1c14c44433cf4ef6d1" protocol=ttrpc version=3 Sep 12 17:43:18.795153 
containerd[1566]: time="2025-09-12T17:43:18.795127398Z" level=info msg="CreateContainer within sandbox \"a6313224f49dacb79e45b38180b09d94e738d70a6bc28334ba882dee171cb546\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7e1fc083637676a9b587d52ef80cb4c859de49e72d50660af0f995d3f774bc15\"" Sep 12 17:43:18.795730 containerd[1566]: time="2025-09-12T17:43:18.795695272Z" level=info msg="Container 88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:18.796062 containerd[1566]: time="2025-09-12T17:43:18.796041374Z" level=info msg="StartContainer for \"7e1fc083637676a9b587d52ef80cb4c859de49e72d50660af0f995d3f774bc15\"" Sep 12 17:43:18.798056 containerd[1566]: time="2025-09-12T17:43:18.798037537Z" level=info msg="connecting to shim 7e1fc083637676a9b587d52ef80cb4c859de49e72d50660af0f995d3f774bc15" address="unix:///run/containerd/s/dda3b94f36fdbff6b2b8722621956f017290987fc545cf008075073ed573eb0e" protocol=ttrpc version=3 Sep 12 17:43:18.801788 containerd[1566]: time="2025-09-12T17:43:18.801766927Z" level=info msg="CreateContainer within sandbox \"d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85\"" Sep 12 17:43:18.803159 containerd[1566]: time="2025-09-12T17:43:18.803136574Z" level=info msg="StartContainer for \"88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85\"" Sep 12 17:43:18.804050 containerd[1566]: time="2025-09-12T17:43:18.804028657Z" level=info msg="connecting to shim 88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85" address="unix:///run/containerd/s/3b94ea257c106b0d94ec53d6deff60afb7805199cf5c97401c8578283753c782" protocol=ttrpc version=3 Sep 12 17:43:18.815144 systemd[1]: Started cri-containerd-225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2.scope - libcontainer container 225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2. Sep 12 17:43:18.825213 systemd[1]: Started cri-containerd-7e1fc083637676a9b587d52ef80cb4c859de49e72d50660af0f995d3f774bc15.scope - libcontainer container 7e1fc083637676a9b587d52ef80cb4c859de49e72d50660af0f995d3f774bc15. Sep 12 17:43:18.839455 systemd[1]: Started cri-containerd-88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85.scope - libcontainer container 88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85. 
Sep 12 17:43:18.893885 kubelet[2398]: W0912 17:43:18.893605 2398 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://95.216.139.29:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-1-0-d-1f6ac31256&limit=500&resourceVersion=0": dial tcp 95.216.139.29:6443: connect: connection refused Sep 12 17:43:18.893885 kubelet[2398]: E0912 17:43:18.893851 2398 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://95.216.139.29:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-1-0-d-1f6ac31256&limit=500&resourceVersion=0\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:43:18.916847 containerd[1566]: time="2025-09-12T17:43:18.916787730Z" level=info msg="StartContainer for \"225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2\" returns successfully" Sep 12 17:43:18.941963 containerd[1566]: time="2025-09-12T17:43:18.941927564Z" level=info msg="StartContainer for \"88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85\" returns successfully" Sep 12 17:43:18.952389 containerd[1566]: time="2025-09-12T17:43:18.952354279Z" level=info msg="StartContainer for \"7e1fc083637676a9b587d52ef80cb4c859de49e72d50660af0f995d3f774bc15\" returns successfully" Sep 12 17:43:19.049472 kubelet[2398]: W0912 17:43:19.049418 2398 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://95.216.139.29:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 95.216.139.29:6443: connect: connection refused Sep 12 17:43:19.049657 kubelet[2398]: E0912 17:43:19.049483 2398 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://95.216.139.29:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 17:43:19.062774 kubelet[2398]: E0912 17:43:19.062745 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:19.068717 kubelet[2398]: E0912 17:43:19.068664 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:19.068978 kubelet[2398]: E0912 17:43:19.068962 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:19.139299 kubelet[2398]: W0912 17:43:19.139236 2398 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://95.216.139.29:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 95.216.139.29:6443: connect: connection refused Sep 12 17:43:19.139524 kubelet[2398]: E0912 17:43:19.139484 2398 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://95.216.139.29:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 95.216.139.29:6443: connect: connection refused" logger="UnhandledError" Sep 12 
17:43:19.587563 kubelet[2398]: I0912 17:43:19.587523 2398 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.071357 kubelet[2398]: E0912 17:43:20.071166 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.073296 kubelet[2398]: E0912 17:43:20.073207 2398 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.405191 kubelet[2398]: E0912 17:43:20.404801 2398 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426-1-0-d-1f6ac31256\" not found" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.480030 kubelet[2398]: I0912 17:43:20.479989 2398 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.480030 kubelet[2398]: E0912 17:43:20.480024 2398 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4426-1-0-d-1f6ac31256\": node \"ci-4426-1-0-d-1f6ac31256\" not found" Sep 12 17:43:20.495420 kubelet[2398]: E0912 17:43:20.495383 2398 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" Sep 12 17:43:20.596358 kubelet[2398]: E0912 17:43:20.596280 2398 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" Sep 12 17:43:20.697440 kubelet[2398]: E0912 17:43:20.697268 2398 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" Sep 12 17:43:20.797793 kubelet[2398]: E0912 17:43:20.797713 2398 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-1-0-d-1f6ac31256\" not found" Sep 12 17:43:20.828749 kubelet[2398]: I0912 17:43:20.828680 2398 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.836418 kubelet[2398]: E0912 17:43:20.836363 2398 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426-1-0-d-1f6ac31256\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.836418 kubelet[2398]: I0912 17:43:20.836399 2398 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.839574 kubelet[2398]: E0912 17:43:20.839460 2398 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.839574 kubelet[2398]: I0912 17:43:20.839538 2398 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:20.842276 kubelet[2398]: E0912 17:43:20.842184 2398 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:21.011166 kubelet[2398]: I0912 17:43:21.011045 2398 apiserver.go:52] 
"Watching apiserver" Sep 12 17:43:21.029465 kubelet[2398]: I0912 17:43:21.029424 2398 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:43:21.072245 kubelet[2398]: I0912 17:43:21.072127 2398 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:21.074009 kubelet[2398]: E0912 17:43:21.073930 2398 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426-1-0-d-1f6ac31256\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:22.578314 systemd[1]: Reload requested from client PID 2672 ('systemctl') (unit session-7.scope)... Sep 12 17:43:22.578331 systemd[1]: Reloading... Sep 12 17:43:22.676840 zram_generator::config[2725]: No configuration found. Sep 12 17:43:22.850854 systemd[1]: Reloading finished in 272 ms. Sep 12 17:43:22.879328 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:43:22.893080 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 17:43:22.893267 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:43:22.893311 systemd[1]: kubelet.service: Consumed 700ms CPU time, 129.1M memory peak. Sep 12 17:43:22.895931 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 17:43:22.999377 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 17:43:23.010133 (kubelet)[2767]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 17:43:23.058177 kubelet[2767]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:43:23.058177 kubelet[2767]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 12 17:43:23.058177 kubelet[2767]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 17:43:23.058512 kubelet[2767]: I0912 17:43:23.058217 2767 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 17:43:23.063871 kubelet[2767]: I0912 17:43:23.063839 2767 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 12 17:43:23.063871 kubelet[2767]: I0912 17:43:23.063859 2767 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 17:43:23.065301 kubelet[2767]: I0912 17:43:23.064478 2767 server.go:954] "Client rotation is on, will bootstrap in background" Sep 12 17:43:23.067875 kubelet[2767]: I0912 17:43:23.067860 2767 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 12 17:43:23.070342 kubelet[2767]: I0912 17:43:23.070310 2767 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 17:43:23.074343 kubelet[2767]: I0912 17:43:23.074307 2767 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 17:43:23.077290 kubelet[2767]: I0912 17:43:23.077267 2767 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 17:43:23.077508 kubelet[2767]: I0912 17:43:23.077475 2767 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 17:43:23.077645 kubelet[2767]: I0912 17:43:23.077501 2767 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-1-0-d-1f6ac31256","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 17:43:23.077725 kubelet[2767]: I0912 17:43:23.077647 2767 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 17:43:23.077725 kubelet[2767]: I0912 17:43:23.077655 2767 container_manager_linux.go:304] "Creating device plugin manager" Sep 12 17:43:23.077725 kubelet[2767]: I0912 17:43:23.077690 2767 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:43:23.077846 kubelet[2767]: I0912 17:43:23.077798 2767 kubelet.go:446] "Attempting to sync node with API server" Sep 12 17:43:23.077846 kubelet[2767]: I0912 17:43:23.077836 2767 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 17:43:23.077927 kubelet[2767]: I0912 17:43:23.077854 2767 kubelet.go:352] "Adding apiserver pod source" Sep 12 17:43:23.077927 kubelet[2767]: I0912 17:43:23.077864 2767 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 17:43:23.088454 kubelet[2767]: I0912 17:43:23.088414 2767 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 17:43:23.089006 kubelet[2767]: I0912 17:43:23.088795 2767 kubelet.go:890] "Not starting 
ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 17:43:23.089423 kubelet[2767]: I0912 17:43:23.089371 2767 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 12 17:43:23.089423 kubelet[2767]: I0912 17:43:23.089397 2767 server.go:1287] "Started kubelet" Sep 12 17:43:23.090303 kubelet[2767]: I0912 17:43:23.090280 2767 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 17:43:23.091612 kubelet[2767]: I0912 17:43:23.091602 2767 server.go:479] "Adding debug handlers to kubelet server" Sep 12 17:43:23.093425 kubelet[2767]: I0912 17:43:23.093362 2767 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 17:43:23.094638 kubelet[2767]: I0912 17:43:23.093983 2767 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 17:43:23.099335 kubelet[2767]: I0912 17:43:23.098356 2767 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 17:43:23.100543 kubelet[2767]: I0912 17:43:23.098480 2767 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 17:43:23.100631 kubelet[2767]: I0912 17:43:23.100610 2767 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 12 17:43:23.100941 kubelet[2767]: I0912 17:43:23.100929 2767 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 12 17:43:23.101108 kubelet[2767]: I0912 17:43:23.101009 2767 reconciler.go:26] "Reconciler: start to sync state" Sep 12 17:43:23.103443 kubelet[2767]: I0912 17:43:23.103360 2767 factory.go:221] Registration of the systemd container factory successfully Sep 12 17:43:23.103489 kubelet[2767]: I0912 17:43:23.103460 2767 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 17:43:23.104860 kubelet[2767]: E0912 17:43:23.104796 2767 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 17:43:23.105946 kubelet[2767]: I0912 17:43:23.105919 2767 factory.go:221] Registration of the containerd container factory successfully Sep 12 17:43:23.109530 kubelet[2767]: I0912 17:43:23.109420 2767 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 17:43:23.110285 kubelet[2767]: I0912 17:43:23.110273 2767 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 17:43:23.110355 kubelet[2767]: I0912 17:43:23.110347 2767 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 12 17:43:23.110421 kubelet[2767]: I0912 17:43:23.110409 2767 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
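
The NodeConfig dump above carries the node's hard-eviction thresholds (memory.available 100Mi, nodefs.available 10%, nodefs.inodesFree 5%, imagefs.available 15%, imagefs.inodesFree 5%) together with the systemd cgroup driver on cgroup v2. The sketch below restates those settings with the k8s.io/kubelet/config/v1beta1 types, assuming that module is available; it illustrates the field mapping only and is not the config file this node actually runs with.

package main

import (
	"fmt"

	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
)

func main() {
	cfg := kubeletv1beta1.KubeletConfiguration{
		// Matches "Using cgroup driver setting received from the CRI runtime".
		CgroupDriver: "systemd",
		// The HardEvictionThresholds listed in the NodeConfig dump.
		EvictionHard: map[string]string{
			"memory.available":   "100Mi",
			"nodefs.available":   "10%",
			"nodefs.inodesFree":  "5%",
			"imagefs.available":  "15%",
			"imagefs.inodesFree": "5%",
		},
	}
	fmt.Printf("%+v\n", cfg)
}
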
Sep 12 17:43:23.110470 kubelet[2767]: I0912 17:43:23.110464 2767 kubelet.go:2382] "Starting kubelet main sync loop" Sep 12 17:43:23.110573 kubelet[2767]: E0912 17:43:23.110559 2767 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 17:43:23.159547 kubelet[2767]: I0912 17:43:23.159524 2767 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 12 17:43:23.159863 kubelet[2767]: I0912 17:43:23.159681 2767 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 12 17:43:23.159863 kubelet[2767]: I0912 17:43:23.159699 2767 state_mem.go:36] "Initialized new in-memory state store" Sep 12 17:43:23.160057 kubelet[2767]: I0912 17:43:23.160045 2767 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 17:43:23.160137 kubelet[2767]: I0912 17:43:23.160116 2767 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 17:43:23.160190 kubelet[2767]: I0912 17:43:23.160184 2767 policy_none.go:49] "None policy: Start" Sep 12 17:43:23.160866 kubelet[2767]: I0912 17:43:23.160855 2767 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 12 17:43:23.160929 kubelet[2767]: I0912 17:43:23.160921 2767 state_mem.go:35] "Initializing new in-memory state store" Sep 12 17:43:23.161072 kubelet[2767]: I0912 17:43:23.161063 2767 state_mem.go:75] "Updated machine memory state" Sep 12 17:43:23.166113 kubelet[2767]: I0912 17:43:23.166089 2767 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 17:43:23.166256 kubelet[2767]: I0912 17:43:23.166227 2767 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 17:43:23.166293 kubelet[2767]: I0912 17:43:23.166248 2767 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 17:43:23.166972 kubelet[2767]: I0912 17:43:23.166660 2767 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 17:43:23.169258 kubelet[2767]: E0912 17:43:23.169243 2767 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 12 17:43:23.212087 kubelet[2767]: I0912 17:43:23.212059 2767 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.213548 kubelet[2767]: I0912 17:43:23.213449 2767 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.214124 kubelet[2767]: I0912 17:43:23.213705 2767 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.272541 kubelet[2767]: I0912 17:43:23.272492 2767 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.280135 kubelet[2767]: I0912 17:43:23.279357 2767 kubelet_node_status.go:124] "Node was previously registered" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.280135 kubelet[2767]: I0912 17:43:23.279420 2767 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402370 kubelet[2767]: I0912 17:43:23.402245 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/40343e56910b922452cd39e383691eaf-k8s-certs\") pod \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" (UID: \"40343e56910b922452cd39e383691eaf\") " pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402370 kubelet[2767]: I0912 17:43:23.402295 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-ca-certs\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402370 kubelet[2767]: I0912 17:43:23.402325 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-k8s-certs\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402370 kubelet[2767]: I0912 17:43:23.402349 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-kubeconfig\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402370 kubelet[2767]: I0912 17:43:23.402371 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/87a2cd2967fee7bb49b24c9454693116-kubeconfig\") pod \"kube-scheduler-ci-4426-1-0-d-1f6ac31256\" (UID: \"87a2cd2967fee7bb49b24c9454693116\") " pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402597 kubelet[2767]: I0912 17:43:23.402397 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/40343e56910b922452cd39e383691eaf-ca-certs\") pod \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" (UID: \"40343e56910b922452cd39e383691eaf\") " 
pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402597 kubelet[2767]: I0912 17:43:23.402421 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/40343e56910b922452cd39e383691eaf-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" (UID: \"40343e56910b922452cd39e383691eaf\") " pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402597 kubelet[2767]: I0912 17:43:23.402445 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:23.402597 kubelet[2767]: I0912 17:43:23.402469 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/13902fa3928ab2b3ade0357f1fa5d06c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" (UID: \"13902fa3928ab2b3ade0357f1fa5d06c\") " pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:24.079974 kubelet[2767]: I0912 17:43:24.079939 2767 apiserver.go:52] "Watching apiserver" Sep 12 17:43:24.101714 kubelet[2767]: I0912 17:43:24.101654 2767 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 12 17:43:24.139644 kubelet[2767]: I0912 17:43:24.139593 2767 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:24.140377 kubelet[2767]: I0912 17:43:24.139947 2767 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:24.140377 kubelet[2767]: I0912 17:43:24.139998 2767 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:24.149586 kubelet[2767]: E0912 17:43:24.149449 2767 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426-1-0-d-1f6ac31256\" already exists" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:24.150326 kubelet[2767]: E0912 17:43:24.149893 2767 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426-1-0-d-1f6ac31256\" already exists" pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:24.150693 kubelet[2767]: E0912 17:43:24.150019 2767 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426-1-0-d-1f6ac31256\" already exists" pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" Sep 12 17:43:24.174638 kubelet[2767]: I0912 17:43:24.174588 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426-1-0-d-1f6ac31256" podStartSLOduration=1.174570195 podStartE2EDuration="1.174570195s" podCreationTimestamp="2025-09-12 17:43:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:43:24.16622204 +0000 UTC m=+1.150705261" watchObservedRunningTime="2025-09-12 17:43:24.174570195 +0000 UTC m=+1.159053406" Sep 12 17:43:24.184066 
kubelet[2767]: I0912 17:43:24.184005 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426-1-0-d-1f6ac31256" podStartSLOduration=1.183990055 podStartE2EDuration="1.183990055s" podCreationTimestamp="2025-09-12 17:43:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:43:24.175334815 +0000 UTC m=+1.159818026" watchObservedRunningTime="2025-09-12 17:43:24.183990055 +0000 UTC m=+1.168473266" Sep 12 17:43:24.184361 kubelet[2767]: I0912 17:43:24.184087 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426-1-0-d-1f6ac31256" podStartSLOduration=1.184081421 podStartE2EDuration="1.184081421s" podCreationTimestamp="2025-09-12 17:43:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:43:24.183850387 +0000 UTC m=+1.168333588" watchObservedRunningTime="2025-09-12 17:43:24.184081421 +0000 UTC m=+1.168564622" Sep 12 17:43:27.522989 kubelet[2767]: I0912 17:43:27.522803 2767 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 17:43:27.524831 containerd[1566]: time="2025-09-12T17:43:27.524260863Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 17:43:27.525771 kubelet[2767]: I0912 17:43:27.525725 2767 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 17:43:27.558409 update_engine[1541]: I20250912 17:43:27.558290 1541 update_attempter.cc:509] Updating boot flags... Sep 12 17:43:28.518224 systemd[1]: Created slice kubepods-besteffort-pod23b0ca60_31bf_430a_a943_2260b9f6aed8.slice - libcontainer container kubepods-besteffort-pod23b0ca60_31bf_430a_a943_2260b9f6aed8.slice. 
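
"Updating runtime config through cri with podcidr" records the kubelet pushing the node's pod CIDR (192.168.0.0/24) to containerd over the CRI UpdateRuntimeConfig call; containerd then waits for a CNI configuration to appear, which is why it logs "No cni config template is specified, wait for other system components to drop the config." A sketch of the request shape using the k8s.io/cri-api v1 types follows; it only constructs the message and does not make a live call.

package main

import (
	"fmt"

	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// The payload the kubelet sends once the node object gains a PodCIDR.
	req := &runtimeapi.UpdateRuntimeConfigRequest{
		RuntimeConfig: &runtimeapi.RuntimeConfig{
			NetworkConfig: &runtimeapi.NetworkConfig{PodCidr: "192.168.0.0/24"},
		},
	}
	fmt.Println(req)
}
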
Sep 12 17:43:28.536444 kubelet[2767]: I0912 17:43:28.536391 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/23b0ca60-31bf-430a-a943-2260b9f6aed8-kube-proxy\") pod \"kube-proxy-sl84d\" (UID: \"23b0ca60-31bf-430a-a943-2260b9f6aed8\") " pod="kube-system/kube-proxy-sl84d" Sep 12 17:43:28.536444 kubelet[2767]: I0912 17:43:28.536455 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/23b0ca60-31bf-430a-a943-2260b9f6aed8-xtables-lock\") pod \"kube-proxy-sl84d\" (UID: \"23b0ca60-31bf-430a-a943-2260b9f6aed8\") " pod="kube-system/kube-proxy-sl84d" Sep 12 17:43:28.536975 kubelet[2767]: I0912 17:43:28.536481 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhmlq\" (UniqueName: \"kubernetes.io/projected/23b0ca60-31bf-430a-a943-2260b9f6aed8-kube-api-access-xhmlq\") pod \"kube-proxy-sl84d\" (UID: \"23b0ca60-31bf-430a-a943-2260b9f6aed8\") " pod="kube-system/kube-proxy-sl84d" Sep 12 17:43:28.536975 kubelet[2767]: I0912 17:43:28.536509 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/23b0ca60-31bf-430a-a943-2260b9f6aed8-lib-modules\") pod \"kube-proxy-sl84d\" (UID: \"23b0ca60-31bf-430a-a943-2260b9f6aed8\") " pod="kube-system/kube-proxy-sl84d" Sep 12 17:43:28.645565 systemd[1]: Created slice kubepods-besteffort-pod01f83c32_c576_4b16_95a5_2db6421a2974.slice - libcontainer container kubepods-besteffort-pod01f83c32_c576_4b16_95a5_2db6421a2974.slice. Sep 12 17:43:28.738198 kubelet[2767]: I0912 17:43:28.738109 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/01f83c32-c576-4b16-95a5-2db6421a2974-var-lib-calico\") pod \"tigera-operator-755d956888-8vvcv\" (UID: \"01f83c32-c576-4b16-95a5-2db6421a2974\") " pod="tigera-operator/tigera-operator-755d956888-8vvcv" Sep 12 17:43:28.738198 kubelet[2767]: I0912 17:43:28.738153 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkcxr\" (UniqueName: \"kubernetes.io/projected/01f83c32-c576-4b16-95a5-2db6421a2974-kube-api-access-wkcxr\") pod \"tigera-operator-755d956888-8vvcv\" (UID: \"01f83c32-c576-4b16-95a5-2db6421a2974\") " pod="tigera-operator/tigera-operator-755d956888-8vvcv" Sep 12 17:43:28.827486 containerd[1566]: time="2025-09-12T17:43:28.827427126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sl84d,Uid:23b0ca60-31bf-430a-a943-2260b9f6aed8,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:28.858356 containerd[1566]: time="2025-09-12T17:43:28.857998760Z" level=info msg="connecting to shim 0829a27155980f591f45b15c259bdfad2810b143530d9cda3825595b62a23f95" address="unix:///run/containerd/s/0d86aa25fec9d9f8412d86e956d753427c9d855079fd6ea75f3297d41e9133a3" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:28.886022 systemd[1]: Started cri-containerd-0829a27155980f591f45b15c259bdfad2810b143530d9cda3825595b62a23f95.scope - libcontainer container 0829a27155980f591f45b15c259bdfad2810b143530d9cda3825595b62a23f95. 
Sep 12 17:43:28.915537 containerd[1566]: time="2025-09-12T17:43:28.915489298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-sl84d,Uid:23b0ca60-31bf-430a-a943-2260b9f6aed8,Namespace:kube-system,Attempt:0,} returns sandbox id \"0829a27155980f591f45b15c259bdfad2810b143530d9cda3825595b62a23f95\"" Sep 12 17:43:28.919922 containerd[1566]: time="2025-09-12T17:43:28.919877135Z" level=info msg="CreateContainer within sandbox \"0829a27155980f591f45b15c259bdfad2810b143530d9cda3825595b62a23f95\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 17:43:28.936489 containerd[1566]: time="2025-09-12T17:43:28.935774763Z" level=info msg="Container 09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:28.943292 containerd[1566]: time="2025-09-12T17:43:28.943251252Z" level=info msg="CreateContainer within sandbox \"0829a27155980f591f45b15c259bdfad2810b143530d9cda3825595b62a23f95\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84\"" Sep 12 17:43:28.944573 containerd[1566]: time="2025-09-12T17:43:28.944502260Z" level=info msg="StartContainer for \"09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84\"" Sep 12 17:43:28.945982 containerd[1566]: time="2025-09-12T17:43:28.945965618Z" level=info msg="connecting to shim 09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84" address="unix:///run/containerd/s/0d86aa25fec9d9f8412d86e956d753427c9d855079fd6ea75f3297d41e9133a3" protocol=ttrpc version=3 Sep 12 17:43:28.950493 containerd[1566]: time="2025-09-12T17:43:28.950450053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-8vvcv,Uid:01f83c32-c576-4b16-95a5-2db6421a2974,Namespace:tigera-operator,Attempt:0,}" Sep 12 17:43:28.965195 systemd[1]: Started cri-containerd-09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84.scope - libcontainer container 09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84. Sep 12 17:43:28.968692 containerd[1566]: time="2025-09-12T17:43:28.968643476Z" level=info msg="connecting to shim 33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce" address="unix:///run/containerd/s/e4e4f4e8503b732df4820db8cd37d04080d2a12016dd50f5843ed190a103c710" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:29.000072 systemd[1]: Started cri-containerd-33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce.scope - libcontainer container 33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce. Sep 12 17:43:29.011114 containerd[1566]: time="2025-09-12T17:43:29.011070406Z" level=info msg="StartContainer for \"09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84\" returns successfully" Sep 12 17:43:29.059435 containerd[1566]: time="2025-09-12T17:43:29.059322143Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-8vvcv,Uid:01f83c32-c576-4b16-95a5-2db6421a2974,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce\"" Sep 12 17:43:29.062493 containerd[1566]: time="2025-09-12T17:43:29.062468609Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 17:43:29.655218 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2871675995.mount: Deactivated successfully. 
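
The kube-proxy entries above trace the usual CRI lifecycle: RunPodSandbox returns a sandbox ID, CreateContainer is issued against that sandbox, and StartContainer brings the container up over the same containerd shim socket. Below is a sketch of the corresponding request messages using the k8s.io/cri-api v1 types, with the names and IDs taken from the log; the image reference is an assumption, since the log does not show it.

package main

import (
	"fmt"

	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	sandbox := &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "kube-proxy-sl84d",
				Namespace: "kube-system",
				Uid:       "23b0ca60-31bf-430a-a943-2260b9f6aed8",
				Attempt:   0,
			},
		},
	}
	container := &runtimeapi.CreateContainerRequest{
		PodSandboxId: "0829a27155980f591f45b15c259bdfad2810b143530d9cda3825595b62a23f95",
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "kube-proxy", Attempt: 0},
			// Assumed image reference for illustration only.
			Image: &runtimeapi.ImageSpec{Image: "registry.k8s.io/kube-proxy:v1.32.4"},
		},
		SandboxConfig: sandbox.Config,
	}
	start := &runtimeapi.StartContainerRequest{
		ContainerId: "09a7464ffb45153bb7462c99a15480faa3e8116d01cf9839cc218edf839c6b84",
	}
	fmt.Println(sandbox, container, start)
}
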
Sep 12 17:43:31.489354 kubelet[2767]: I0912 17:43:31.489283 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-sl84d" podStartSLOduration=3.48926472 podStartE2EDuration="3.48926472s" podCreationTimestamp="2025-09-12 17:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:43:29.166748936 +0000 UTC m=+6.151232137" watchObservedRunningTime="2025-09-12 17:43:31.48926472 +0000 UTC m=+8.473747932" Sep 12 17:43:32.654380 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1966926664.mount: Deactivated successfully. Sep 12 17:43:33.016122 containerd[1566]: time="2025-09-12T17:43:33.016007709Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:33.017274 containerd[1566]: time="2025-09-12T17:43:33.017163835Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 17:43:33.017977 containerd[1566]: time="2025-09-12T17:43:33.017950311Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:33.019632 containerd[1566]: time="2025-09-12T17:43:33.019593521Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:33.020128 containerd[1566]: time="2025-09-12T17:43:33.020100560Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 3.957458859s" Sep 12 17:43:33.020196 containerd[1566]: time="2025-09-12T17:43:33.020183738Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 17:43:33.022974 containerd[1566]: time="2025-09-12T17:43:33.022866948Z" level=info msg="CreateContainer within sandbox \"33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 17:43:33.028725 containerd[1566]: time="2025-09-12T17:43:33.028705723Z" level=info msg="Container 521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:33.040323 containerd[1566]: time="2025-09-12T17:43:33.040278024Z" level=info msg="CreateContainer within sandbox \"33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9\"" Sep 12 17:43:33.040783 containerd[1566]: time="2025-09-12T17:43:33.040747974Z" level=info msg="StartContainer for \"521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9\"" Sep 12 17:43:33.041595 containerd[1566]: time="2025-09-12T17:43:33.041509501Z" level=info msg="connecting to shim 521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9" 
address="unix:///run/containerd/s/e4e4f4e8503b732df4820db8cd37d04080d2a12016dd50f5843ed190a103c710" protocol=ttrpc version=3 Sep 12 17:43:33.058957 systemd[1]: Started cri-containerd-521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9.scope - libcontainer container 521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9. Sep 12 17:43:33.087022 containerd[1566]: time="2025-09-12T17:43:33.086970286Z" level=info msg="StartContainer for \"521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9\" returns successfully" Sep 12 17:43:33.267416 kubelet[2767]: I0912 17:43:33.266476 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-8vvcv" podStartSLOduration=1.306615141 podStartE2EDuration="5.266461767s" podCreationTimestamp="2025-09-12 17:43:28 +0000 UTC" firstStartedPulling="2025-09-12 17:43:29.061013605 +0000 UTC m=+6.045496797" lastFinishedPulling="2025-09-12 17:43:33.020860222 +0000 UTC m=+10.005343423" observedRunningTime="2025-09-12 17:43:33.170529986 +0000 UTC m=+10.155013197" watchObservedRunningTime="2025-09-12 17:43:33.266461767 +0000 UTC m=+10.250944978" Sep 12 17:43:38.941070 sudo[1830]: pam_unix(sudo:session): session closed for user root Sep 12 17:43:39.115296 sshd[1829]: Connection closed by 139.178.68.195 port 41382 Sep 12 17:43:39.116259 sshd-session[1826]: pam_unix(sshd:session): session closed for user core Sep 12 17:43:39.119853 systemd[1]: sshd@6-95.216.139.29:22-139.178.68.195:41382.service: Deactivated successfully. Sep 12 17:43:39.121694 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 17:43:39.123456 systemd[1]: session-7.scope: Consumed 3.825s CPU time, 159.5M memory peak. Sep 12 17:43:39.125859 systemd-logind[1540]: Session 7 logged out. Waiting for processes to exit. Sep 12 17:43:39.127330 systemd-logind[1540]: Removed session 7. Sep 12 17:43:41.700277 systemd[1]: Created slice kubepods-besteffort-podda842344_095f_41df_8a48_a8d3eff02854.slice - libcontainer container kubepods-besteffort-podda842344_095f_41df_8a48_a8d3eff02854.slice. Sep 12 17:43:41.718210 kubelet[2767]: I0912 17:43:41.718119 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/da842344-095f-41df-8a48-a8d3eff02854-typha-certs\") pod \"calico-typha-5c7d4df57c-4fc99\" (UID: \"da842344-095f-41df-8a48-a8d3eff02854\") " pod="calico-system/calico-typha-5c7d4df57c-4fc99" Sep 12 17:43:41.718210 kubelet[2767]: I0912 17:43:41.718152 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/da842344-095f-41df-8a48-a8d3eff02854-tigera-ca-bundle\") pod \"calico-typha-5c7d4df57c-4fc99\" (UID: \"da842344-095f-41df-8a48-a8d3eff02854\") " pod="calico-system/calico-typha-5c7d4df57c-4fc99" Sep 12 17:43:41.718210 kubelet[2767]: I0912 17:43:41.718168 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4hk\" (UniqueName: \"kubernetes.io/projected/da842344-095f-41df-8a48-a8d3eff02854-kube-api-access-9v4hk\") pod \"calico-typha-5c7d4df57c-4fc99\" (UID: \"da842344-095f-41df-8a48-a8d3eff02854\") " pod="calico-system/calico-typha-5c7d4df57c-4fc99" Sep 12 17:43:41.920353 systemd[1]: Created slice kubepods-besteffort-pod9bf893d8_f89b_4a74_b69e_89a1b24f9a0b.slice - libcontainer container kubepods-besteffort-pod9bf893d8_f89b_4a74_b69e_89a1b24f9a0b.slice. 
Sep 12 17:43:41.921256 kubelet[2767]: I0912 17:43:41.921231 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-cni-log-dir\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921425 kubelet[2767]: I0912 17:43:41.921268 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-flexvol-driver-host\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921425 kubelet[2767]: I0912 17:43:41.921285 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-xtables-lock\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921425 kubelet[2767]: I0912 17:43:41.921299 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw2jh\" (UniqueName: \"kubernetes.io/projected/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-kube-api-access-jw2jh\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921425 kubelet[2767]: I0912 17:43:41.921312 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-policysync\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921425 kubelet[2767]: I0912 17:43:41.921325 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-lib-modules\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921523 kubelet[2767]: I0912 17:43:41.921336 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-node-certs\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921523 kubelet[2767]: I0912 17:43:41.921347 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-tigera-ca-bundle\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921523 kubelet[2767]: I0912 17:43:41.921360 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-cni-net-dir\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921523 kubelet[2767]: I0912 17:43:41.921371 2767 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-var-run-calico\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921523 kubelet[2767]: I0912 17:43:41.921384 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-cni-bin-dir\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:41.921607 kubelet[2767]: I0912 17:43:41.921397 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9bf893d8-f89b-4a74-b69e-89a1b24f9a0b-var-lib-calico\") pod \"calico-node-99z7m\" (UID: \"9bf893d8-f89b-4a74-b69e-89a1b24f9a0b\") " pod="calico-system/calico-node-99z7m" Sep 12 17:43:42.003643 containerd[1566]: time="2025-09-12T17:43:42.003536976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5c7d4df57c-4fc99,Uid:da842344-095f-41df-8a48-a8d3eff02854,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:42.046893 kubelet[2767]: E0912 17:43:42.046774 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.046893 kubelet[2767]: W0912 17:43:42.046803 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.047468 kubelet[2767]: E0912 17:43:42.047425 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.053465 containerd[1566]: time="2025-09-12T17:43:42.053060703Z" level=info msg="connecting to shim b0b2b8e9c937991ad63763db9f7517dcc2b7ad5813ac210552c6a99b5b25a8f6" address="unix:///run/containerd/s/c5efa94d479a0d516aa6ea839d35b7be97d9526d48c17d358299d8a3dca7b7ff" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:42.054030 kubelet[2767]: E0912 17:43:42.053894 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.054030 kubelet[2767]: W0912 17:43:42.053910 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.054030 kubelet[2767]: E0912 17:43:42.053928 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.105443 systemd[1]: Started cri-containerd-b0b2b8e9c937991ad63763db9f7517dcc2b7ad5813ac210552c6a99b5b25a8f6.scope - libcontainer container b0b2b8e9c937991ad63763db9f7517dcc2b7ad5813ac210552c6a99b5b25a8f6. 
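
The driver-call.go errors that begin above and repeat for every probe pass come from the FlexVolume plugin prober: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not present, so the "init" call captures empty output, and unmarshalling an empty string as JSON produces exactly the "unexpected end of JSON input" error that is then logged each time. A minimal reproduction of that error:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Empty driver output, as when the flexvolume binary is missing.
	var status map[string]interface{}
	err := json.Unmarshal([]byte(""), &status)
	fmt.Println(err) // unexpected end of JSON input
}
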
Sep 12 17:43:42.175382 containerd[1566]: time="2025-09-12T17:43:42.175340485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5c7d4df57c-4fc99,Uid:da842344-095f-41df-8a48-a8d3eff02854,Namespace:calico-system,Attempt:0,} returns sandbox id \"b0b2b8e9c937991ad63763db9f7517dcc2b7ad5813ac210552c6a99b5b25a8f6\"" Sep 12 17:43:42.177618 containerd[1566]: time="2025-09-12T17:43:42.177244024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 17:43:42.224249 containerd[1566]: time="2025-09-12T17:43:42.224108653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-99z7m,Uid:9bf893d8-f89b-4a74-b69e-89a1b24f9a0b,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:42.229824 kubelet[2767]: E0912 17:43:42.229689 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rjfp6" podUID="f8cbe4d2-087a-4a06-9eb4-75af1bfa61da" Sep 12 17:43:42.247528 containerd[1566]: time="2025-09-12T17:43:42.247370422Z" level=info msg="connecting to shim cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487" address="unix:///run/containerd/s/add124bbb502391e80416b2022dff57345f4936653f59434d38f62aec714f705" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:43:42.272963 systemd[1]: Started cri-containerd-cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487.scope - libcontainer container cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487. Sep 12 17:43:42.311585 containerd[1566]: time="2025-09-12T17:43:42.311531563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-99z7m,Uid:9bf893d8-f89b-4a74-b69e-89a1b24f9a0b,Namespace:calico-system,Attempt:0,} returns sandbox id \"cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487\"" Sep 12 17:43:42.317243 kubelet[2767]: E0912 17:43:42.317225 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.317474 kubelet[2767]: W0912 17:43:42.317383 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.317474 kubelet[2767]: E0912 17:43:42.317405 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.317822 kubelet[2767]: E0912 17:43:42.317797 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.317977 kubelet[2767]: W0912 17:43:42.317890 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.317977 kubelet[2767]: E0912 17:43:42.317906 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.318245 kubelet[2767]: E0912 17:43:42.318173 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.318245 kubelet[2767]: W0912 17:43:42.318181 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.318245 kubelet[2767]: E0912 17:43:42.318190 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.325348 kubelet[2767]: E0912 17:43:42.325278 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.325348 kubelet[2767]: W0912 17:43:42.325293 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.325348 kubelet[2767]: E0912 17:43:42.325305 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.325951 kubelet[2767]: E0912 17:43:42.325931 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.325951 kubelet[2767]: W0912 17:43:42.325945 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.326021 kubelet[2767]: E0912 17:43:42.325957 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.326021 kubelet[2767]: I0912 17:43:42.325976 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8cbe4d2-087a-4a06-9eb4-75af1bfa61da-kubelet-dir\") pod \"csi-node-driver-rjfp6\" (UID: \"f8cbe4d2-087a-4a06-9eb4-75af1bfa61da\") " pod="calico-system/csi-node-driver-rjfp6" Sep 12 17:43:42.326265 kubelet[2767]: E0912 17:43:42.326249 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.326265 kubelet[2767]: W0912 17:43:42.326261 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.326319 kubelet[2767]: E0912 17:43:42.326280 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.326505 kubelet[2767]: E0912 17:43:42.326487 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.326505 kubelet[2767]: W0912 17:43:42.326499 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.326559 kubelet[2767]: E0912 17:43:42.326518 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.326770 kubelet[2767]: E0912 17:43:42.326751 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.326770 kubelet[2767]: W0912 17:43:42.326764 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.326844 kubelet[2767]: E0912 17:43:42.326772 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.327110 kubelet[2767]: E0912 17:43:42.327092 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.327110 kubelet[2767]: W0912 17:43:42.327104 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.327193 kubelet[2767]: E0912 17:43:42.327112 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.327438 kubelet[2767]: E0912 17:43:42.327423 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.327438 kubelet[2767]: W0912 17:43:42.327437 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.327488 kubelet[2767]: E0912 17:43:42.327445 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.327748 kubelet[2767]: E0912 17:43:42.327729 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.327748 kubelet[2767]: W0912 17:43:42.327743 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.327854 kubelet[2767]: E0912 17:43:42.327752 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.328004 kubelet[2767]: E0912 17:43:42.327968 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.328004 kubelet[2767]: W0912 17:43:42.327981 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.328004 kubelet[2767]: E0912 17:43:42.327989 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.328408 kubelet[2767]: E0912 17:43:42.328381 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.328408 kubelet[2767]: W0912 17:43:42.328393 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.328408 kubelet[2767]: E0912 17:43:42.328401 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.328861 kubelet[2767]: E0912 17:43:42.328842 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.328861 kubelet[2767]: W0912 17:43:42.328855 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.328934 kubelet[2767]: E0912 17:43:42.328863 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.329492 kubelet[2767]: E0912 17:43:42.329449 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.329492 kubelet[2767]: W0912 17:43:42.329462 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.329492 kubelet[2767]: E0912 17:43:42.329471 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.329686 kubelet[2767]: E0912 17:43:42.329650 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.329686 kubelet[2767]: W0912 17:43:42.329658 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.329686 kubelet[2767]: E0912 17:43:42.329665 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.329787 kubelet[2767]: E0912 17:43:42.329769 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.329787 kubelet[2767]: W0912 17:43:42.329781 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.329874 kubelet[2767]: E0912 17:43:42.329788 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.329928 kubelet[2767]: E0912 17:43:42.329908 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.329960 kubelet[2767]: W0912 17:43:42.329938 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.329960 kubelet[2767]: E0912 17:43:42.329945 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.330395 kubelet[2767]: E0912 17:43:42.330377 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.330395 kubelet[2767]: W0912 17:43:42.330390 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.330460 kubelet[2767]: E0912 17:43:42.330397 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.330713 kubelet[2767]: E0912 17:43:42.330695 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.330713 kubelet[2767]: W0912 17:43:42.330708 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.330773 kubelet[2767]: E0912 17:43:42.330715 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.331097 kubelet[2767]: E0912 17:43:42.331078 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.331097 kubelet[2767]: W0912 17:43:42.331092 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.331167 kubelet[2767]: E0912 17:43:42.331100 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.331579 kubelet[2767]: E0912 17:43:42.331561 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.331579 kubelet[2767]: W0912 17:43:42.331575 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.331648 kubelet[2767]: E0912 17:43:42.331583 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.331755 kubelet[2767]: E0912 17:43:42.331737 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.331755 kubelet[2767]: W0912 17:43:42.331751 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.331822 kubelet[2767]: E0912 17:43:42.331758 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.427277 kubelet[2767]: E0912 17:43:42.427232 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.427277 kubelet[2767]: W0912 17:43:42.427254 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.427277 kubelet[2767]: E0912 17:43:42.427274 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.427448 kubelet[2767]: I0912 17:43:42.427314 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f8cbe4d2-087a-4a06-9eb4-75af1bfa61da-varrun\") pod \"csi-node-driver-rjfp6\" (UID: \"f8cbe4d2-087a-4a06-9eb4-75af1bfa61da\") " pod="calico-system/csi-node-driver-rjfp6" Sep 12 17:43:42.427532 kubelet[2767]: E0912 17:43:42.427511 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.427532 kubelet[2767]: W0912 17:43:42.427526 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.427628 kubelet[2767]: E0912 17:43:42.427552 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.427628 kubelet[2767]: I0912 17:43:42.427569 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8cbe4d2-087a-4a06-9eb4-75af1bfa61da-registration-dir\") pod \"csi-node-driver-rjfp6\" (UID: \"f8cbe4d2-087a-4a06-9eb4-75af1bfa61da\") " pod="calico-system/csi-node-driver-rjfp6" Sep 12 17:43:42.427770 kubelet[2767]: E0912 17:43:42.427752 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.427770 kubelet[2767]: W0912 17:43:42.427764 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.428011 kubelet[2767]: E0912 17:43:42.427782 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.428011 kubelet[2767]: I0912 17:43:42.427797 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pp4b\" (UniqueName: \"kubernetes.io/projected/f8cbe4d2-087a-4a06-9eb4-75af1bfa61da-kube-api-access-6pp4b\") pod \"csi-node-driver-rjfp6\" (UID: \"f8cbe4d2-087a-4a06-9eb4-75af1bfa61da\") " pod="calico-system/csi-node-driver-rjfp6" Sep 12 17:43:42.428133 kubelet[2767]: E0912 17:43:42.428121 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.428274 kubelet[2767]: W0912 17:43:42.428181 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.428274 kubelet[2767]: E0912 17:43:42.428205 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.428378 kubelet[2767]: E0912 17:43:42.428367 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.428433 kubelet[2767]: W0912 17:43:42.428423 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.428577 kubelet[2767]: E0912 17:43:42.428513 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.428763 kubelet[2767]: E0912 17:43:42.428752 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.428843 kubelet[2767]: W0912 17:43:42.428832 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.429499 kubelet[2767]: E0912 17:43:42.429387 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.429594 kubelet[2767]: E0912 17:43:42.429583 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.429652 kubelet[2767]: W0912 17:43:42.429643 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.429856 kubelet[2767]: E0912 17:43:42.429842 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.430203 kubelet[2767]: E0912 17:43:42.430073 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.430203 kubelet[2767]: W0912 17:43:42.430131 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.430470 kubelet[2767]: E0912 17:43:42.430365 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.431233 kubelet[2767]: E0912 17:43:42.431212 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.431233 kubelet[2767]: W0912 17:43:42.431225 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.431365 kubelet[2767]: E0912 17:43:42.431240 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.431407 kubelet[2767]: E0912 17:43:42.431392 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.431407 kubelet[2767]: W0912 17:43:42.431404 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.431453 kubelet[2767]: E0912 17:43:42.431425 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.431610 kubelet[2767]: E0912 17:43:42.431592 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.431610 kubelet[2767]: W0912 17:43:42.431604 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.431670 kubelet[2767]: E0912 17:43:42.431617 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.431670 kubelet[2767]: I0912 17:43:42.431635 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8cbe4d2-087a-4a06-9eb4-75af1bfa61da-socket-dir\") pod \"csi-node-driver-rjfp6\" (UID: \"f8cbe4d2-087a-4a06-9eb4-75af1bfa61da\") " pod="calico-system/csi-node-driver-rjfp6" Sep 12 17:43:42.431831 kubelet[2767]: E0912 17:43:42.431791 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.431831 kubelet[2767]: W0912 17:43:42.431820 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.431831 kubelet[2767]: E0912 17:43:42.431889 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.432097 kubelet[2767]: E0912 17:43:42.432077 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.432097 kubelet[2767]: W0912 17:43:42.432090 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.432206 kubelet[2767]: E0912 17:43:42.432137 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.432614 kubelet[2767]: E0912 17:43:42.432595 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.432614 kubelet[2767]: W0912 17:43:42.432608 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.432686 kubelet[2767]: E0912 17:43:42.432617 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.432896 kubelet[2767]: E0912 17:43:42.432795 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.432896 kubelet[2767]: W0912 17:43:42.432838 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.432896 kubelet[2767]: E0912 17:43:42.432846 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.433326 kubelet[2767]: E0912 17:43:42.433304 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.433326 kubelet[2767]: W0912 17:43:42.433320 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.433389 kubelet[2767]: E0912 17:43:42.433330 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.433563 kubelet[2767]: E0912 17:43:42.433523 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.433563 kubelet[2767]: W0912 17:43:42.433535 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.433563 kubelet[2767]: E0912 17:43:42.433544 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.533681 kubelet[2767]: E0912 17:43:42.532684 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.533681 kubelet[2767]: W0912 17:43:42.532849 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.533681 kubelet[2767]: E0912 17:43:42.532872 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.534193 kubelet[2767]: E0912 17:43:42.534156 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.534547 kubelet[2767]: W0912 17:43:42.534169 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.534547 kubelet[2767]: E0912 17:43:42.534297 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.535876 kubelet[2767]: E0912 17:43:42.535855 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.535876 kubelet[2767]: W0912 17:43:42.535869 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.535963 kubelet[2767]: E0912 17:43:42.535886 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.536067 kubelet[2767]: E0912 17:43:42.536049 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.536067 kubelet[2767]: W0912 17:43:42.536061 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.536133 kubelet[2767]: E0912 17:43:42.536112 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.536291 kubelet[2767]: E0912 17:43:42.536273 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.536291 kubelet[2767]: W0912 17:43:42.536285 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.536399 kubelet[2767]: E0912 17:43:42.536373 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.536481 kubelet[2767]: E0912 17:43:42.536460 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.536481 kubelet[2767]: W0912 17:43:42.536472 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.536590 kubelet[2767]: E0912 17:43:42.536564 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.536626 kubelet[2767]: E0912 17:43:42.536613 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.536626 kubelet[2767]: W0912 17:43:42.536619 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.536672 kubelet[2767]: E0912 17:43:42.536629 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.537412 kubelet[2767]: E0912 17:43:42.537392 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.537412 kubelet[2767]: W0912 17:43:42.537405 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.537412 kubelet[2767]: E0912 17:43:42.537413 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.537583 kubelet[2767]: E0912 17:43:42.537558 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.537583 kubelet[2767]: W0912 17:43:42.537575 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.537750 kubelet[2767]: E0912 17:43:42.537591 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.537750 kubelet[2767]: E0912 17:43:42.537727 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.537750 kubelet[2767]: W0912 17:43:42.537735 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.537750 kubelet[2767]: E0912 17:43:42.537743 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.537974 kubelet[2767]: E0912 17:43:42.537960 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.537974 kubelet[2767]: W0912 17:43:42.537968 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.538117 kubelet[2767]: E0912 17:43:42.538075 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.538239 kubelet[2767]: E0912 17:43:42.538216 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.538239 kubelet[2767]: W0912 17:43:42.538232 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.538344 kubelet[2767]: E0912 17:43:42.538326 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.538446 kubelet[2767]: E0912 17:43:42.538435 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.538446 kubelet[2767]: W0912 17:43:42.538445 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.538596 kubelet[2767]: E0912 17:43:42.538574 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.538596 kubelet[2767]: W0912 17:43:42.538587 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.538596 kubelet[2767]: E0912 17:43:42.538596 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.538665 kubelet[2767]: E0912 17:43:42.538658 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.538843 kubelet[2767]: E0912 17:43:42.538791 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.538843 kubelet[2767]: W0912 17:43:42.538804 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.538843 kubelet[2767]: E0912 17:43:42.538827 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.539181 kubelet[2767]: E0912 17:43:42.539030 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.539181 kubelet[2767]: W0912 17:43:42.539040 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.539181 kubelet[2767]: E0912 17:43:42.539048 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.539370 kubelet[2767]: E0912 17:43:42.539358 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.539448 kubelet[2767]: W0912 17:43:42.539438 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.539631 kubelet[2767]: E0912 17:43:42.539603 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:42.539871 kubelet[2767]: E0912 17:43:42.539849 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.539871 kubelet[2767]: W0912 17:43:42.539860 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.540132 kubelet[2767]: E0912 17:43:42.540108 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.540347 kubelet[2767]: E0912 17:43:42.540261 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.540347 kubelet[2767]: W0912 17:43:42.540271 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.540347 kubelet[2767]: E0912 17:43:42.540280 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.540593 kubelet[2767]: E0912 17:43:42.540577 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.540713 kubelet[2767]: W0912 17:43:42.540670 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.540713 kubelet[2767]: E0912 17:43:42.540689 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:42.546913 kubelet[2767]: E0912 17:43:42.546890 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:42.546971 kubelet[2767]: W0912 17:43:42.546923 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:42.546971 kubelet[2767]: E0912 17:43:42.546932 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:44.111887 kubelet[2767]: E0912 17:43:44.111790 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rjfp6" podUID="f8cbe4d2-087a-4a06-9eb4-75af1bfa61da" Sep 12 17:43:44.285066 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2427280826.mount: Deactivated successfully. 
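The repeated driver-call.go and plugins.go entries above all describe one condition: the kubelet's FlexVolume prober sees the plugin directory nodeagent~uds under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, tries to execute the uds binary there with the init argument, finds no such executable, and then attempts to unmarshal the resulting empty output as JSON, which is what surfaces as "unexpected end of JSON input". The short Python sketch below is a rough approximation of that probe path, not the kubelet's actual Go code; the directory path matches the one in the log, while the function name and error handling are illustrative only.

import json
import subprocess

PLUGIN_DIR = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds"
DRIVER = PLUGIN_DIR + "/uds"  # the log shows this executable is still missing at this point

def probe_flexvolume_driver(driver_path):
    """Call "<driver> init" and parse its stdout as JSON, roughly like kubelet's driver-call."""
    try:
        result = subprocess.run([driver_path, "init"], capture_output=True, text=True)
        output = result.stdout
    except FileNotFoundError:
        # Corresponds to the W driver-call.go:149 lines: executable file not found
        output = ""
    # json.loads("") fails, the Python counterpart of Go's "unexpected end of JSON input"
    return json.loads(output)

if __name__ == "__main__":
    try:
        print(probe_flexvolume_driver(DRIVER))
    except json.JSONDecodeError as err:
        print("failed to unmarshal init output:", err)

Running this sketch on a node in the state logged above would raise a JSONDecodeError, the Python analogue of the error recorded at driver-call.go:262, which is why the missing binary shows up as a JSON failure rather than only as a lookup failure.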
Sep 12 17:43:45.256858 containerd[1566]: time="2025-09-12T17:43:45.256798353Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:45.257778 containerd[1566]: time="2025-09-12T17:43:45.257618249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 12 17:43:45.258510 containerd[1566]: time="2025-09-12T17:43:45.258485071Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:45.262677 containerd[1566]: time="2025-09-12T17:43:45.262646908Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:45.263598 containerd[1566]: time="2025-09-12T17:43:45.263572875Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 3.086288305s" Sep 12 17:43:45.263633 containerd[1566]: time="2025-09-12T17:43:45.263604060Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 12 17:43:45.266836 containerd[1566]: time="2025-09-12T17:43:45.266669547Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 17:43:45.279162 containerd[1566]: time="2025-09-12T17:43:45.279130646Z" level=info msg="CreateContainer within sandbox \"b0b2b8e9c937991ad63763db9f7517dcc2b7ad5813ac210552c6a99b5b25a8f6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 17:43:45.287268 containerd[1566]: time="2025-09-12T17:43:45.287230826Z" level=info msg="Container d3fc76697cf0390d33a39aac51602647df8de9d6f001f383317b8674407d4c94: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:45.292036 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3129862827.mount: Deactivated successfully. Sep 12 17:43:45.296757 containerd[1566]: time="2025-09-12T17:43:45.296709332Z" level=info msg="CreateContainer within sandbox \"b0b2b8e9c937991ad63763db9f7517dcc2b7ad5813ac210552c6a99b5b25a8f6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d3fc76697cf0390d33a39aac51602647df8de9d6f001f383317b8674407d4c94\"" Sep 12 17:43:45.297751 containerd[1566]: time="2025-09-12T17:43:45.297729293Z" level=info msg="StartContainer for \"d3fc76697cf0390d33a39aac51602647df8de9d6f001f383317b8674407d4c94\"" Sep 12 17:43:45.299443 containerd[1566]: time="2025-09-12T17:43:45.298886381Z" level=info msg="connecting to shim d3fc76697cf0390d33a39aac51602647df8de9d6f001f383317b8674407d4c94" address="unix:///run/containerd/s/c5efa94d479a0d516aa6ea839d35b7be97d9526d48c17d358299d8a3dca7b7ff" protocol=ttrpc version=3 Sep 12 17:43:45.321960 systemd[1]: Started cri-containerd-d3fc76697cf0390d33a39aac51602647df8de9d6f001f383317b8674407d4c94.scope - libcontainer container d3fc76697cf0390d33a39aac51602647df8de9d6f001f383317b8674407d4c94. 
Sep 12 17:43:45.369390 containerd[1566]: time="2025-09-12T17:43:45.369345594Z" level=info msg="StartContainer for \"d3fc76697cf0390d33a39aac51602647df8de9d6f001f383317b8674407d4c94\" returns successfully" Sep 12 17:43:46.111685 kubelet[2767]: E0912 17:43:46.111623 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rjfp6" podUID="f8cbe4d2-087a-4a06-9eb4-75af1bfa61da" Sep 12 17:43:46.255857 kubelet[2767]: E0912 17:43:46.255804 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.255857 kubelet[2767]: W0912 17:43:46.255846 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.255857 kubelet[2767]: E0912 17:43:46.255866 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.256076 kubelet[2767]: E0912 17:43:46.256060 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.256076 kubelet[2767]: W0912 17:43:46.256074 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.256131 kubelet[2767]: E0912 17:43:46.256104 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.256345 kubelet[2767]: E0912 17:43:46.256320 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.256345 kubelet[2767]: W0912 17:43:46.256335 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.256345 kubelet[2767]: E0912 17:43:46.256345 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.256616 kubelet[2767]: E0912 17:43:46.256593 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.256616 kubelet[2767]: W0912 17:43:46.256608 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.256616 kubelet[2767]: E0912 17:43:46.256617 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:46.256826 kubelet[2767]: E0912 17:43:46.256778 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.256826 kubelet[2767]: W0912 17:43:46.256790 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.256826 kubelet[2767]: E0912 17:43:46.256803 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.257009 kubelet[2767]: E0912 17:43:46.256973 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.257009 kubelet[2767]: W0912 17:43:46.256981 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.257009 kubelet[2767]: E0912 17:43:46.256989 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.257208 kubelet[2767]: E0912 17:43:46.257105 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.257208 kubelet[2767]: W0912 17:43:46.257113 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.257208 kubelet[2767]: E0912 17:43:46.257120 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.257411 kubelet[2767]: E0912 17:43:46.257245 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.257411 kubelet[2767]: W0912 17:43:46.257253 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.257411 kubelet[2767]: E0912 17:43:46.257260 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.257411 kubelet[2767]: E0912 17:43:46.257394 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.257411 kubelet[2767]: W0912 17:43:46.257401 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.257411 kubelet[2767]: E0912 17:43:46.257408 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:46.257542 kubelet[2767]: E0912 17:43:46.257514 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.257542 kubelet[2767]: W0912 17:43:46.257521 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.257542 kubelet[2767]: E0912 17:43:46.257529 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.257654 kubelet[2767]: E0912 17:43:46.257639 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.257654 kubelet[2767]: W0912 17:43:46.257649 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.257654 kubelet[2767]: E0912 17:43:46.257656 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.257890 kubelet[2767]: E0912 17:43:46.257783 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.257890 kubelet[2767]: W0912 17:43:46.257790 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.257890 kubelet[2767]: E0912 17:43:46.257798 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.258072 kubelet[2767]: E0912 17:43:46.257959 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.258072 kubelet[2767]: W0912 17:43:46.257967 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.258072 kubelet[2767]: E0912 17:43:46.257974 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.258170 kubelet[2767]: E0912 17:43:46.258095 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.258170 kubelet[2767]: W0912 17:43:46.258103 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.258170 kubelet[2767]: E0912 17:43:46.258111 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:46.258258 kubelet[2767]: E0912 17:43:46.258225 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.258258 kubelet[2767]: W0912 17:43:46.258233 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.258258 kubelet[2767]: E0912 17:43:46.258239 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.262652 kubelet[2767]: E0912 17:43:46.262629 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.262652 kubelet[2767]: W0912 17:43:46.262643 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.262652 kubelet[2767]: E0912 17:43:46.262652 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.262910 kubelet[2767]: E0912 17:43:46.262880 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.262910 kubelet[2767]: W0912 17:43:46.262893 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.263006 kubelet[2767]: E0912 17:43:46.262930 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.263141 kubelet[2767]: E0912 17:43:46.263102 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.263141 kubelet[2767]: W0912 17:43:46.263116 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.263141 kubelet[2767]: E0912 17:43:46.263137 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.263373 kubelet[2767]: E0912 17:43:46.263353 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.263373 kubelet[2767]: W0912 17:43:46.263369 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.263506 kubelet[2767]: E0912 17:43:46.263443 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:46.263677 kubelet[2767]: E0912 17:43:46.263641 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.263677 kubelet[2767]: W0912 17:43:46.263656 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.263677 kubelet[2767]: E0912 17:43:46.263677 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.263946 kubelet[2767]: E0912 17:43:46.263859 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.263946 kubelet[2767]: W0912 17:43:46.263872 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.263946 kubelet[2767]: E0912 17:43:46.263894 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.264090 kubelet[2767]: E0912 17:43:46.264074 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.264090 kubelet[2767]: W0912 17:43:46.264086 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.264183 kubelet[2767]: E0912 17:43:46.264122 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.264488 kubelet[2767]: E0912 17:43:46.264464 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.264488 kubelet[2767]: W0912 17:43:46.264477 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.264766 kubelet[2767]: E0912 17:43:46.264568 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.264766 kubelet[2767]: E0912 17:43:46.264654 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.264766 kubelet[2767]: W0912 17:43:46.264660 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.264766 kubelet[2767]: E0912 17:43:46.264753 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:46.265560 kubelet[2767]: E0912 17:43:46.264802 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.265560 kubelet[2767]: W0912 17:43:46.264834 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.265560 kubelet[2767]: E0912 17:43:46.264853 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.265560 kubelet[2767]: E0912 17:43:46.265001 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.265560 kubelet[2767]: W0912 17:43:46.265009 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.265560 kubelet[2767]: E0912 17:43:46.265029 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.265560 kubelet[2767]: E0912 17:43:46.265258 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.265560 kubelet[2767]: W0912 17:43:46.265269 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.265560 kubelet[2767]: E0912 17:43:46.265290 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.266187 kubelet[2767]: E0912 17:43:46.265892 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.266187 kubelet[2767]: W0912 17:43:46.265902 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.266187 kubelet[2767]: E0912 17:43:46.265932 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.266562 kubelet[2767]: E0912 17:43:46.266298 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.266562 kubelet[2767]: W0912 17:43:46.266328 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.266562 kubelet[2767]: E0912 17:43:46.266365 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:46.266859 kubelet[2767]: E0912 17:43:46.266790 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.266859 kubelet[2767]: W0912 17:43:46.266834 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.267028 kubelet[2767]: E0912 17:43:46.266883 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.267272 kubelet[2767]: E0912 17:43:46.267187 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.267272 kubelet[2767]: W0912 17:43:46.267200 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.267359 kubelet[2767]: E0912 17:43:46.267268 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.267470 kubelet[2767]: E0912 17:43:46.267454 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.267554 kubelet[2767]: W0912 17:43:46.267520 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.267554 kubelet[2767]: E0912 17:43:46.267533 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:46.267898 kubelet[2767]: E0912 17:43:46.267865 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:46.267898 kubelet[2767]: W0912 17:43:46.267887 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:46.267898 kubelet[2767]: E0912 17:43:46.267902 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:47.179756 containerd[1566]: time="2025-09-12T17:43:47.179691126Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:47.180421 containerd[1566]: time="2025-09-12T17:43:47.180300140Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 17:43:47.181342 containerd[1566]: time="2025-09-12T17:43:47.181152114Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:47.183716 containerd[1566]: time="2025-09-12T17:43:47.183665500Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:47.184403 containerd[1566]: time="2025-09-12T17:43:47.184134466Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.917438004s" Sep 12 17:43:47.184403 containerd[1566]: time="2025-09-12T17:43:47.184167203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 17:43:47.186632 containerd[1566]: time="2025-09-12T17:43:47.186611998Z" level=info msg="CreateContainer within sandbox \"cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 17:43:47.198030 kubelet[2767]: I0912 17:43:47.197991 2767 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:43:47.209445 containerd[1566]: time="2025-09-12T17:43:47.209121002Z" level=info msg="Container eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:47.224903 containerd[1566]: time="2025-09-12T17:43:47.224858542Z" level=info msg="CreateContainer within sandbox \"cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944\"" Sep 12 17:43:47.225839 containerd[1566]: time="2025-09-12T17:43:47.225434398Z" level=info msg="StartContainer for \"eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944\"" Sep 12 17:43:47.226937 containerd[1566]: time="2025-09-12T17:43:47.226894295Z" level=info msg="connecting to shim eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944" address="unix:///run/containerd/s/add124bbb502391e80416b2022dff57345f4936653f59434d38f62aec714f705" protocol=ttrpc version=3 Sep 12 17:43:47.255975 systemd[1]: Started cri-containerd-eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944.scope - libcontainer container eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944. 
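If the flexvol-driver container started here behaves as Calico's pod2daemon-flexvol image is documented to, it copies the uds binary into the nodeagent~uds directory that the prober has been checking, after which an init call should return a small JSON status instead of empty output. The stub below is a hypothetical stand-in for such a driver, showing only the shape of the response the kubelet expects; the capability values are illustrative and are not taken from this log.

#!/usr/bin/env python3
# Hypothetical stand-in for a FlexVolume driver's init handler. It prints the kind of
# JSON status the kubelet prober expects; the capability values are illustrative.
import json
import sys

def main():
    if len(sys.argv) > 1 and sys.argv[1] == "init":
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
    else:
        print(json.dumps({"status": "Not supported"}))

if __name__ == "__main__":
    main()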
Sep 12 17:43:47.266481 kubelet[2767]: E0912 17:43:47.266406 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.266888 kubelet[2767]: W0912 17:43:47.266627 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.266888 kubelet[2767]: E0912 17:43:47.266651 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.267620 kubelet[2767]: E0912 17:43:47.267523 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.267620 kubelet[2767]: W0912 17:43:47.267536 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.267620 kubelet[2767]: E0912 17:43:47.267549 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.267788 kubelet[2767]: E0912 17:43:47.267777 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.267861 kubelet[2767]: W0912 17:43:47.267851 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.267965 kubelet[2767]: E0912 17:43:47.267915 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.268152 kubelet[2767]: E0912 17:43:47.268141 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.268284 kubelet[2767]: W0912 17:43:47.268205 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.268284 kubelet[2767]: E0912 17:43:47.268218 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.268418 kubelet[2767]: E0912 17:43:47.268409 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.268480 kubelet[2767]: W0912 17:43:47.268471 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.268536 kubelet[2767]: E0912 17:43:47.268527 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:47.268759 kubelet[2767]: E0912 17:43:47.268687 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.268759 kubelet[2767]: W0912 17:43:47.268698 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.268759 kubelet[2767]: E0912 17:43:47.268706 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.269090 kubelet[2767]: E0912 17:43:47.268988 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.269090 kubelet[2767]: W0912 17:43:47.268999 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.269090 kubelet[2767]: E0912 17:43:47.269008 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.269290 kubelet[2767]: E0912 17:43:47.269251 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.269290 kubelet[2767]: W0912 17:43:47.269261 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.269453 kubelet[2767]: E0912 17:43:47.269369 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.269833 kubelet[2767]: E0912 17:43:47.269770 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.269833 kubelet[2767]: W0912 17:43:47.269793 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.269968 kubelet[2767]: E0912 17:43:47.269935 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.270520 kubelet[2767]: E0912 17:43:47.270495 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.270520 kubelet[2767]: W0912 17:43:47.270515 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.271140 kubelet[2767]: E0912 17:43:47.270582 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:47.272013 kubelet[2767]: E0912 17:43:47.271986 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.272013 kubelet[2767]: W0912 17:43:47.272001 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.272013 kubelet[2767]: E0912 17:43:47.272013 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.272249 kubelet[2767]: E0912 17:43:47.272222 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.272249 kubelet[2767]: W0912 17:43:47.272236 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.272249 kubelet[2767]: E0912 17:43:47.272245 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.272542 kubelet[2767]: E0912 17:43:47.272518 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.272542 kubelet[2767]: W0912 17:43:47.272531 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.272542 kubelet[2767]: E0912 17:43:47.272539 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.272715 kubelet[2767]: E0912 17:43:47.272691 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.272715 kubelet[2767]: W0912 17:43:47.272704 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.272715 kubelet[2767]: E0912 17:43:47.272712 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.272959 kubelet[2767]: E0912 17:43:47.272916 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.272959 kubelet[2767]: W0912 17:43:47.272930 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.272959 kubelet[2767]: E0912 17:43:47.272939 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:47.273193 kubelet[2767]: E0912 17:43:47.273169 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.273193 kubelet[2767]: W0912 17:43:47.273182 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.273193 kubelet[2767]: E0912 17:43:47.273190 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.273624 kubelet[2767]: E0912 17:43:47.273598 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.273624 kubelet[2767]: W0912 17:43:47.273612 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.273624 kubelet[2767]: E0912 17:43:47.273625 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.274420 kubelet[2767]: E0912 17:43:47.274394 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.274478 kubelet[2767]: W0912 17:43:47.274425 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.274478 kubelet[2767]: E0912 17:43:47.274440 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.274723 kubelet[2767]: E0912 17:43:47.274694 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.274723 kubelet[2767]: W0912 17:43:47.274706 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.274723 kubelet[2767]: E0912 17:43:47.274720 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.275174 kubelet[2767]: E0912 17:43:47.275150 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.275174 kubelet[2767]: W0912 17:43:47.275175 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.275285 kubelet[2767]: E0912 17:43:47.275194 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:47.276028 kubelet[2767]: E0912 17:43:47.276002 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.276028 kubelet[2767]: W0912 17:43:47.276018 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.276205 kubelet[2767]: E0912 17:43:47.276082 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.276419 kubelet[2767]: E0912 17:43:47.276382 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.276531 kubelet[2767]: W0912 17:43:47.276503 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.276613 kubelet[2767]: E0912 17:43:47.276590 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.277178 kubelet[2767]: E0912 17:43:47.277038 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.277178 kubelet[2767]: W0912 17:43:47.277049 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.277280 kubelet[2767]: E0912 17:43:47.277259 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.277857 kubelet[2767]: E0912 17:43:47.277797 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.278020 kubelet[2767]: W0912 17:43:47.277980 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.278258 kubelet[2767]: E0912 17:43:47.278205 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.279000 kubelet[2767]: E0912 17:43:47.278975 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.279000 kubelet[2767]: W0912 17:43:47.278989 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.279096 kubelet[2767]: E0912 17:43:47.279025 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:47.279338 kubelet[2767]: E0912 17:43:47.279325 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.279492 kubelet[2767]: W0912 17:43:47.279406 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.279492 kubelet[2767]: E0912 17:43:47.279441 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.279715 kubelet[2767]: E0912 17:43:47.279704 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.279855 kubelet[2767]: W0912 17:43:47.279749 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.279855 kubelet[2767]: E0912 17:43:47.279779 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.280363 kubelet[2767]: E0912 17:43:47.280352 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.280516 kubelet[2767]: W0912 17:43:47.280413 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.280516 kubelet[2767]: E0912 17:43:47.280440 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.280720 kubelet[2767]: E0912 17:43:47.280709 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.280780 kubelet[2767]: W0912 17:43:47.280770 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.280907 kubelet[2767]: E0912 17:43:47.280877 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.281055 kubelet[2767]: E0912 17:43:47.281031 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.281055 kubelet[2767]: W0912 17:43:47.281042 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.281264 kubelet[2767]: E0912 17:43:47.281145 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 17:43:47.281389 kubelet[2767]: E0912 17:43:47.281378 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.281485 kubelet[2767]: W0912 17:43:47.281473 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.281643 kubelet[2767]: E0912 17:43:47.281553 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.281949 kubelet[2767]: E0912 17:43:47.281938 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.282086 kubelet[2767]: W0912 17:43:47.281986 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.282086 kubelet[2767]: E0912 17:43:47.281996 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.282665 kubelet[2767]: E0912 17:43:47.282653 2767 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 17:43:47.282728 kubelet[2767]: W0912 17:43:47.282718 2767 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 17:43:47.282786 kubelet[2767]: E0912 17:43:47.282777 2767 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 17:43:47.303396 containerd[1566]: time="2025-09-12T17:43:47.303361369Z" level=info msg="StartContainer for \"eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944\" returns successfully" Sep 12 17:43:47.314653 systemd[1]: cri-containerd-eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944.scope: Deactivated successfully. Sep 12 17:43:47.328709 containerd[1566]: time="2025-09-12T17:43:47.328663635Z" level=info msg="received exit event container_id:\"eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944\" id:\"eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944\" pid:3456 exited_at:{seconds:1757699027 nanos:318605051}" Sep 12 17:43:47.350657 containerd[1566]: time="2025-09-12T17:43:47.350616294Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944\" id:\"eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944\" pid:3456 exited_at:{seconds:1757699027 nanos:318605051}" Sep 12 17:43:47.367021 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-eb7089c1c8136af8d2e2f8a7c697eb3a81147704183aa1fa8c6634144f461944-rootfs.mount: Deactivated successfully. 
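The burst of kubelet errors above is one repeated triplet from the FlexVolume dynamic plugin prober: it tries to run /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init argument, the binary is not present yet, so the call produces empty output, decoding "" as JSON fails with "unexpected end of JSON input", and the nodeagent~uds plugin is skipped. That is consistent with the flexvol-driver container started just above, whose usual job is to copy the uds binary into that plugin directory; the noise stops once the binary exists. The Go sketch below is not kubelet source, just a minimal reproduction of the decode half of the failure; the driver path is taken from the log, the status struct is an assumption mirroring the FlexVolume JSON convention.

// Sketch of the probe pattern behind the errors above: exec the driver with
// "init" and JSON-decode stdout. A missing binary yields empty output, and
// json.Unmarshal of empty input returns "unexpected end of JSON input".
// (kubelet reports the exec failure separately, as seen in the log.)
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

func probe(driver string) error {
	out, execErr := exec.Command(driver, "init").CombinedOutput()
	var st driverStatus
	// Even when the exec itself failed, the (empty) output is still decoded,
	// which is where the "unexpected end of JSON input" message comes from.
	if err := json.Unmarshal(out, &st); err != nil {
		return fmt.Errorf("failed to unmarshal output %q: %v (exec error: %v)", out, err, execErr)
	}
	return nil
}

func main() {
	fmt.Println(probe("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"))
}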
Sep 12 17:43:48.111836 kubelet[2767]: E0912 17:43:48.111742 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rjfp6" podUID="f8cbe4d2-087a-4a06-9eb4-75af1bfa61da" Sep 12 17:43:48.205639 containerd[1566]: time="2025-09-12T17:43:48.205580604Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 17:43:48.224280 kubelet[2767]: I0912 17:43:48.223885 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5c7d4df57c-4fc99" podStartSLOduration=4.136003143 podStartE2EDuration="7.223871108s" podCreationTimestamp="2025-09-12 17:43:41 +0000 UTC" firstStartedPulling="2025-09-12 17:43:42.176873489 +0000 UTC m=+19.161356691" lastFinishedPulling="2025-09-12 17:43:45.264741456 +0000 UTC m=+22.249224656" observedRunningTime="2025-09-12 17:43:46.210978648 +0000 UTC m=+23.195461859" watchObservedRunningTime="2025-09-12 17:43:48.223871108 +0000 UTC m=+25.208354310" Sep 12 17:43:50.111742 kubelet[2767]: E0912 17:43:50.111677 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rjfp6" podUID="f8cbe4d2-087a-4a06-9eb4-75af1bfa61da" Sep 12 17:43:52.111822 kubelet[2767]: E0912 17:43:52.111775 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rjfp6" podUID="f8cbe4d2-087a-4a06-9eb4-75af1bfa61da" Sep 12 17:43:52.453752 containerd[1566]: time="2025-09-12T17:43:52.453550957Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:52.454561 containerd[1566]: time="2025-09-12T17:43:52.454405611Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 17:43:52.455330 containerd[1566]: time="2025-09-12T17:43:52.455304696Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:52.457018 containerd[1566]: time="2025-09-12T17:43:52.456999415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:43:52.457388 containerd[1566]: time="2025-09-12T17:43:52.457364843Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.251718465s" Sep 12 17:43:52.457425 containerd[1566]: time="2025-09-12T17:43:52.457390414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 
17:43:52.459263 containerd[1566]: time="2025-09-12T17:43:52.459219656Z" level=info msg="CreateContainer within sandbox \"cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 17:43:52.470847 containerd[1566]: time="2025-09-12T17:43:52.469458847Z" level=info msg="Container 68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:43:52.487166 containerd[1566]: time="2025-09-12T17:43:52.487135425Z" level=info msg="CreateContainer within sandbox \"cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a\"" Sep 12 17:43:52.487851 containerd[1566]: time="2025-09-12T17:43:52.487828794Z" level=info msg="StartContainer for \"68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a\"" Sep 12 17:43:52.489337 containerd[1566]: time="2025-09-12T17:43:52.489301526Z" level=info msg="connecting to shim 68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a" address="unix:///run/containerd/s/add124bbb502391e80416b2022dff57345f4936653f59434d38f62aec714f705" protocol=ttrpc version=3 Sep 12 17:43:52.509931 systemd[1]: Started cri-containerd-68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a.scope - libcontainer container 68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a. Sep 12 17:43:52.550908 containerd[1566]: time="2025-09-12T17:43:52.550877596Z" level=info msg="StartContainer for \"68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a\" returns successfully" Sep 12 17:43:52.923020 systemd[1]: cri-containerd-68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a.scope: Deactivated successfully. Sep 12 17:43:52.923701 systemd[1]: cri-containerd-68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a.scope: Consumed 354ms CPU time, 167.2M memory peak, 11.1M read from disk, 171.3M written to disk. Sep 12 17:43:52.925301 containerd[1566]: time="2025-09-12T17:43:52.925222854Z" level=info msg="received exit event container_id:\"68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a\" id:\"68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a\" pid:3547 exited_at:{seconds:1757699032 nanos:924644115}" Sep 12 17:43:52.927742 containerd[1566]: time="2025-09-12T17:43:52.925738233Z" level=info msg="TaskExit event in podsandbox handler container_id:\"68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a\" id:\"68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a\" pid:3547 exited_at:{seconds:1757699032 nanos:924644115}" Sep 12 17:43:52.959795 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-68d18c625e4b70416fae97bde5f20f145e9c4d9b8666cc22ab2ca36fd873656a-rootfs.mount: Deactivated successfully. Sep 12 17:43:52.964686 kubelet[2767]: I0912 17:43:52.963364 2767 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 12 17:43:53.046623 systemd[1]: Created slice kubepods-burstable-podfd331680_dd74_42d7_aabe_3845b589bb24.slice - libcontainer container kubepods-burstable-podfd331680_dd74_42d7_aabe_3845b589bb24.slice. Sep 12 17:43:53.069407 systemd[1]: Created slice kubepods-besteffort-podfc215e0e_6746_46f5_990e_f6284f6589d7.slice - libcontainer container kubepods-besteffort-podfc215e0e_6746_46f5_990e_f6284f6589d7.slice. 
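A quick cross-check of the pod_startup_latency_tracker entry above for calico-typha-5c7d4df57c-4fc99: the E2E duration is observedRunningTime minus podCreationTimestamp, and the SLO duration appears to be that figure with the image-pull window (lastFinishedPulling minus firstStartedPulling) excluded, which matches the logged 7.223871108s and 4.136003143s to within rounding:

  podStartE2EDuration ≈ 17:43:48.223871 − 17:43:41.000000 ≈ 7.224 s
  image pull window   ≈ 17:43:45.264741 − 17:43:42.176873 ≈ 3.088 s
  podStartSLOduration ≈ 7.224 − 3.088 ≈ 4.136 s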
Sep 12 17:43:53.082336 systemd[1]: Created slice kubepods-besteffort-pod42f5189e_7e63_4633_b8a4_4e4423d9882b.slice - libcontainer container kubepods-besteffort-pod42f5189e_7e63_4633_b8a4_4e4423d9882b.slice. Sep 12 17:43:53.086998 systemd[1]: Created slice kubepods-besteffort-pod32ac1f6e_8e4c_423b_a1e9_9426f9eb5e07.slice - libcontainer container kubepods-besteffort-pod32ac1f6e_8e4c_423b_a1e9_9426f9eb5e07.slice. Sep 12 17:43:53.094840 systemd[1]: Created slice kubepods-besteffort-pod20907fe9_b0fe_40c2_a2f7_e34e4e300a17.slice - libcontainer container kubepods-besteffort-pod20907fe9_b0fe_40c2_a2f7_e34e4e300a17.slice. Sep 12 17:43:53.100337 systemd[1]: Created slice kubepods-besteffort-pod71d8700f_1b54_42a0_95f0_09edca88a31c.slice - libcontainer container kubepods-besteffort-pod71d8700f_1b54_42a0_95f0_09edca88a31c.slice. Sep 12 17:43:53.109872 systemd[1]: Created slice kubepods-burstable-pod90bfb799_6c60_4f68_b2ce_a2c4d3ee0b9c.slice - libcontainer container kubepods-burstable-pod90bfb799_6c60_4f68_b2ce_a2c4d3ee0b9c.slice. Sep 12 17:43:53.123644 kubelet[2767]: I0912 17:43:53.123508 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7cc5\" (UniqueName: \"kubernetes.io/projected/fc215e0e-6746-46f5-990e-f6284f6589d7-kube-api-access-j7cc5\") pod \"calico-apiserver-689c5fcdd8-kkmkw\" (UID: \"fc215e0e-6746-46f5-990e-f6284f6589d7\") " pod="calico-apiserver/calico-apiserver-689c5fcdd8-kkmkw" Sep 12 17:43:53.124337 kubelet[2767]: I0912 17:43:53.123981 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-backend-key-pair\") pod \"whisker-6f485d99d-fskhk\" (UID: \"71d8700f-1b54-42a0-95f0-09edca88a31c\") " pod="calico-system/whisker-6f485d99d-fskhk" Sep 12 17:43:53.124337 kubelet[2767]: I0912 17:43:53.124146 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cgvd\" (UniqueName: \"kubernetes.io/projected/71d8700f-1b54-42a0-95f0-09edca88a31c-kube-api-access-7cgvd\") pod \"whisker-6f485d99d-fskhk\" (UID: \"71d8700f-1b54-42a0-95f0-09edca88a31c\") " pod="calico-system/whisker-6f485d99d-fskhk" Sep 12 17:43:53.124337 kubelet[2767]: I0912 17:43:53.124192 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20907fe9-b0fe-40c2-a2f7-e34e4e300a17-tigera-ca-bundle\") pod \"calico-kube-controllers-59859d54f6-jtjgv\" (UID: \"20907fe9-b0fe-40c2-a2f7-e34e4e300a17\") " pod="calico-system/calico-kube-controllers-59859d54f6-jtjgv" Sep 12 17:43:53.124337 kubelet[2767]: I0912 17:43:53.124262 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rwtk\" (UniqueName: \"kubernetes.io/projected/90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c-kube-api-access-9rwtk\") pod \"coredns-668d6bf9bc-gw76d\" (UID: \"90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c\") " pod="kube-system/coredns-668d6bf9bc-gw76d" Sep 12 17:43:53.124337 kubelet[2767]: I0912 17:43:53.124277 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/42f5189e-7e63-4633-b8a4-4e4423d9882b-goldmane-key-pair\") pod \"goldmane-54d579b49d-58z84\" (UID: \"42f5189e-7e63-4633-b8a4-4e4423d9882b\") " pod="calico-system/goldmane-54d579b49d-58z84" 
Sep 12 17:43:53.124709 kubelet[2767]: I0912 17:43:53.124290 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c-config-volume\") pod \"coredns-668d6bf9bc-gw76d\" (UID: \"90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c\") " pod="kube-system/coredns-668d6bf9bc-gw76d" Sep 12 17:43:53.124709 kubelet[2767]: I0912 17:43:53.124445 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42f5189e-7e63-4633-b8a4-4e4423d9882b-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-58z84\" (UID: \"42f5189e-7e63-4633-b8a4-4e4423d9882b\") " pod="calico-system/goldmane-54d579b49d-58z84" Sep 12 17:43:53.124709 kubelet[2767]: I0912 17:43:53.124459 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksfs\" (UniqueName: \"kubernetes.io/projected/42f5189e-7e63-4633-b8a4-4e4423d9882b-kube-api-access-qksfs\") pod \"goldmane-54d579b49d-58z84\" (UID: \"42f5189e-7e63-4633-b8a4-4e4423d9882b\") " pod="calico-system/goldmane-54d579b49d-58z84" Sep 12 17:43:53.124709 kubelet[2767]: I0912 17:43:53.124608 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd331680-dd74-42d7-aabe-3845b589bb24-config-volume\") pod \"coredns-668d6bf9bc-wpwkw\" (UID: \"fd331680-dd74-42d7-aabe-3845b589bb24\") " pod="kube-system/coredns-668d6bf9bc-wpwkw" Sep 12 17:43:53.124709 kubelet[2767]: I0912 17:43:53.124629 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mgf\" (UniqueName: \"kubernetes.io/projected/fd331680-dd74-42d7-aabe-3845b589bb24-kube-api-access-x5mgf\") pod \"coredns-668d6bf9bc-wpwkw\" (UID: \"fd331680-dd74-42d7-aabe-3845b589bb24\") " pod="kube-system/coredns-668d6bf9bc-wpwkw" Sep 12 17:43:53.125219 kubelet[2767]: I0912 17:43:53.124846 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fc215e0e-6746-46f5-990e-f6284f6589d7-calico-apiserver-certs\") pod \"calico-apiserver-689c5fcdd8-kkmkw\" (UID: \"fc215e0e-6746-46f5-990e-f6284f6589d7\") " pod="calico-apiserver/calico-apiserver-689c5fcdd8-kkmkw" Sep 12 17:43:53.125219 kubelet[2767]: I0912 17:43:53.124871 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07-calico-apiserver-certs\") pod \"calico-apiserver-689c5fcdd8-vzvxw\" (UID: \"32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07\") " pod="calico-apiserver/calico-apiserver-689c5fcdd8-vzvxw" Sep 12 17:43:53.125219 kubelet[2767]: I0912 17:43:53.124897 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tkzd\" (UniqueName: \"kubernetes.io/projected/32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07-kube-api-access-9tkzd\") pod \"calico-apiserver-689c5fcdd8-vzvxw\" (UID: \"32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07\") " pod="calico-apiserver/calico-apiserver-689c5fcdd8-vzvxw" Sep 12 17:43:53.125219 kubelet[2767]: I0912 17:43:53.124913 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/42f5189e-7e63-4633-b8a4-4e4423d9882b-config\") pod \"goldmane-54d579b49d-58z84\" (UID: \"42f5189e-7e63-4633-b8a4-4e4423d9882b\") " pod="calico-system/goldmane-54d579b49d-58z84" Sep 12 17:43:53.125219 kubelet[2767]: I0912 17:43:53.124925 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4rm8\" (UniqueName: \"kubernetes.io/projected/20907fe9-b0fe-40c2-a2f7-e34e4e300a17-kube-api-access-f4rm8\") pod \"calico-kube-controllers-59859d54f6-jtjgv\" (UID: \"20907fe9-b0fe-40c2-a2f7-e34e4e300a17\") " pod="calico-system/calico-kube-controllers-59859d54f6-jtjgv" Sep 12 17:43:53.125308 kubelet[2767]: I0912 17:43:53.124939 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-ca-bundle\") pod \"whisker-6f485d99d-fskhk\" (UID: \"71d8700f-1b54-42a0-95f0-09edca88a31c\") " pod="calico-system/whisker-6f485d99d-fskhk" Sep 12 17:43:53.273489 containerd[1566]: time="2025-09-12T17:43:53.273153464Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 17:43:53.358141 containerd[1566]: time="2025-09-12T17:43:53.358066917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wpwkw,Uid:fd331680-dd74-42d7-aabe-3845b589bb24,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:53.378429 containerd[1566]: time="2025-09-12T17:43:53.378093851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-kkmkw,Uid:fc215e0e-6746-46f5-990e-f6284f6589d7,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:43:53.396024 containerd[1566]: time="2025-09-12T17:43:53.395973197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-vzvxw,Uid:32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:43:53.396546 containerd[1566]: time="2025-09-12T17:43:53.396512391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-58z84,Uid:42f5189e-7e63-4633-b8a4-4e4423d9882b,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:53.400385 containerd[1566]: time="2025-09-12T17:43:53.400362874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59859d54f6-jtjgv,Uid:20907fe9-b0fe-40c2-a2f7-e34e4e300a17,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:53.419442 containerd[1566]: time="2025-09-12T17:43:53.419263724Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gw76d,Uid:90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c,Namespace:kube-system,Attempt:0,}" Sep 12 17:43:53.419644 containerd[1566]: time="2025-09-12T17:43:53.419605182Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f485d99d-fskhk,Uid:71d8700f-1b54-42a0-95f0-09edca88a31c,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:53.614309 containerd[1566]: time="2025-09-12T17:43:53.614260817Z" level=error msg="Failed to destroy network for sandbox \"d5af99f2a893b08d154251b68c3f787bc396847c7a81c12b5796d1ec3f35e808\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.617749 systemd[1]: run-netns-cni\x2da31bac26\x2d4f91\x2d52e2\x2d8c92\x2d19c598a38590.mount: Deactivated successfully. 
Sep 12 17:43:53.619278 containerd[1566]: time="2025-09-12T17:43:53.619244639Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-58z84,Uid:42f5189e-7e63-4633-b8a4-4e4423d9882b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5af99f2a893b08d154251b68c3f787bc396847c7a81c12b5796d1ec3f35e808\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.622361 kubelet[2767]: E0912 17:43:53.622321 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5af99f2a893b08d154251b68c3f787bc396847c7a81c12b5796d1ec3f35e808\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.622919 containerd[1566]: time="2025-09-12T17:43:53.622891423Z" level=error msg="Failed to destroy network for sandbox \"c1738d7a7e5997efc24f4de2fb528ab30aa5bdc724a8122d50544add3c08d826\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.625677 systemd[1]: run-netns-cni\x2d7812964b\x2ddc60\x2dc639\x2d03aa\x2d0a6b577c8c40.mount: Deactivated successfully. Sep 12 17:43:53.628166 containerd[1566]: time="2025-09-12T17:43:53.628130990Z" level=error msg="Failed to destroy network for sandbox \"3391b61a64063b70d0a8afde2645e420810d7540894ea50416be9d28007e2ede\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.628339 kubelet[2767]: E0912 17:43:53.628316 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5af99f2a893b08d154251b68c3f787bc396847c7a81c12b5796d1ec3f35e808\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-58z84" Sep 12 17:43:53.628412 kubelet[2767]: E0912 17:43:53.628398 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d5af99f2a893b08d154251b68c3f787bc396847c7a81c12b5796d1ec3f35e808\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-58z84" Sep 12 17:43:53.630705 systemd[1]: run-netns-cni\x2da311531c\x2dccf1\x2d5acb\x2d54a6\x2d1626d9c42f3a.mount: Deactivated successfully. 
Sep 12 17:43:53.632358 kubelet[2767]: E0912 17:43:53.632309 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-58z84_calico-system(42f5189e-7e63-4633-b8a4-4e4423d9882b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-58z84_calico-system(42f5189e-7e63-4633-b8a4-4e4423d9882b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d5af99f2a893b08d154251b68c3f787bc396847c7a81c12b5796d1ec3f35e808\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-58z84" podUID="42f5189e-7e63-4633-b8a4-4e4423d9882b" Sep 12 17:43:53.634460 containerd[1566]: time="2025-09-12T17:43:53.633711713Z" level=error msg="Failed to destroy network for sandbox \"39eca6262cd1a087e1536e538213c89e663afec7a2c4c1274f135a310adde1bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.637358 systemd[1]: run-netns-cni\x2dfff16c2d\x2db00c\x2dff9a\x2dc413\x2d4f0650ba79dd.mount: Deactivated successfully. Sep 12 17:43:53.638873 containerd[1566]: time="2025-09-12T17:43:53.638656507Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gw76d,Uid:90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1738d7a7e5997efc24f4de2fb528ab30aa5bdc724a8122d50544add3c08d826\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.638873 containerd[1566]: time="2025-09-12T17:43:53.638769615Z" level=error msg="Failed to destroy network for sandbox \"7fdd1b2195999b15f659f2c398c7b799cfbf06f7347137335ea937b0a0e0156e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.640310 kubelet[2767]: E0912 17:43:53.640066 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1738d7a7e5997efc24f4de2fb528ab30aa5bdc724a8122d50544add3c08d826\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.640310 kubelet[2767]: E0912 17:43:53.640113 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1738d7a7e5997efc24f4de2fb528ab30aa5bdc724a8122d50544add3c08d826\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gw76d" Sep 12 17:43:53.640310 kubelet[2767]: E0912 17:43:53.640129 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1738d7a7e5997efc24f4de2fb528ab30aa5bdc724a8122d50544add3c08d826\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gw76d" Sep 12 17:43:53.640416 kubelet[2767]: E0912 17:43:53.640174 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-gw76d_kube-system(90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gw76d_kube-system(90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c1738d7a7e5997efc24f4de2fb528ab30aa5bdc724a8122d50544add3c08d826\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gw76d" podUID="90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c" Sep 12 17:43:53.641882 containerd[1566]: time="2025-09-12T17:43:53.641860601Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6f485d99d-fskhk,Uid:71d8700f-1b54-42a0-95f0-09edca88a31c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3391b61a64063b70d0a8afde2645e420810d7540894ea50416be9d28007e2ede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.642259 kubelet[2767]: E0912 17:43:53.642237 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3391b61a64063b70d0a8afde2645e420810d7540894ea50416be9d28007e2ede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.642415 kubelet[2767]: E0912 17:43:53.642266 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3391b61a64063b70d0a8afde2645e420810d7540894ea50416be9d28007e2ede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f485d99d-fskhk" Sep 12 17:43:53.642415 kubelet[2767]: E0912 17:43:53.642279 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3391b61a64063b70d0a8afde2645e420810d7540894ea50416be9d28007e2ede\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6f485d99d-fskhk" Sep 12 17:43:53.642415 kubelet[2767]: E0912 17:43:53.642301 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6f485d99d-fskhk_calico-system(71d8700f-1b54-42a0-95f0-09edca88a31c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6f485d99d-fskhk_calico-system(71d8700f-1b54-42a0-95f0-09edca88a31c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3391b61a64063b70d0a8afde2645e420810d7540894ea50416be9d28007e2ede\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6f485d99d-fskhk" podUID="71d8700f-1b54-42a0-95f0-09edca88a31c" Sep 12 17:43:53.643691 containerd[1566]: time="2025-09-12T17:43:53.643481710Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wpwkw,Uid:fd331680-dd74-42d7-aabe-3845b589bb24,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"39eca6262cd1a087e1536e538213c89e663afec7a2c4c1274f135a310adde1bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.643751 kubelet[2767]: E0912 17:43:53.643699 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39eca6262cd1a087e1536e538213c89e663afec7a2c4c1274f135a310adde1bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.644832 kubelet[2767]: E0912 17:43:53.643722 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39eca6262cd1a087e1536e538213c89e663afec7a2c4c1274f135a310adde1bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wpwkw" Sep 12 17:43:53.644832 kubelet[2767]: E0912 17:43:53.644119 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39eca6262cd1a087e1536e538213c89e663afec7a2c4c1274f135a310adde1bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wpwkw" Sep 12 17:43:53.644832 kubelet[2767]: E0912 17:43:53.644235 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-wpwkw_kube-system(fd331680-dd74-42d7-aabe-3845b589bb24)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-wpwkw_kube-system(fd331680-dd74-42d7-aabe-3845b589bb24)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"39eca6262cd1a087e1536e538213c89e663afec7a2c4c1274f135a310adde1bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wpwkw" podUID="fd331680-dd74-42d7-aabe-3845b589bb24" Sep 12 17:43:53.644938 containerd[1566]: time="2025-09-12T17:43:53.644545049Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-vzvxw,Uid:32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdd1b2195999b15f659f2c398c7b799cfbf06f7347137335ea937b0a0e0156e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.644974 kubelet[2767]: E0912 17:43:53.644659 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdd1b2195999b15f659f2c398c7b799cfbf06f7347137335ea937b0a0e0156e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.644974 kubelet[2767]: E0912 17:43:53.644681 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdd1b2195999b15f659f2c398c7b799cfbf06f7347137335ea937b0a0e0156e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-689c5fcdd8-vzvxw" Sep 12 17:43:53.644974 kubelet[2767]: E0912 17:43:53.644736 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdd1b2195999b15f659f2c398c7b799cfbf06f7347137335ea937b0a0e0156e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-689c5fcdd8-vzvxw" Sep 12 17:43:53.645031 kubelet[2767]: E0912 17:43:53.644769 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-689c5fcdd8-vzvxw_calico-apiserver(32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-689c5fcdd8-vzvxw_calico-apiserver(32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7fdd1b2195999b15f659f2c398c7b799cfbf06f7347137335ea937b0a0e0156e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-689c5fcdd8-vzvxw" podUID="32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07" Sep 12 17:43:53.648119 containerd[1566]: time="2025-09-12T17:43:53.648086120Z" level=error msg="Failed to destroy network for sandbox \"c0ed83bd4cb43868769152bc5c594155cb2f1177f4607ace0d7d05c0d56758aa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.649285 containerd[1566]: time="2025-09-12T17:43:53.649034448Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-kkmkw,Uid:fc215e0e-6746-46f5-990e-f6284f6589d7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0ed83bd4cb43868769152bc5c594155cb2f1177f4607ace0d7d05c0d56758aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.649387 kubelet[2767]: E0912 17:43:53.649147 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c0ed83bd4cb43868769152bc5c594155cb2f1177f4607ace0d7d05c0d56758aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.649387 kubelet[2767]: E0912 17:43:53.649191 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0ed83bd4cb43868769152bc5c594155cb2f1177f4607ace0d7d05c0d56758aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-689c5fcdd8-kkmkw" Sep 12 17:43:53.649387 kubelet[2767]: E0912 17:43:53.649208 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0ed83bd4cb43868769152bc5c594155cb2f1177f4607ace0d7d05c0d56758aa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-689c5fcdd8-kkmkw" Sep 12 17:43:53.649481 kubelet[2767]: E0912 17:43:53.649227 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-689c5fcdd8-kkmkw_calico-apiserver(fc215e0e-6746-46f5-990e-f6284f6589d7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-689c5fcdd8-kkmkw_calico-apiserver(fc215e0e-6746-46f5-990e-f6284f6589d7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c0ed83bd4cb43868769152bc5c594155cb2f1177f4607ace0d7d05c0d56758aa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-689c5fcdd8-kkmkw" podUID="fc215e0e-6746-46f5-990e-f6284f6589d7" Sep 12 17:43:53.656300 containerd[1566]: time="2025-09-12T17:43:53.656269297Z" level=error msg="Failed to destroy network for sandbox \"32330c9f26f45b0230c86b17bbb8718919573ffe96cd6e8732602b6ed8696324\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.657265 containerd[1566]: time="2025-09-12T17:43:53.657222063Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59859d54f6-jtjgv,Uid:20907fe9-b0fe-40c2-a2f7-e34e4e300a17,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"32330c9f26f45b0230c86b17bbb8718919573ffe96cd6e8732602b6ed8696324\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.657358 kubelet[2767]: E0912 17:43:53.657343 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32330c9f26f45b0230c86b17bbb8718919573ffe96cd6e8732602b6ed8696324\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:53.657430 
kubelet[2767]: E0912 17:43:53.657396 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32330c9f26f45b0230c86b17bbb8718919573ffe96cd6e8732602b6ed8696324\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59859d54f6-jtjgv" Sep 12 17:43:53.657430 kubelet[2767]: E0912 17:43:53.657416 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32330c9f26f45b0230c86b17bbb8718919573ffe96cd6e8732602b6ed8696324\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-59859d54f6-jtjgv" Sep 12 17:43:53.658140 kubelet[2767]: E0912 17:43:53.657446 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-59859d54f6-jtjgv_calico-system(20907fe9-b0fe-40c2-a2f7-e34e4e300a17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-59859d54f6-jtjgv_calico-system(20907fe9-b0fe-40c2-a2f7-e34e4e300a17)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"32330c9f26f45b0230c86b17bbb8718919573ffe96cd6e8732602b6ed8696324\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-59859d54f6-jtjgv" podUID="20907fe9-b0fe-40c2-a2f7-e34e4e300a17" Sep 12 17:43:54.117453 systemd[1]: Created slice kubepods-besteffort-podf8cbe4d2_087a_4a06_9eb4_75af1bfa61da.slice - libcontainer container kubepods-besteffort-podf8cbe4d2_087a_4a06_9eb4_75af1bfa61da.slice. 
Sep 12 17:43:54.120301 containerd[1566]: time="2025-09-12T17:43:54.120044836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rjfp6,Uid:f8cbe4d2-087a-4a06-9eb4-75af1bfa61da,Namespace:calico-system,Attempt:0,}" Sep 12 17:43:54.160642 containerd[1566]: time="2025-09-12T17:43:54.160603351Z" level=error msg="Failed to destroy network for sandbox \"ed5f3a6c175c7d079bcfe0049be002259e68bd2c4782f93828e51ef8b13ea1ba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:54.162188 containerd[1566]: time="2025-09-12T17:43:54.162148188Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rjfp6,Uid:f8cbe4d2-087a-4a06-9eb4-75af1bfa61da,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed5f3a6c175c7d079bcfe0049be002259e68bd2c4782f93828e51ef8b13ea1ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:54.162612 kubelet[2767]: E0912 17:43:54.162570 2767 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed5f3a6c175c7d079bcfe0049be002259e68bd2c4782f93828e51ef8b13ea1ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 17:43:54.162900 kubelet[2767]: E0912 17:43:54.162627 2767 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed5f3a6c175c7d079bcfe0049be002259e68bd2c4782f93828e51ef8b13ea1ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rjfp6" Sep 12 17:43:54.162900 kubelet[2767]: E0912 17:43:54.162667 2767 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed5f3a6c175c7d079bcfe0049be002259e68bd2c4782f93828e51ef8b13ea1ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rjfp6" Sep 12 17:43:54.163527 kubelet[2767]: E0912 17:43:54.162716 2767 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rjfp6_calico-system(f8cbe4d2-087a-4a06-9eb4-75af1bfa61da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rjfp6_calico-system(f8cbe4d2-087a-4a06-9eb4-75af1bfa61da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed5f3a6c175c7d079bcfe0049be002259e68bd2c4782f93828e51ef8b13ea1ba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rjfp6" podUID="f8cbe4d2-087a-4a06-9eb4-75af1bfa61da" Sep 12 17:43:54.471586 systemd[1]: run-netns-cni\x2d06fd6a80\x2d21e5\x2d7f68\x2d309a\x2d80991eb71f63.mount: Deactivated successfully. 
Sep 12 17:43:54.471697 systemd[1]: run-netns-cni\x2d3df1ddd0\x2d831e\x2de2f6\x2dce0b\x2d480a0cf67440.mount: Deactivated successfully. Sep 12 17:43:54.471752 systemd[1]: run-netns-cni\x2d4e83fd78\x2d34c5\x2d735b\x2df1d0\x2dd55dd3fde16c.mount: Deactivated successfully. Sep 12 17:44:00.868287 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2054363901.mount: Deactivated successfully. Sep 12 17:44:01.039620 containerd[1566]: time="2025-09-12T17:44:01.039568675Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 17:44:01.113251 containerd[1566]: time="2025-09-12T17:44:01.112852598Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:01.144503 containerd[1566]: time="2025-09-12T17:44:01.144259011Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:01.145427 containerd[1566]: time="2025-09-12T17:44:01.145162933Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.871964669s" Sep 12 17:44:01.145427 containerd[1566]: time="2025-09-12T17:44:01.145230445Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 17:44:01.145427 containerd[1566]: time="2025-09-12T17:44:01.145308770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:01.180567 containerd[1566]: time="2025-09-12T17:44:01.180527544Z" level=info msg="CreateContainer within sandbox \"cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 17:44:01.248887 containerd[1566]: time="2025-09-12T17:44:01.247030981Z" level=info msg="Container 1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:01.248431 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2226821831.mount: Deactivated successfully. 
Sep 12 17:44:01.295144 containerd[1566]: time="2025-09-12T17:44:01.295076220Z" level=info msg="CreateContainer within sandbox \"cd29c3f4c41c971d46b1a803a5a884bf9cf8fba456cb75e90df88b45c78a2487\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\"" Sep 12 17:44:01.297388 containerd[1566]: time="2025-09-12T17:44:01.297235152Z" level=info msg="StartContainer for \"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\"" Sep 12 17:44:01.307967 containerd[1566]: time="2025-09-12T17:44:01.307926993Z" level=info msg="connecting to shim 1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb" address="unix:///run/containerd/s/add124bbb502391e80416b2022dff57345f4936653f59434d38f62aec714f705" protocol=ttrpc version=3 Sep 12 17:44:01.379178 systemd[1]: Started cri-containerd-1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb.scope - libcontainer container 1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb. Sep 12 17:44:01.433669 containerd[1566]: time="2025-09-12T17:44:01.433571314Z" level=info msg="StartContainer for \"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" returns successfully" Sep 12 17:44:01.511720 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 17:44:01.514843 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Sep 12 17:44:01.784211 kubelet[2767]: I0912 17:44:01.783957 2767 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-ca-bundle\") pod \"71d8700f-1b54-42a0-95f0-09edca88a31c\" (UID: \"71d8700f-1b54-42a0-95f0-09edca88a31c\") " Sep 12 17:44:01.784211 kubelet[2767]: I0912 17:44:01.784009 2767 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cgvd\" (UniqueName: \"kubernetes.io/projected/71d8700f-1b54-42a0-95f0-09edca88a31c-kube-api-access-7cgvd\") pod \"71d8700f-1b54-42a0-95f0-09edca88a31c\" (UID: \"71d8700f-1b54-42a0-95f0-09edca88a31c\") " Sep 12 17:44:01.784211 kubelet[2767]: I0912 17:44:01.784032 2767 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-backend-key-pair\") pod \"71d8700f-1b54-42a0-95f0-09edca88a31c\" (UID: \"71d8700f-1b54-42a0-95f0-09edca88a31c\") " Sep 12 17:44:01.784929 kubelet[2767]: I0912 17:44:01.784898 2767 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "71d8700f-1b54-42a0-95f0-09edca88a31c" (UID: "71d8700f-1b54-42a0-95f0-09edca88a31c"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 12 17:44:01.788072 kubelet[2767]: I0912 17:44:01.787606 2767 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "71d8700f-1b54-42a0-95f0-09edca88a31c" (UID: "71d8700f-1b54-42a0-95f0-09edca88a31c"). InnerVolumeSpecName "whisker-backend-key-pair".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 12 17:44:01.788365 kubelet[2767]: I0912 17:44:01.788319 2767 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d8700f-1b54-42a0-95f0-09edca88a31c-kube-api-access-7cgvd" (OuterVolumeSpecName: "kube-api-access-7cgvd") pod "71d8700f-1b54-42a0-95f0-09edca88a31c" (UID: "71d8700f-1b54-42a0-95f0-09edca88a31c"). InnerVolumeSpecName "kube-api-access-7cgvd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 12 17:44:01.872900 systemd[1]: var-lib-kubelet-pods-71d8700f\x2d1b54\x2d42a0\x2d95f0\x2d09edca88a31c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7cgvd.mount: Deactivated successfully. Sep 12 17:44:01.872998 systemd[1]: var-lib-kubelet-pods-71d8700f\x2d1b54\x2d42a0\x2d95f0\x2d09edca88a31c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 12 17:44:01.885111 kubelet[2767]: I0912 17:44:01.885078 2767 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-backend-key-pair\") on node \"ci-4426-1-0-d-1f6ac31256\" DevicePath \"\"" Sep 12 17:44:01.885921 kubelet[2767]: I0912 17:44:01.885108 2767 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71d8700f-1b54-42a0-95f0-09edca88a31c-whisker-ca-bundle\") on node \"ci-4426-1-0-d-1f6ac31256\" DevicePath \"\"" Sep 12 17:44:01.885921 kubelet[2767]: I0912 17:44:01.885143 2767 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7cgvd\" (UniqueName: \"kubernetes.io/projected/71d8700f-1b54-42a0-95f0-09edca88a31c-kube-api-access-7cgvd\") on node \"ci-4426-1-0-d-1f6ac31256\" DevicePath \"\"" Sep 12 17:44:02.326144 systemd[1]: Removed slice kubepods-besteffort-pod71d8700f_1b54_42a0_95f0_09edca88a31c.slice - libcontainer container kubepods-besteffort-pod71d8700f_1b54_42a0_95f0_09edca88a31c.slice. Sep 12 17:44:02.360320 kubelet[2767]: I0912 17:44:02.359266 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-99z7m" podStartSLOduration=2.514033403 podStartE2EDuration="21.359242377s" podCreationTimestamp="2025-09-12 17:43:41 +0000 UTC" firstStartedPulling="2025-09-12 17:43:42.314670537 +0000 UTC m=+19.299153738" lastFinishedPulling="2025-09-12 17:44:01.159879511 +0000 UTC m=+38.144362712" observedRunningTime="2025-09-12 17:44:02.347699524 +0000 UTC m=+39.332182735" watchObservedRunningTime="2025-09-12 17:44:02.359242377 +0000 UTC m=+39.343725588" Sep 12 17:44:02.429062 systemd[1]: Created slice kubepods-besteffort-pod81821e13_7bc5_4e27_acae_edbd133dc975.slice - libcontainer container kubepods-besteffort-pod81821e13_7bc5_4e27_acae_edbd133dc975.slice. 
Sep 12 17:44:02.490445 kubelet[2767]: I0912 17:44:02.490379 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/81821e13-7bc5-4e27-acae-edbd133dc975-whisker-backend-key-pair\") pod \"whisker-7d6f486c97-5ghhc\" (UID: \"81821e13-7bc5-4e27-acae-edbd133dc975\") " pod="calico-system/whisker-7d6f486c97-5ghhc" Sep 12 17:44:02.490445 kubelet[2767]: I0912 17:44:02.490431 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7np\" (UniqueName: \"kubernetes.io/projected/81821e13-7bc5-4e27-acae-edbd133dc975-kube-api-access-5m7np\") pod \"whisker-7d6f486c97-5ghhc\" (UID: \"81821e13-7bc5-4e27-acae-edbd133dc975\") " pod="calico-system/whisker-7d6f486c97-5ghhc" Sep 12 17:44:02.490635 kubelet[2767]: I0912 17:44:02.490461 2767 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81821e13-7bc5-4e27-acae-edbd133dc975-whisker-ca-bundle\") pod \"whisker-7d6f486c97-5ghhc\" (UID: \"81821e13-7bc5-4e27-acae-edbd133dc975\") " pod="calico-system/whisker-7d6f486c97-5ghhc" Sep 12 17:44:02.736386 containerd[1566]: time="2025-09-12T17:44:02.735998809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d6f486c97-5ghhc,Uid:81821e13-7bc5-4e27-acae-edbd133dc975,Namespace:calico-system,Attempt:0,}" Sep 12 17:44:02.831466 containerd[1566]: time="2025-09-12T17:44:02.830745849Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"7c92840293259c4c9d53a6e424991b517b7e18d578a15e632db6b4071bd10a9b\" pid:3888 exit_status:1 exited_at:{seconds:1757699042 nanos:808489995}" Sep 12 17:44:02.939050 containerd[1566]: time="2025-09-12T17:44:02.938998932Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"0658dd650cf6db4e54b3eb60a84dc6f86c3fd1a988e3b8c8b42f99bc0cdc0450\" pid:3927 exit_status:1 exited_at:{seconds:1757699042 nanos:938602913}" Sep 12 17:44:03.052146 systemd-networkd[1466]: calib59ce489e1f: Link UP Sep 12 17:44:03.052295 systemd-networkd[1466]: calib59ce489e1f: Gained carrier Sep 12 17:44:03.066736 containerd[1566]: 2025-09-12 17:44:02.781 [INFO][3895] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:03.066736 containerd[1566]: 2025-09-12 17:44:02.816 [INFO][3895] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0 whisker-7d6f486c97- calico-system 81821e13-7bc5-4e27-acae-edbd133dc975 895 0 2025-09-12 17:44:02 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7d6f486c97 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 whisker-7d6f486c97-5ghhc eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calib59ce489e1f [] [] }} ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-" Sep 12 17:44:03.066736 containerd[1566]: 2025-09-12 17:44:02.816 [INFO][3895] cni-plugin/k8s.go 74: Extracted identifiers for 
CmdAddK8s ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" Sep 12 17:44:03.066736 containerd[1566]: 2025-09-12 17:44:02.982 [INFO][3911] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" HandleID="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Workload="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:02.985 [INFO][3911] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" HandleID="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Workload="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000320320), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"whisker-7d6f486c97-5ghhc", "timestamp":"2025-09-12 17:44:02.982740406 +0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:02.985 [INFO][3911] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:02.985 [INFO][3911] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:02.986 [INFO][3911] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:03.001 [INFO][3911] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:03.011 [INFO][3911] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:03.017 [INFO][3911] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:03.020 [INFO][3911] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.067882 containerd[1566]: 2025-09-12 17:44:03.023 [INFO][3911] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.068406 containerd[1566]: 2025-09-12 17:44:03.023 [INFO][3911] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.068406 containerd[1566]: 2025-09-12 17:44:03.025 [INFO][3911] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d Sep 12 17:44:03.068406 containerd[1566]: 2025-09-12 17:44:03.031 [INFO][3911] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 
handle="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.068406 containerd[1566]: 2025-09-12 17:44:03.036 [INFO][3911] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.193/26] block=192.168.3.192/26 handle="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.068406 containerd[1566]: 2025-09-12 17:44:03.036 [INFO][3911] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.193/26] handle="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:03.068406 containerd[1566]: 2025-09-12 17:44:03.036 [INFO][3911] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 17:44:03.068406 containerd[1566]: 2025-09-12 17:44:03.037 [INFO][3911] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.193/26] IPv6=[] ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" HandleID="k8s-pod-network.a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Workload="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" Sep 12 17:44:03.070134 containerd[1566]: 2025-09-12 17:44:03.039 [INFO][3895] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0", GenerateName:"whisker-7d6f486c97-", Namespace:"calico-system", SelfLink:"", UID:"81821e13-7bc5-4e27-acae-edbd133dc975", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 44, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d6f486c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"whisker-7d6f486c97-5ghhc", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.3.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib59ce489e1f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:03.070134 containerd[1566]: 2025-09-12 17:44:03.039 [INFO][3895] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.193/32] ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" Sep 12 17:44:03.070207 containerd[1566]: 2025-09-12 17:44:03.039 [INFO][3895] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib59ce489e1f 
ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" Sep 12 17:44:03.070207 containerd[1566]: 2025-09-12 17:44:03.049 [INFO][3895] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" Sep 12 17:44:03.070255 containerd[1566]: 2025-09-12 17:44:03.050 [INFO][3895] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0", GenerateName:"whisker-7d6f486c97-", Namespace:"calico-system", SelfLink:"", UID:"81821e13-7bc5-4e27-acae-edbd133dc975", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 44, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7d6f486c97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d", Pod:"whisker-7d6f486c97-5ghhc", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.3.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calib59ce489e1f", MAC:"da:d5:b6:10:d7:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:03.070304 containerd[1566]: 2025-09-12 17:44:03.061 [INFO][3895] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" Namespace="calico-system" Pod="whisker-7d6f486c97-5ghhc" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-whisker--7d6f486c97--5ghhc-eth0" Sep 12 17:44:03.118946 kubelet[2767]: I0912 17:44:03.118856 2767 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d8700f-1b54-42a0-95f0-09edca88a31c" path="/var/lib/kubelet/pods/71d8700f-1b54-42a0-95f0-09edca88a31c/volumes" Sep 12 17:44:03.221083 containerd[1566]: time="2025-09-12T17:44:03.221042703Z" level=info msg="connecting to shim a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d" address="unix:///run/containerd/s/ab09894f94c5a41d3346e0f7dd42767fd73840aa09e4a9519ecd88e0f6a1accb" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:03.271102 systemd[1]: Started cri-containerd-a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d.scope - libcontainer container 
a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d. Sep 12 17:44:03.368156 containerd[1566]: time="2025-09-12T17:44:03.367458218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7d6f486c97-5ghhc,Uid:81821e13-7bc5-4e27-acae-edbd133dc975,Namespace:calico-system,Attempt:0,} returns sandbox id \"a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d\"" Sep 12 17:44:03.370674 containerd[1566]: time="2025-09-12T17:44:03.370572800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 17:44:03.427963 containerd[1566]: time="2025-09-12T17:44:03.427922643Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"df10df1ce83d84e88a2a796e8019c82c0868f585c37cea0da04db4d3b82ab8b8\" pid:4100 exit_status:1 exited_at:{seconds:1757699043 nanos:427460667}" Sep 12 17:44:04.111770 containerd[1566]: time="2025-09-12T17:44:04.111701114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wpwkw,Uid:fd331680-dd74-42d7-aabe-3845b589bb24,Namespace:kube-system,Attempt:0,}" Sep 12 17:44:04.229852 systemd-networkd[1466]: cali29e83d02b4b: Link UP Sep 12 17:44:04.230834 systemd-networkd[1466]: cali29e83d02b4b: Gained carrier Sep 12 17:44:04.246990 containerd[1566]: 2025-09-12 17:44:04.144 [INFO][4114] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:04.246990 containerd[1566]: 2025-09-12 17:44:04.156 [INFO][4114] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0 coredns-668d6bf9bc- kube-system fd331680-dd74-42d7-aabe-3845b589bb24 818 0 2025-09-12 17:43:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 coredns-668d6bf9bc-wpwkw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali29e83d02b4b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-" Sep 12 17:44:04.246990 containerd[1566]: 2025-09-12 17:44:04.156 [INFO][4114] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" Sep 12 17:44:04.246990 containerd[1566]: 2025-09-12 17:44:04.185 [INFO][4126] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" HandleID="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Workload="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.185 [INFO][4126] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" HandleID="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Workload="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc00024f610), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"coredns-668d6bf9bc-wpwkw", "timestamp":"2025-09-12 17:44:04.185826549 +0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.186 [INFO][4126] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.186 [INFO][4126] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.186 [INFO][4126] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.193 [INFO][4126] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.198 [INFO][4126] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.204 [INFO][4126] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.206 [INFO][4126] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247174 containerd[1566]: 2025-09-12 17:44:04.208 [INFO][4126] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247374 containerd[1566]: 2025-09-12 17:44:04.208 [INFO][4126] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247374 containerd[1566]: 2025-09-12 17:44:04.210 [INFO][4126] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c Sep 12 17:44:04.247374 containerd[1566]: 2025-09-12 17:44:04.217 [INFO][4126] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 handle="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247374 containerd[1566]: 2025-09-12 17:44:04.223 [INFO][4126] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.194/26] block=192.168.3.192/26 handle="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247374 containerd[1566]: 2025-09-12 17:44:04.223 [INFO][4126] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.194/26] handle="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:04.247374 containerd[1566]: 2025-09-12 17:44:04.223 [INFO][4126] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:44:04.247374 containerd[1566]: 2025-09-12 17:44:04.223 [INFO][4126] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.194/26] IPv6=[] ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" HandleID="k8s-pod-network.d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Workload="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" Sep 12 17:44:04.247493 containerd[1566]: 2025-09-12 17:44:04.226 [INFO][4114] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fd331680-dd74-42d7-aabe-3845b589bb24", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"coredns-668d6bf9bc-wpwkw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali29e83d02b4b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:04.247493 containerd[1566]: 2025-09-12 17:44:04.226 [INFO][4114] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.194/32] ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" Sep 12 17:44:04.247493 containerd[1566]: 2025-09-12 17:44:04.226 [INFO][4114] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali29e83d02b4b ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" Sep 12 17:44:04.247493 containerd[1566]: 2025-09-12 17:44:04.231 [INFO][4114] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" Sep 12 17:44:04.247493 containerd[1566]: 2025-09-12 17:44:04.231 [INFO][4114] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"fd331680-dd74-42d7-aabe-3845b589bb24", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c", Pod:"coredns-668d6bf9bc-wpwkw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali29e83d02b4b", MAC:"52:f0:c1:c4:c2:a8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:04.247493 containerd[1566]: 2025-09-12 17:44:04.242 [INFO][4114] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" Namespace="kube-system" Pod="coredns-668d6bf9bc-wpwkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--wpwkw-eth0" Sep 12 17:44:04.268999 containerd[1566]: time="2025-09-12T17:44:04.268797564Z" level=info msg="connecting to shim d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c" address="unix:///run/containerd/s/eefe4d00dc779bacadf1b07a9b428cc8de13f3216a43f13c266fb05d392c18ad" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:04.305940 systemd[1]: Started cri-containerd-d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c.scope - libcontainer container d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c. 
Sep 12 17:44:04.379937 containerd[1566]: time="2025-09-12T17:44:04.379851483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wpwkw,Uid:fd331680-dd74-42d7-aabe-3845b589bb24,Namespace:kube-system,Attempt:0,} returns sandbox id \"d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c\"" Sep 12 17:44:04.385941 containerd[1566]: time="2025-09-12T17:44:04.384711263Z" level=info msg="CreateContainer within sandbox \"d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:44:04.405714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount229314732.mount: Deactivated successfully. Sep 12 17:44:04.407527 containerd[1566]: time="2025-09-12T17:44:04.407311758Z" level=info msg="Container df7729d21672115d8029bce2528c7710fcb11230aa2861d2cd72f73f4abc628b: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:04.413001 containerd[1566]: time="2025-09-12T17:44:04.412966025Z" level=info msg="CreateContainer within sandbox \"d7be548f81e6dae9259fd5171a2b60c65bff45030bd1dd2260c21165550dba1c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"df7729d21672115d8029bce2528c7710fcb11230aa2861d2cd72f73f4abc628b\"" Sep 12 17:44:04.414909 containerd[1566]: time="2025-09-12T17:44:04.414869414Z" level=info msg="StartContainer for \"df7729d21672115d8029bce2528c7710fcb11230aa2861d2cd72f73f4abc628b\"" Sep 12 17:44:04.416508 containerd[1566]: time="2025-09-12T17:44:04.416444359Z" level=info msg="connecting to shim df7729d21672115d8029bce2528c7710fcb11230aa2861d2cd72f73f4abc628b" address="unix:///run/containerd/s/eefe4d00dc779bacadf1b07a9b428cc8de13f3216a43f13c266fb05d392c18ad" protocol=ttrpc version=3 Sep 12 17:44:04.441133 systemd[1]: Started cri-containerd-df7729d21672115d8029bce2528c7710fcb11230aa2861d2cd72f73f4abc628b.scope - libcontainer container df7729d21672115d8029bce2528c7710fcb11230aa2861d2cd72f73f4abc628b. 
Sep 12 17:44:04.481259 containerd[1566]: time="2025-09-12T17:44:04.481187133Z" level=info msg="StartContainer for \"df7729d21672115d8029bce2528c7710fcb11230aa2861d2cd72f73f4abc628b\" returns successfully" Sep 12 17:44:04.533695 containerd[1566]: time="2025-09-12T17:44:04.533639129Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"c8d797f6cbae9d9a1fdaa4cb6862a1c4e8aef59743cffc56d413b5a4e6c2a46f\" pid:4200 exit_status:1 exited_at:{seconds:1757699044 nanos:532177064}" Sep 12 17:44:04.795946 systemd-networkd[1466]: calib59ce489e1f: Gained IPv6LL Sep 12 17:44:05.112064 containerd[1566]: time="2025-09-12T17:44:05.112024491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gw76d,Uid:90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c,Namespace:kube-system,Attempt:0,}" Sep 12 17:44:05.267931 systemd-networkd[1466]: cali135f0a86e77: Link UP Sep 12 17:44:05.268217 systemd-networkd[1466]: cali135f0a86e77: Gained carrier Sep 12 17:44:05.278574 containerd[1566]: time="2025-09-12T17:44:05.278437458Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.170 [INFO][4262] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.184 [INFO][4262] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0 coredns-668d6bf9bc- kube-system 90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c 823 0 2025-09-12 17:43:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 coredns-668d6bf9bc-gw76d eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali135f0a86e77 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.184 [INFO][4262] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.211 [INFO][4273] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" HandleID="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Workload="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.211 [INFO][4273] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" HandleID="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Workload="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024ef90), 
Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"coredns-668d6bf9bc-gw76d", "timestamp":"2025-09-12 17:44:05.211777487 +0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.212 [INFO][4273] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.212 [INFO][4273] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.212 [INFO][4273] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.220 [INFO][4273] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.226 [INFO][4273] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.232 [INFO][4273] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.235 [INFO][4273] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.240 [INFO][4273] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.240 [INFO][4273] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.242 [INFO][4273] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.247 [INFO][4273] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 handle="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.255 [INFO][4273] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.195/26] block=192.168.3.192/26 handle="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.255 [INFO][4273] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.195/26] handle="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.255 [INFO][4273] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:44:05.280976 containerd[1566]: 2025-09-12 17:44:05.255 [INFO][4273] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.195/26] IPv6=[] ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" HandleID="k8s-pod-network.91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Workload="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" Sep 12 17:44:05.282980 containerd[1566]: 2025-09-12 17:44:05.260 [INFO][4262] cni-plugin/k8s.go 418: Populated endpoint ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"coredns-668d6bf9bc-gw76d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali135f0a86e77", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:05.282980 containerd[1566]: 2025-09-12 17:44:05.262 [INFO][4262] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.195/32] ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" Sep 12 17:44:05.282980 containerd[1566]: 2025-09-12 17:44:05.263 [INFO][4262] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali135f0a86e77 ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" Sep 12 17:44:05.282980 containerd[1566]: 2025-09-12 17:44:05.267 [INFO][4262] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" Sep 12 17:44:05.282980 containerd[1566]: 2025-09-12 17:44:05.267 [INFO][4262] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb", Pod:"coredns-668d6bf9bc-gw76d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali135f0a86e77", MAC:"92:14:20:71:51:2c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:05.282980 containerd[1566]: 2025-09-12 17:44:05.274 [INFO][4262] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" Namespace="kube-system" Pod="coredns-668d6bf9bc-gw76d" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-coredns--668d6bf9bc--gw76d-eth0" Sep 12 17:44:05.282980 containerd[1566]: time="2025-09-12T17:44:05.282559160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 17:44:05.282980 containerd[1566]: time="2025-09-12T17:44:05.282612132Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:05.285572 containerd[1566]: time="2025-09-12T17:44:05.285553876Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:05.285896 containerd[1566]: time="2025-09-12T17:44:05.285880195Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.915124796s" Sep 12 17:44:05.285962 containerd[1566]: time="2025-09-12T17:44:05.285951123Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 17:44:05.288775 containerd[1566]: time="2025-09-12T17:44:05.288740690Z" level=info msg="CreateContainer within sandbox \"a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 17:44:05.301241 containerd[1566]: time="2025-09-12T17:44:05.301213946Z" level=info msg="Container 96c19b7d340f9291ede7c8bfe0711b8742a938f1cbf5250688770e31d46c2f7b: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:05.307603 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount146185955.mount: Deactivated successfully. Sep 12 17:44:05.314593 containerd[1566]: time="2025-09-12T17:44:05.314565240Z" level=info msg="CreateContainer within sandbox \"a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"96c19b7d340f9291ede7c8bfe0711b8742a938f1cbf5250688770e31d46c2f7b\"" Sep 12 17:44:05.315282 containerd[1566]: time="2025-09-12T17:44:05.315199290Z" level=info msg="connecting to shim 91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb" address="unix:///run/containerd/s/bdd640291c7f19c7fa23250f149def461086b29c0f837011e9351e4f88659523" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:05.317301 containerd[1566]: time="2025-09-12T17:44:05.315627889Z" level=info msg="StartContainer for \"96c19b7d340f9291ede7c8bfe0711b8742a938f1cbf5250688770e31d46c2f7b\"" Sep 12 17:44:05.318920 containerd[1566]: time="2025-09-12T17:44:05.318240498Z" level=info msg="connecting to shim 96c19b7d340f9291ede7c8bfe0711b8742a938f1cbf5250688770e31d46c2f7b" address="unix:///run/containerd/s/ab09894f94c5a41d3346e0f7dd42767fd73840aa09e4a9519ecd88e0f6a1accb" protocol=ttrpc version=3 Sep 12 17:44:05.350210 systemd[1]: Started cri-containerd-96c19b7d340f9291ede7c8bfe0711b8742a938f1cbf5250688770e31d46c2f7b.scope - libcontainer container 96c19b7d340f9291ede7c8bfe0711b8742a938f1cbf5250688770e31d46c2f7b. Sep 12 17:44:05.353646 systemd[1]: Started cri-containerd-91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb.scope - libcontainer container 91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb. 
Sep 12 17:44:05.372554 kubelet[2767]: I0912 17:44:05.372455 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-wpwkw" podStartSLOduration=37.372414226 podStartE2EDuration="37.372414226s" podCreationTimestamp="2025-09-12 17:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:44:05.364514145 +0000 UTC m=+42.348997346" watchObservedRunningTime="2025-09-12 17:44:05.372414226 +0000 UTC m=+42.356897427" Sep 12 17:44:05.431831 containerd[1566]: time="2025-09-12T17:44:05.431729441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gw76d,Uid:90bfb799-6c60-4f68-b2ce-a2c4d3ee0b9c,Namespace:kube-system,Attempt:0,} returns sandbox id \"91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb\"" Sep 12 17:44:05.438686 containerd[1566]: time="2025-09-12T17:44:05.438656791Z" level=info msg="CreateContainer within sandbox \"91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 17:44:05.456486 containerd[1566]: time="2025-09-12T17:44:05.456332833Z" level=info msg="Container 2f6e7e1458901da3681ea1407334d787709bb820139a3da2bdf79f1cfa50966a: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:05.458138 containerd[1566]: time="2025-09-12T17:44:05.458114729Z" level=info msg="StartContainer for \"96c19b7d340f9291ede7c8bfe0711b8742a938f1cbf5250688770e31d46c2f7b\" returns successfully" Sep 12 17:44:05.460918 containerd[1566]: time="2025-09-12T17:44:05.460897251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 17:44:05.466168 containerd[1566]: time="2025-09-12T17:44:05.466146748Z" level=info msg="CreateContainer within sandbox \"91e6d20875eddb9a62d88153f5e9aa0e09875988b24fe20b508935cd54ce4dcb\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"2f6e7e1458901da3681ea1407334d787709bb820139a3da2bdf79f1cfa50966a\"" Sep 12 17:44:05.467043 containerd[1566]: time="2025-09-12T17:44:05.466989817Z" level=info msg="StartContainer for \"2f6e7e1458901da3681ea1407334d787709bb820139a3da2bdf79f1cfa50966a\"" Sep 12 17:44:05.468549 containerd[1566]: time="2025-09-12T17:44:05.468524338Z" level=info msg="connecting to shim 2f6e7e1458901da3681ea1407334d787709bb820139a3da2bdf79f1cfa50966a" address="unix:///run/containerd/s/bdd640291c7f19c7fa23250f149def461086b29c0f837011e9351e4f88659523" protocol=ttrpc version=3 Sep 12 17:44:05.487110 systemd[1]: Started cri-containerd-2f6e7e1458901da3681ea1407334d787709bb820139a3da2bdf79f1cfa50966a.scope - libcontainer container 2f6e7e1458901da3681ea1407334d787709bb820139a3da2bdf79f1cfa50966a. 
Sep 12 17:44:05.538373 containerd[1566]: time="2025-09-12T17:44:05.538315563Z" level=info msg="StartContainer for \"2f6e7e1458901da3681ea1407334d787709bb820139a3da2bdf79f1cfa50966a\" returns successfully" Sep 12 17:44:05.563935 systemd-networkd[1466]: cali29e83d02b4b: Gained IPv6LL Sep 12 17:44:06.111623 containerd[1566]: time="2025-09-12T17:44:06.111569183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rjfp6,Uid:f8cbe4d2-087a-4a06-9eb4-75af1bfa61da,Namespace:calico-system,Attempt:0,}" Sep 12 17:44:06.111940 containerd[1566]: time="2025-09-12T17:44:06.111568182Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-58z84,Uid:42f5189e-7e63-4633-b8a4-4e4423d9882b,Namespace:calico-system,Attempt:0,}" Sep 12 17:44:06.234344 systemd-networkd[1466]: caliaf7f3079ee7: Link UP Sep 12 17:44:06.235669 systemd-networkd[1466]: caliaf7f3079ee7: Gained carrier Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.149 [INFO][4431] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.168 [INFO][4431] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0 goldmane-54d579b49d- calico-system 42f5189e-7e63-4633-b8a4-4e4423d9882b 824 0 2025-09-12 17:43:41 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 goldmane-54d579b49d-58z84 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] caliaf7f3079ee7 [] [] }} ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.168 [INFO][4431] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.197 [INFO][4454] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" HandleID="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Workload="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.197 [INFO][4454] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" HandleID="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Workload="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"goldmane-54d579b49d-58z84", "timestamp":"2025-09-12 17:44:06.197468731 +0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.197 [INFO][4454] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.198 [INFO][4454] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.198 [INFO][4454] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.204 [INFO][4454] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.208 [INFO][4454] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.213 [INFO][4454] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.214 [INFO][4454] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.216 [INFO][4454] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.217 [INFO][4454] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.219 [INFO][4454] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.222 [INFO][4454] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 handle="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.228 [INFO][4454] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.196/26] block=192.168.3.192/26 handle="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.228 [INFO][4454] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.196/26] handle="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.228 [INFO][4454] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:44:06.247997 containerd[1566]: 2025-09-12 17:44:06.228 [INFO][4454] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.196/26] IPv6=[] ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" HandleID="k8s-pod-network.80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Workload="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" Sep 12 17:44:06.249287 containerd[1566]: 2025-09-12 17:44:06.230 [INFO][4431] cni-plugin/k8s.go 418: Populated endpoint ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"42f5189e-7e63-4633-b8a4-4e4423d9882b", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"goldmane-54d579b49d-58z84", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.3.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliaf7f3079ee7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:06.249287 containerd[1566]: 2025-09-12 17:44:06.230 [INFO][4431] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.196/32] ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" Sep 12 17:44:06.249287 containerd[1566]: 2025-09-12 17:44:06.230 [INFO][4431] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaf7f3079ee7 ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" Sep 12 17:44:06.249287 containerd[1566]: 2025-09-12 17:44:06.236 [INFO][4431] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" Sep 12 17:44:06.249287 containerd[1566]: 2025-09-12 17:44:06.236 [INFO][4431] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" 
Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"42f5189e-7e63-4633-b8a4-4e4423d9882b", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b", Pod:"goldmane-54d579b49d-58z84", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.3.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliaf7f3079ee7", MAC:"46:d8:47:98:83:05", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:06.249287 containerd[1566]: 2025-09-12 17:44:06.245 [INFO][4431] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" Namespace="calico-system" Pod="goldmane-54d579b49d-58z84" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-goldmane--54d579b49d--58z84-eth0" Sep 12 17:44:06.264832 containerd[1566]: time="2025-09-12T17:44:06.264436556Z" level=info msg="connecting to shim 80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b" address="unix:///run/containerd/s/920135a87dcaad703eb573d69905b37ac6da726c38b72df4e2d04d15ebba5045" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:06.282985 systemd[1]: Started cri-containerd-80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b.scope - libcontainer container 80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b. 
Sep 12 17:44:06.345001 containerd[1566]: time="2025-09-12T17:44:06.344969854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-58z84,Uid:42f5189e-7e63-4633-b8a4-4e4423d9882b,Namespace:calico-system,Attempt:0,} returns sandbox id \"80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b\"" Sep 12 17:44:06.368125 systemd-networkd[1466]: calief10080f9ab: Link UP Sep 12 17:44:06.370448 systemd-networkd[1466]: calief10080f9ab: Gained carrier Sep 12 17:44:06.377918 kubelet[2767]: I0912 17:44:06.377841 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-gw76d" podStartSLOduration=38.376229483 podStartE2EDuration="38.376229483s" podCreationTimestamp="2025-09-12 17:43:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 17:44:06.375478435 +0000 UTC m=+43.359961646" watchObservedRunningTime="2025-09-12 17:44:06.376229483 +0000 UTC m=+43.360712683" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.154 [INFO][4424] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.168 [INFO][4424] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0 csi-node-driver- calico-system f8cbe4d2-087a-4a06-9eb4-75af1bfa61da 722 0 2025-09-12 17:43:42 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 csi-node-driver-rjfp6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calief10080f9ab [] [] }} ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.168 [INFO][4424] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.199 [INFO][4449] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" HandleID="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Workload="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.199 [INFO][4449] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" HandleID="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Workload="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5010), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"csi-node-driver-rjfp6", "timestamp":"2025-09-12 17:44:06.19964941 
+0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.200 [INFO][4449] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.228 [INFO][4449] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.228 [INFO][4449] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.305 [INFO][4449] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.312 [INFO][4449] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.326 [INFO][4449] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.331 [INFO][4449] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.337 [INFO][4449] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.337 [INFO][4449] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.338 [INFO][4449] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9 Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.346 [INFO][4449] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 handle="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.359 [INFO][4449] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.197/26] block=192.168.3.192/26 handle="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.360 [INFO][4449] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.197/26] handle="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.361 [INFO][4449] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:44:06.391597 containerd[1566]: 2025-09-12 17:44:06.361 [INFO][4449] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.197/26] IPv6=[] ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" HandleID="k8s-pod-network.b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Workload="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" Sep 12 17:44:06.393402 containerd[1566]: 2025-09-12 17:44:06.363 [INFO][4424] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f8cbe4d2-087a-4a06-9eb4-75af1bfa61da", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"csi-node-driver-rjfp6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.3.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calief10080f9ab", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:06.393402 containerd[1566]: 2025-09-12 17:44:06.363 [INFO][4424] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.197/32] ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" Sep 12 17:44:06.393402 containerd[1566]: 2025-09-12 17:44:06.364 [INFO][4424] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calief10080f9ab ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" Sep 12 17:44:06.393402 containerd[1566]: 2025-09-12 17:44:06.371 [INFO][4424] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" Sep 12 17:44:06.393402 containerd[1566]: 2025-09-12 17:44:06.371 [INFO][4424] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f8cbe4d2-087a-4a06-9eb4-75af1bfa61da", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9", Pod:"csi-node-driver-rjfp6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.3.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calief10080f9ab", MAC:"4e:7b:fe:8c:03:18", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:06.393402 containerd[1566]: 2025-09-12 17:44:06.387 [INFO][4424] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" Namespace="calico-system" Pod="csi-node-driver-rjfp6" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-csi--node--driver--rjfp6-eth0" Sep 12 17:44:06.416860 containerd[1566]: time="2025-09-12T17:44:06.414863302Z" level=info msg="connecting to shim b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9" address="unix:///run/containerd/s/c43f886f1243937599002dc7d199869e958da4d66b4c9a92d4ea67de5b4ceed9" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:06.438943 systemd[1]: Started cri-containerd-b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9.scope - libcontainer container b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9. 
Sep 12 17:44:06.462733 containerd[1566]: time="2025-09-12T17:44:06.462697551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rjfp6,Uid:f8cbe4d2-087a-4a06-9eb4-75af1bfa61da,Namespace:calico-system,Attempt:0,} returns sandbox id \"b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9\"" Sep 12 17:44:06.779994 systemd-networkd[1466]: cali135f0a86e77: Gained IPv6LL Sep 12 17:44:07.932467 systemd-networkd[1466]: calief10080f9ab: Gained IPv6LL Sep 12 17:44:08.060002 systemd-networkd[1466]: caliaf7f3079ee7: Gained IPv6LL Sep 12 17:44:08.111878 containerd[1566]: time="2025-09-12T17:44:08.111842658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-kkmkw,Uid:fc215e0e-6746-46f5-990e-f6284f6589d7,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:44:08.112830 containerd[1566]: time="2025-09-12T17:44:08.112395825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59859d54f6-jtjgv,Uid:20907fe9-b0fe-40c2-a2f7-e34e4e300a17,Namespace:calico-system,Attempt:0,}" Sep 12 17:44:08.250470 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3799852014.mount: Deactivated successfully. Sep 12 17:44:08.261648 systemd-networkd[1466]: calie315a882d79: Link UP Sep 12 17:44:08.262772 systemd-networkd[1466]: calie315a882d79: Gained carrier Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.159 [INFO][4616] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.175 [INFO][4616] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0 calico-apiserver-689c5fcdd8- calico-apiserver fc215e0e-6746-46f5-990e-f6284f6589d7 825 0 2025-09-12 17:43:39 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:689c5fcdd8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 calico-apiserver-689c5fcdd8-kkmkw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie315a882d79 [] [] }} ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.175 [INFO][4616] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.213 [INFO][4644] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" HandleID="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.215 [INFO][4644] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" 
HandleID="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5820), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"calico-apiserver-689c5fcdd8-kkmkw", "timestamp":"2025-09-12 17:44:08.213421783 +0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.215 [INFO][4644] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.215 [INFO][4644] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.215 [INFO][4644] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.222 [INFO][4644] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.226 [INFO][4644] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.230 [INFO][4644] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.231 [INFO][4644] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.234 [INFO][4644] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.234 [INFO][4644] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.235 [INFO][4644] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.239 [INFO][4644] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 handle="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.247 [INFO][4644] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.198/26] block=192.168.3.192/26 handle="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.248 [INFO][4644] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.198/26] handle="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.248 [INFO][4644] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:44:08.277223 containerd[1566]: 2025-09-12 17:44:08.249 [INFO][4644] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.198/26] IPv6=[] ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" HandleID="k8s-pod-network.29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" Sep 12 17:44:08.278220 containerd[1566]: 2025-09-12 17:44:08.254 [INFO][4616] cni-plugin/k8s.go 418: Populated endpoint ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0", GenerateName:"calico-apiserver-689c5fcdd8-", Namespace:"calico-apiserver", SelfLink:"", UID:"fc215e0e-6746-46f5-990e-f6284f6589d7", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689c5fcdd8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"calico-apiserver-689c5fcdd8-kkmkw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie315a882d79", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:08.278220 containerd[1566]: 2025-09-12 17:44:08.255 [INFO][4616] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.198/32] ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" Sep 12 17:44:08.278220 containerd[1566]: 2025-09-12 17:44:08.255 [INFO][4616] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie315a882d79 ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" Sep 12 17:44:08.278220 containerd[1566]: 2025-09-12 17:44:08.263 [INFO][4616] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" Sep 12 17:44:08.278220 containerd[1566]: 2025-09-12 
17:44:08.264 [INFO][4616] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0", GenerateName:"calico-apiserver-689c5fcdd8-", Namespace:"calico-apiserver", SelfLink:"", UID:"fc215e0e-6746-46f5-990e-f6284f6589d7", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689c5fcdd8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d", Pod:"calico-apiserver-689c5fcdd8-kkmkw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie315a882d79", MAC:"22:fd:e5:f7:79:1d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:08.278220 containerd[1566]: 2025-09-12 17:44:08.273 [INFO][4616] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-kkmkw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--kkmkw-eth0" Sep 12 17:44:08.279510 containerd[1566]: time="2025-09-12T17:44:08.279477233Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 17:44:08.279993 containerd[1566]: time="2025-09-12T17:44:08.279968550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:08.281698 containerd[1566]: time="2025-09-12T17:44:08.281660255Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:08.281969 containerd[1566]: time="2025-09-12T17:44:08.281944979Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 2.821021779s" Sep 12 17:44:08.282010 containerd[1566]: 
time="2025-09-12T17:44:08.281970099Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 17:44:08.282197 containerd[1566]: time="2025-09-12T17:44:08.282172112Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:08.284141 containerd[1566]: time="2025-09-12T17:44:08.284060840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 17:44:08.285177 containerd[1566]: time="2025-09-12T17:44:08.285060677Z" level=info msg="CreateContainer within sandbox \"a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 17:44:08.297523 containerd[1566]: time="2025-09-12T17:44:08.297482700Z" level=info msg="Container fe5c7c9f341dca23b90b7ee02447e6a668150c13da8a54a36296737ab438d5eb: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:08.310628 containerd[1566]: time="2025-09-12T17:44:08.310596772Z" level=info msg="CreateContainer within sandbox \"a141b8e64c521579aeb28acd15387c2afea8ab8c348811b530dd3b79d6339c5d\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"fe5c7c9f341dca23b90b7ee02447e6a668150c13da8a54a36296737ab438d5eb\"" Sep 12 17:44:08.317166 containerd[1566]: time="2025-09-12T17:44:08.317123122Z" level=info msg="StartContainer for \"fe5c7c9f341dca23b90b7ee02447e6a668150c13da8a54a36296737ab438d5eb\"" Sep 12 17:44:08.318221 containerd[1566]: time="2025-09-12T17:44:08.318175953Z" level=info msg="connecting to shim fe5c7c9f341dca23b90b7ee02447e6a668150c13da8a54a36296737ab438d5eb" address="unix:///run/containerd/s/ab09894f94c5a41d3346e0f7dd42767fd73840aa09e4a9519ecd88e0f6a1accb" protocol=ttrpc version=3 Sep 12 17:44:08.331013 containerd[1566]: time="2025-09-12T17:44:08.330964479Z" level=info msg="connecting to shim 29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d" address="unix:///run/containerd/s/d71a1058ef44d03122aa9adb34b479c626d39f13b2f608c8e6f32fadf2187a55" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:08.353937 systemd[1]: Started cri-containerd-fe5c7c9f341dca23b90b7ee02447e6a668150c13da8a54a36296737ab438d5eb.scope - libcontainer container fe5c7c9f341dca23b90b7ee02447e6a668150c13da8a54a36296737ab438d5eb. Sep 12 17:44:08.366075 systemd[1]: Started cri-containerd-29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d.scope - libcontainer container 29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d. 
Sep 12 17:44:08.374505 systemd-networkd[1466]: cali14e6b31c735: Link UP Sep 12 17:44:08.375193 systemd-networkd[1466]: cali14e6b31c735: Gained carrier Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.165 [INFO][4625] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.177 [INFO][4625] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0 calico-kube-controllers-59859d54f6- calico-system 20907fe9-b0fe-40c2-a2f7-e34e4e300a17 828 0 2025-09-12 17:43:42 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:59859d54f6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 calico-kube-controllers-59859d54f6-jtjgv eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali14e6b31c735 [] [] }} ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.177 [INFO][4625] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.215 [INFO][4642] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" HandleID="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.215 [INFO][4642] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" HandleID="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5920), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"calico-kube-controllers-59859d54f6-jtjgv", "timestamp":"2025-09-12 17:44:08.215268501 +0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.215 [INFO][4642] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.249 [INFO][4642] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.250 [INFO][4642] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.322 [INFO][4642] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.328 [INFO][4642] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.334 [INFO][4642] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.337 [INFO][4642] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.339 [INFO][4642] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.339 [INFO][4642] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.341 [INFO][4642] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492 Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.349 [INFO][4642] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 handle="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.361 [INFO][4642] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.199/26] block=192.168.3.192/26 handle="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.361 [INFO][4642] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.199/26] handle="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.361 [INFO][4642] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:44:08.400071 containerd[1566]: 2025-09-12 17:44:08.361 [INFO][4642] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.199/26] IPv6=[] ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" HandleID="k8s-pod-network.c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" Sep 12 17:44:08.402417 containerd[1566]: 2025-09-12 17:44:08.370 [INFO][4625] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0", GenerateName:"calico-kube-controllers-59859d54f6-", Namespace:"calico-system", SelfLink:"", UID:"20907fe9-b0fe-40c2-a2f7-e34e4e300a17", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59859d54f6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"calico-kube-controllers-59859d54f6-jtjgv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.3.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali14e6b31c735", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:08.402417 containerd[1566]: 2025-09-12 17:44:08.372 [INFO][4625] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.199/32] ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" Sep 12 17:44:08.402417 containerd[1566]: 2025-09-12 17:44:08.372 [INFO][4625] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali14e6b31c735 ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" Sep 12 17:44:08.402417 containerd[1566]: 2025-09-12 17:44:08.380 [INFO][4625] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" 
WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" Sep 12 17:44:08.402417 containerd[1566]: 2025-09-12 17:44:08.380 [INFO][4625] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0", GenerateName:"calico-kube-controllers-59859d54f6-", Namespace:"calico-system", SelfLink:"", UID:"20907fe9-b0fe-40c2-a2f7-e34e4e300a17", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"59859d54f6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492", Pod:"calico-kube-controllers-59859d54f6-jtjgv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.3.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali14e6b31c735", MAC:"92:eb:f8:1f:9b:49", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:08.402417 containerd[1566]: 2025-09-12 17:44:08.396 [INFO][4625] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" Namespace="calico-system" Pod="calico-kube-controllers-59859d54f6-jtjgv" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--kube--controllers--59859d54f6--jtjgv-eth0" Sep 12 17:44:08.435016 containerd[1566]: time="2025-09-12T17:44:08.434964531Z" level=info msg="connecting to shim c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492" address="unix:///run/containerd/s/934d86658bd082c4762dae6569c5fb0c1b7114164b29ed68d2e0692a10a149e4" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:08.467939 systemd[1]: Started cri-containerd-c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492.scope - libcontainer container c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492. 
Sep 12 17:44:08.494968 containerd[1566]: time="2025-09-12T17:44:08.494934157Z" level=info msg="StartContainer for \"fe5c7c9f341dca23b90b7ee02447e6a668150c13da8a54a36296737ab438d5eb\" returns successfully" Sep 12 17:44:08.495190 containerd[1566]: time="2025-09-12T17:44:08.495074692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-kkmkw,Uid:fc215e0e-6746-46f5-990e-f6284f6589d7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d\"" Sep 12 17:44:08.540742 containerd[1566]: time="2025-09-12T17:44:08.540502866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-59859d54f6-jtjgv,Uid:20907fe9-b0fe-40c2-a2f7-e34e4e300a17,Namespace:calico-system,Attempt:0,} returns sandbox id \"c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492\"" Sep 12 17:44:09.112572 containerd[1566]: time="2025-09-12T17:44:09.112425674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-vzvxw,Uid:32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07,Namespace:calico-apiserver,Attempt:0,}" Sep 12 17:44:09.234867 systemd-networkd[1466]: calie54fba84f27: Link UP Sep 12 17:44:09.235872 systemd-networkd[1466]: calie54fba84f27: Gained carrier Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.144 [INFO][4816] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.155 [INFO][4816] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0 calico-apiserver-689c5fcdd8- calico-apiserver 32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07 826 0 2025-09-12 17:43:39 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:689c5fcdd8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-1-0-d-1f6ac31256 calico-apiserver-689c5fcdd8-vzvxw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie54fba84f27 [] [] }} ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.155 [INFO][4816] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.187 [INFO][4827] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" HandleID="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.187 [INFO][4827] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" 
HandleID="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5950), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-1-0-d-1f6ac31256", "pod":"calico-apiserver-689c5fcdd8-vzvxw", "timestamp":"2025-09-12 17:44:09.18698737 +0000 UTC"}, Hostname:"ci-4426-1-0-d-1f6ac31256", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.187 [INFO][4827] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.187 [INFO][4827] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.187 [INFO][4827] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-1-0-d-1f6ac31256' Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.194 [INFO][4827] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.204 [INFO][4827] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.209 [INFO][4827] ipam/ipam.go 511: Trying affinity for 192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.211 [INFO][4827] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.214 [INFO][4827] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.192/26 host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.214 [INFO][4827] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.192/26 handle="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.215 [INFO][4827] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.221 [INFO][4827] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.192/26 handle="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.229 [INFO][4827] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.200/26] block=192.168.3.192/26 handle="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.229 [INFO][4827] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.200/26] handle="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" host="ci-4426-1-0-d-1f6ac31256" Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.229 [INFO][4827] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 17:44:09.251568 containerd[1566]: 2025-09-12 17:44:09.229 [INFO][4827] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.200/26] IPv6=[] ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" HandleID="k8s-pod-network.1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Workload="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" Sep 12 17:44:09.252975 containerd[1566]: 2025-09-12 17:44:09.232 [INFO][4816] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0", GenerateName:"calico-apiserver-689c5fcdd8-", Namespace:"calico-apiserver", SelfLink:"", UID:"32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689c5fcdd8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"", Pod:"calico-apiserver-689c5fcdd8-vzvxw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie54fba84f27", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:09.252975 containerd[1566]: 2025-09-12 17:44:09.232 [INFO][4816] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.200/32] ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" Sep 12 17:44:09.252975 containerd[1566]: 2025-09-12 17:44:09.232 [INFO][4816] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie54fba84f27 ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" Sep 12 17:44:09.252975 containerd[1566]: 2025-09-12 17:44:09.236 [INFO][4816] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" Sep 12 17:44:09.252975 containerd[1566]: 2025-09-12 
17:44:09.236 [INFO][4816] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0", GenerateName:"calico-apiserver-689c5fcdd8-", Namespace:"calico-apiserver", SelfLink:"", UID:"32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 17, 43, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"689c5fcdd8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-1-0-d-1f6ac31256", ContainerID:"1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a", Pod:"calico-apiserver-689c5fcdd8-vzvxw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie54fba84f27", MAC:"fe:80:dc:66:5d:f8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 17:44:09.252975 containerd[1566]: 2025-09-12 17:44:09.249 [INFO][4816] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" Namespace="calico-apiserver" Pod="calico-apiserver-689c5fcdd8-vzvxw" WorkloadEndpoint="ci--4426--1--0--d--1f6ac31256-k8s-calico--apiserver--689c5fcdd8--vzvxw-eth0" Sep 12 17:44:09.283210 containerd[1566]: time="2025-09-12T17:44:09.283177547Z" level=info msg="connecting to shim 1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" address="unix:///run/containerd/s/9b109670d6e0916546953588d61a36b1b11701740833dbfe31bd50ae59053673" namespace=k8s.io protocol=ttrpc version=3 Sep 12 17:44:09.314971 systemd[1]: Started cri-containerd-1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a.scope - libcontainer container 1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a. 
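Each "Added Mac, interface name, and active container ID" dump above packs the container ID, veth name and MAC into one very long line. When grepping through a journal like this, a few regular expressions recover the interesting fields; the sample input below is abbreviated from the entry above:

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Abbreviated copy of the endpoint entry above.
	line := `ContainerID="1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a" ` +
		`Pod="calico-apiserver-689c5fcdd8-vzvxw" InterfaceName:"calie54fba84f27", MAC:"fe:80:dc:66:5d:f8"`

	fields := map[string]*regexp.Regexp{
		"container": regexp.MustCompile(`ContainerID="([0-9a-f]{64})"`),
		"pod":       regexp.MustCompile(`Pod="([^"]+)"`),
		"iface":     regexp.MustCompile(`InterfaceName:"(cali[0-9a-f]+)"`),
		"mac":       regexp.MustCompile(`MAC:"([0-9a-f:]{17})"`),
	}
	for name, re := range fields {
		if m := re.FindStringSubmatch(line); m != nil {
			fmt.Printf("%s = %s\n", name, m[1])
		}
	}
}
```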
Sep 12 17:44:09.357629 containerd[1566]: time="2025-09-12T17:44:09.357581635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-689c5fcdd8-vzvxw,Uid:32ac1f6e-8e4c-423b-a1e9-9426f9eb5e07,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a\"" Sep 12 17:44:09.723952 systemd-networkd[1466]: calie315a882d79: Gained IPv6LL Sep 12 17:44:10.300146 systemd-networkd[1466]: cali14e6b31c735: Gained IPv6LL Sep 12 17:44:11.196167 systemd-networkd[1466]: calie54fba84f27: Gained IPv6LL Sep 12 17:44:11.259496 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount117542082.mount: Deactivated successfully. Sep 12 17:44:11.294283 kubelet[2767]: I0912 17:44:11.294242 2767 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:44:11.342873 kubelet[2767]: I0912 17:44:11.342739 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7d6f486c97-5ghhc" podStartSLOduration=4.429203834 podStartE2EDuration="9.342724743s" podCreationTimestamp="2025-09-12 17:44:02 +0000 UTC" firstStartedPulling="2025-09-12 17:44:03.369591043 +0000 UTC m=+40.354074244" lastFinishedPulling="2025-09-12 17:44:08.283111952 +0000 UTC m=+45.267595153" observedRunningTime="2025-09-12 17:44:09.389047702 +0000 UTC m=+46.373530913" watchObservedRunningTime="2025-09-12 17:44:11.342724743 +0000 UTC m=+48.327207944" Sep 12 17:44:11.674191 containerd[1566]: time="2025-09-12T17:44:11.674146896Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:11.675631 containerd[1566]: time="2025-09-12T17:44:11.675539448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 17:44:11.676619 containerd[1566]: time="2025-09-12T17:44:11.676580720Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:11.678912 containerd[1566]: time="2025-09-12T17:44:11.678869291Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:11.679868 containerd[1566]: time="2025-09-12T17:44:11.679836017Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.395727144s" Sep 12 17:44:11.679911 containerd[1566]: time="2025-09-12T17:44:11.679868761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 17:44:11.681136 containerd[1566]: time="2025-09-12T17:44:11.681108909Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 17:44:11.684007 containerd[1566]: time="2025-09-12T17:44:11.683377801Z" level=info msg="CreateContainer within sandbox \"80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 17:44:11.693011 
containerd[1566]: time="2025-09-12T17:44:11.691744649Z" level=info msg="Container cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:11.705109 containerd[1566]: time="2025-09-12T17:44:11.704909890Z" level=info msg="CreateContainer within sandbox \"80d739b7d59fee822875bb5f9bb70a17bbbd27ee3b35dc43e66d2f9c6910964b\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\"" Sep 12 17:44:11.709668 containerd[1566]: time="2025-09-12T17:44:11.709586997Z" level=info msg="StartContainer for \"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\"" Sep 12 17:44:11.711096 containerd[1566]: time="2025-09-12T17:44:11.711007024Z" level=info msg="connecting to shim cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6" address="unix:///run/containerd/s/920135a87dcaad703eb573d69905b37ac6da726c38b72df4e2d04d15ebba5045" protocol=ttrpc version=3 Sep 12 17:44:11.738936 systemd[1]: Started cri-containerd-cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6.scope - libcontainer container cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6. Sep 12 17:44:11.787604 containerd[1566]: time="2025-09-12T17:44:11.787552860Z" level=info msg="StartContainer for \"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" returns successfully" Sep 12 17:44:12.110080 systemd-networkd[1466]: vxlan.calico: Link UP Sep 12 17:44:12.110101 systemd-networkd[1466]: vxlan.calico: Gained carrier Sep 12 17:44:12.546526 containerd[1566]: time="2025-09-12T17:44:12.546448139Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"cf0909efff21ccc907c4ae6bcb584ff0f748bbea68c0d854a3d9a6a6850f2582\" pid:5099 exit_status:1 exited_at:{seconds:1757699052 nanos:543192036}" Sep 12 17:44:13.435979 systemd-networkd[1466]: vxlan.calico: Gained IPv6LL Sep 12 17:44:13.558649 containerd[1566]: time="2025-09-12T17:44:13.558572647Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:13.595802 containerd[1566]: time="2025-09-12T17:44:13.560400818Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 17:44:13.595981 containerd[1566]: time="2025-09-12T17:44:13.562210714Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:13.595981 containerd[1566]: time="2025-09-12T17:44:13.572582122Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.891446512s" Sep 12 17:44:13.595981 containerd[1566]: time="2025-09-12T17:44:13.595960432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 17:44:13.596510 containerd[1566]: time="2025-09-12T17:44:13.572922301Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"bb0e19f6ec5b7ed62632c2cd099d74f2e286a5ec6baceef36c7d24cd4c44546c\" pid:5152 exit_status:1 exited_at:{seconds:1757699053 nanos:567951763}" Sep 12 17:44:13.597875 containerd[1566]: time="2025-09-12T17:44:13.597824403Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:13.599125 containerd[1566]: time="2025-09-12T17:44:13.598732372Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:44:13.600353 containerd[1566]: time="2025-09-12T17:44:13.600327291Z" level=info msg="CreateContainer within sandbox \"b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 17:44:13.653498 containerd[1566]: time="2025-09-12T17:44:13.653467308Z" level=info msg="Container 8928b173f7ded1445305cec565ceaf7b74e567662886c330dcf5a4ac636b9d0f: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:13.659598 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2912775863.mount: Deactivated successfully. Sep 12 17:44:13.681136 containerd[1566]: time="2025-09-12T17:44:13.681091332Z" level=info msg="CreateContainer within sandbox \"b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"8928b173f7ded1445305cec565ceaf7b74e567662886c330dcf5a4ac636b9d0f\"" Sep 12 17:44:13.681958 containerd[1566]: time="2025-09-12T17:44:13.681850282Z" level=info msg="StartContainer for \"8928b173f7ded1445305cec565ceaf7b74e567662886c330dcf5a4ac636b9d0f\"" Sep 12 17:44:13.683002 containerd[1566]: time="2025-09-12T17:44:13.682973298Z" level=info msg="connecting to shim 8928b173f7ded1445305cec565ceaf7b74e567662886c330dcf5a4ac636b9d0f" address="unix:///run/containerd/s/c43f886f1243937599002dc7d199869e958da4d66b4c9a92d4ea67de5b4ceed9" protocol=ttrpc version=3 Sep 12 17:44:13.714072 systemd[1]: Started cri-containerd-8928b173f7ded1445305cec565ceaf7b74e567662886c330dcf5a4ac636b9d0f.scope - libcontainer container 8928b173f7ded1445305cec565ceaf7b74e567662886c330dcf5a4ac636b9d0f. 
Sep 12 17:44:13.749730 containerd[1566]: time="2025-09-12T17:44:13.749655464Z" level=info msg="StartContainer for \"8928b173f7ded1445305cec565ceaf7b74e567662886c330dcf5a4ac636b9d0f\" returns successfully" Sep 12 17:44:14.493039 containerd[1566]: time="2025-09-12T17:44:14.492966821Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"ff85ba203e737d21a9264b82ee14bb515d84031c1a580ec809004adb31455e30\" pid:5208 exit_status:1 exited_at:{seconds:1757699054 nanos:492660077}" Sep 12 17:44:17.249992 containerd[1566]: time="2025-09-12T17:44:17.249928907Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:17.251083 containerd[1566]: time="2025-09-12T17:44:17.250915671Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 17:44:17.252416 containerd[1566]: time="2025-09-12T17:44:17.252362735Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:17.254484 containerd[1566]: time="2025-09-12T17:44:17.254431419Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:17.255393 containerd[1566]: time="2025-09-12T17:44:17.255022259Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.656262143s" Sep 12 17:44:17.255393 containerd[1566]: time="2025-09-12T17:44:17.255059520Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:44:17.256217 containerd[1566]: time="2025-09-12T17:44:17.256200344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 17:44:17.259372 containerd[1566]: time="2025-09-12T17:44:17.259293304Z" level=info msg="CreateContainer within sandbox \"29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:44:17.268036 containerd[1566]: time="2025-09-12T17:44:17.268004391Z" level=info msg="Container 50ea34ecaf32247bd97ef507651793f746bdce47500eaa613d1f738d8e6e157e: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:17.289447 containerd[1566]: time="2025-09-12T17:44:17.289403351Z" level=info msg="CreateContainer within sandbox \"29b1b8210013840394d7ab30380266db160c045778225ffc5d15d3346a83b70d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"50ea34ecaf32247bd97ef507651793f746bdce47500eaa613d1f738d8e6e157e\"" Sep 12 17:44:17.290912 containerd[1566]: time="2025-09-12T17:44:17.290884390Z" level=info msg="StartContainer for \"50ea34ecaf32247bd97ef507651793f746bdce47500eaa613d1f738d8e6e157e\"" Sep 12 17:44:17.291804 containerd[1566]: time="2025-09-12T17:44:17.291774509Z" level=info msg="connecting to shim 
50ea34ecaf32247bd97ef507651793f746bdce47500eaa613d1f738d8e6e157e" address="unix:///run/containerd/s/d71a1058ef44d03122aa9adb34b479c626d39f13b2f608c8e6f32fadf2187a55" protocol=ttrpc version=3 Sep 12 17:44:17.340123 systemd[1]: Started cri-containerd-50ea34ecaf32247bd97ef507651793f746bdce47500eaa613d1f738d8e6e157e.scope - libcontainer container 50ea34ecaf32247bd97ef507651793f746bdce47500eaa613d1f738d8e6e157e. Sep 12 17:44:17.409447 containerd[1566]: time="2025-09-12T17:44:17.409307695Z" level=info msg="StartContainer for \"50ea34ecaf32247bd97ef507651793f746bdce47500eaa613d1f738d8e6e157e\" returns successfully" Sep 12 17:44:17.453838 kubelet[2767]: I0912 17:44:17.452856 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-58z84" podStartSLOduration=31.121504776 podStartE2EDuration="36.45279685s" podCreationTimestamp="2025-09-12 17:43:41 +0000 UTC" firstStartedPulling="2025-09-12 17:44:06.349281655 +0000 UTC m=+43.333764846" lastFinishedPulling="2025-09-12 17:44:11.680573719 +0000 UTC m=+48.665056920" observedRunningTime="2025-09-12 17:44:12.43789 +0000 UTC m=+49.422373211" watchObservedRunningTime="2025-09-12 17:44:17.45279685 +0000 UTC m=+54.437280051" Sep 12 17:44:18.283956 containerd[1566]: time="2025-09-12T17:44:18.283895344Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"8d216ab12d762e8f9fd05cad2d5fe27ea6585aa70000e05c2ad34382d8fcf7a8\" pid:5289 exited_at:{seconds:1757699058 nanos:267071138}" Sep 12 17:44:18.604096 kubelet[2767]: I0912 17:44:18.603966 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-689c5fcdd8-kkmkw" podStartSLOduration=30.843854884 podStartE2EDuration="39.603950422s" podCreationTimestamp="2025-09-12 17:43:39 +0000 UTC" firstStartedPulling="2025-09-12 17:44:08.495946729 +0000 UTC m=+45.480429921" lastFinishedPulling="2025-09-12 17:44:17.256042258 +0000 UTC m=+54.240525459" observedRunningTime="2025-09-12 17:44:17.45384788 +0000 UTC m=+54.438331081" watchObservedRunningTime="2025-09-12 17:44:18.603950422 +0000 UTC m=+55.588433624" Sep 12 17:44:20.839906 containerd[1566]: time="2025-09-12T17:44:20.839800498Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:20.841102 containerd[1566]: time="2025-09-12T17:44:20.841078971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 17:44:20.843430 containerd[1566]: time="2025-09-12T17:44:20.842654637Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:20.844529 containerd[1566]: time="2025-09-12T17:44:20.844503128Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:20.845091 containerd[1566]: time="2025-09-12T17:44:20.845010255Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.588698598s" Sep 12 17:44:20.845188 containerd[1566]: time="2025-09-12T17:44:20.845154454Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 17:44:20.846045 containerd[1566]: time="2025-09-12T17:44:20.846019981Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 17:44:20.883862 containerd[1566]: time="2025-09-12T17:44:20.883823605Z" level=info msg="CreateContainer within sandbox \"c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 17:44:20.895452 containerd[1566]: time="2025-09-12T17:44:20.895407495Z" level=info msg="Container d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:20.908787 containerd[1566]: time="2025-09-12T17:44:20.908742627Z" level=info msg="CreateContainer within sandbox \"c9c77d09b6bc2be9ba5113dc14646e13af248513de967fcbb1bb9c73ebdd3492\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\"" Sep 12 17:44:20.909396 containerd[1566]: time="2025-09-12T17:44:20.909306573Z" level=info msg="StartContainer for \"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\"" Sep 12 17:44:20.911298 containerd[1566]: time="2025-09-12T17:44:20.911248805Z" level=info msg="connecting to shim d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540" address="unix:///run/containerd/s/934d86658bd082c4762dae6569c5fb0c1b7114164b29ed68d2e0692a10a149e4" protocol=ttrpc version=3 Sep 12 17:44:20.930934 systemd[1]: Started cri-containerd-d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540.scope - libcontainer container d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540. 
Sep 12 17:44:20.979511 containerd[1566]: time="2025-09-12T17:44:20.979406512Z" level=info msg="StartContainer for \"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\" returns successfully" Sep 12 17:44:21.330789 containerd[1566]: time="2025-09-12T17:44:21.330719435Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:21.331920 containerd[1566]: time="2025-09-12T17:44:21.331888586Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 17:44:21.333485 containerd[1566]: time="2025-09-12T17:44:21.333451987Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 487.400405ms" Sep 12 17:44:21.333485 containerd[1566]: time="2025-09-12T17:44:21.333482246Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 17:44:21.334469 containerd[1566]: time="2025-09-12T17:44:21.334441572Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 17:44:21.337556 containerd[1566]: time="2025-09-12T17:44:21.337493128Z" level=info msg="CreateContainer within sandbox \"1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 17:44:21.344529 containerd[1566]: time="2025-09-12T17:44:21.344464072Z" level=info msg="Container d47ad94a1db03dc71a0d9f62945fa3718c33490713e7ded9b59ae80ba08242f5: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:21.366629 containerd[1566]: time="2025-09-12T17:44:21.366589725Z" level=info msg="CreateContainer within sandbox \"1a3388a39586217d14c2f60d722bba7f2ca7d8f1fb74ce30e5ac06ab1b2b212a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d47ad94a1db03dc71a0d9f62945fa3718c33490713e7ded9b59ae80ba08242f5\"" Sep 12 17:44:21.368637 containerd[1566]: time="2025-09-12T17:44:21.368605717Z" level=info msg="StartContainer for \"d47ad94a1db03dc71a0d9f62945fa3718c33490713e7ded9b59ae80ba08242f5\"" Sep 12 17:44:21.369665 containerd[1566]: time="2025-09-12T17:44:21.369634559Z" level=info msg="connecting to shim d47ad94a1db03dc71a0d9f62945fa3718c33490713e7ded9b59ae80ba08242f5" address="unix:///run/containerd/s/9b109670d6e0916546953588d61a36b1b11701740833dbfe31bd50ae59053673" protocol=ttrpc version=3 Sep 12 17:44:21.423971 systemd[1]: Started cri-containerd-d47ad94a1db03dc71a0d9f62945fa3718c33490713e7ded9b59ae80ba08242f5.scope - libcontainer container d47ad94a1db03dc71a0d9f62945fa3718c33490713e7ded9b59ae80ba08242f5. 
Sep 12 17:44:21.486569 containerd[1566]: time="2025-09-12T17:44:21.486466513Z" level=info msg="StartContainer for \"d47ad94a1db03dc71a0d9f62945fa3718c33490713e7ded9b59ae80ba08242f5\" returns successfully" Sep 12 17:44:21.537359 containerd[1566]: time="2025-09-12T17:44:21.537229450Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\" id:\"7dc8ab14ed7d8696c7c6c4188889d56013cd55d2e1bf9e587185e4d445551926\" pid:5388 exited_at:{seconds:1757699061 nanos:536899174}" Sep 12 17:44:21.602624 kubelet[2767]: I0912 17:44:21.602494 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-59859d54f6-jtjgv" podStartSLOduration=27.299100199 podStartE2EDuration="39.602479661s" podCreationTimestamp="2025-09-12 17:43:42 +0000 UTC" firstStartedPulling="2025-09-12 17:44:08.542556287 +0000 UTC m=+45.527039488" lastFinishedPulling="2025-09-12 17:44:20.845935749 +0000 UTC m=+57.830418950" observedRunningTime="2025-09-12 17:44:21.475152687 +0000 UTC m=+58.459635889" watchObservedRunningTime="2025-09-12 17:44:21.602479661 +0000 UTC m=+58.586962863" Sep 12 17:44:22.470283 kubelet[2767]: I0912 17:44:22.470047 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-689c5fcdd8-vzvxw" podStartSLOduration=31.494534442 podStartE2EDuration="43.470029667s" podCreationTimestamp="2025-09-12 17:43:39 +0000 UTC" firstStartedPulling="2025-09-12 17:44:09.358774576 +0000 UTC m=+46.343257787" lastFinishedPulling="2025-09-12 17:44:21.334269801 +0000 UTC m=+58.318753012" observedRunningTime="2025-09-12 17:44:22.467133013 +0000 UTC m=+59.451616234" watchObservedRunningTime="2025-09-12 17:44:22.470029667 +0000 UTC m=+59.454512868" Sep 12 17:44:23.472110 kubelet[2767]: I0912 17:44:23.472057 2767 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:44:23.494990 containerd[1566]: time="2025-09-12T17:44:23.494920991Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:23.495986 containerd[1566]: time="2025-09-12T17:44:23.495902159Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 17:44:23.497023 containerd[1566]: time="2025-09-12T17:44:23.496992807Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:23.498759 containerd[1566]: time="2025-09-12T17:44:23.498742071Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 17:44:23.499272 containerd[1566]: time="2025-09-12T17:44:23.499186216Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.16471693s" Sep 12 17:44:23.499272 containerd[1566]: time="2025-09-12T17:44:23.499216344Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 17:44:23.503778 containerd[1566]: time="2025-09-12T17:44:23.503738582Z" level=info msg="CreateContainer within sandbox \"b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 17:44:23.517765 containerd[1566]: time="2025-09-12T17:44:23.515992392Z" level=info msg="Container 026e3da5e29fb80f70a3e84423f895599a72a950727c426ee6553a9cf3b5800f: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:44:23.527256 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount794654483.mount: Deactivated successfully. Sep 12 17:44:23.531600 containerd[1566]: time="2025-09-12T17:44:23.531558284Z" level=info msg="CreateContainer within sandbox \"b4ed8fa092e15a13b7a91aa7efe219d28dc3c7278adb5e111555168507b6c2a9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"026e3da5e29fb80f70a3e84423f895599a72a950727c426ee6553a9cf3b5800f\"" Sep 12 17:44:23.532318 containerd[1566]: time="2025-09-12T17:44:23.532269863Z" level=info msg="StartContainer for \"026e3da5e29fb80f70a3e84423f895599a72a950727c426ee6553a9cf3b5800f\"" Sep 12 17:44:23.533589 containerd[1566]: time="2025-09-12T17:44:23.533549895Z" level=info msg="connecting to shim 026e3da5e29fb80f70a3e84423f895599a72a950727c426ee6553a9cf3b5800f" address="unix:///run/containerd/s/c43f886f1243937599002dc7d199869e958da4d66b4c9a92d4ea67de5b4ceed9" protocol=ttrpc version=3 Sep 12 17:44:23.560795 systemd[1]: Started cri-containerd-026e3da5e29fb80f70a3e84423f895599a72a950727c426ee6553a9cf3b5800f.scope - libcontainer container 026e3da5e29fb80f70a3e84423f895599a72a950727c426ee6553a9cf3b5800f. 
Sep 12 17:44:23.628687 containerd[1566]: time="2025-09-12T17:44:23.628176480Z" level=info msg="StartContainer for \"026e3da5e29fb80f70a3e84423f895599a72a950727c426ee6553a9cf3b5800f\" returns successfully" Sep 12 17:44:24.422770 kubelet[2767]: I0912 17:44:24.419773 2767 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 17:44:24.422770 kubelet[2767]: I0912 17:44:24.422765 2767 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 17:44:24.519033 kubelet[2767]: I0912 17:44:24.518514 2767 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-rjfp6" podStartSLOduration=25.479814098 podStartE2EDuration="42.518490284s" podCreationTimestamp="2025-09-12 17:43:42 +0000 UTC" firstStartedPulling="2025-09-12 17:44:06.463602298 +0000 UTC m=+43.448085499" lastFinishedPulling="2025-09-12 17:44:23.502278484 +0000 UTC m=+60.486761685" observedRunningTime="2025-09-12 17:44:24.51709391 +0000 UTC m=+61.501577121" watchObservedRunningTime="2025-09-12 17:44:24.518490284 +0000 UTC m=+61.502973495" Sep 12 17:44:34.433059 containerd[1566]: time="2025-09-12T17:44:34.433011143Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"59f409f6ce4c1d4c96b8076b113c88e4aec799dcab3e17dd24e113a9a464a3c2\" pid:5473 exited_at:{seconds:1757699074 nanos:431715821}" Sep 12 17:44:41.149054 containerd[1566]: time="2025-09-12T17:44:41.149001668Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\" id:\"5bc13e3b2709ec6a314ccb6d43ff9d051b0be2844b62cefd644d34519327f8bc\" pid:5498 exited_at:{seconds:1757699081 nanos:148587184}" Sep 12 17:44:42.363707 kubelet[2767]: I0912 17:44:42.363354 2767 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 17:44:44.644247 containerd[1566]: time="2025-09-12T17:44:44.644082559Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"06f544c3fef424af750918766bbc72e23edcd293dae8ffc709deb43fe32db3e1\" pid:5522 exited_at:{seconds:1757699084 nanos:643730338}" Sep 12 17:44:51.507065 containerd[1566]: time="2025-09-12T17:44:51.507004344Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\" id:\"edaaaaa993943df21e36140bc878119256420a25023ad8cdf793763f5c34873f\" pid:5547 exited_at:{seconds:1757699091 nanos:506205192}" Sep 12 17:45:01.806390 systemd[1]: Started sshd@7-95.216.139.29:22-139.178.68.195:34892.service - OpenSSH per-connection server daemon (139.178.68.195:34892). Sep 12 17:45:02.950566 sshd[5576]: Accepted publickey for core from 139.178.68.195 port 34892 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:02.954485 sshd-session[5576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:02.961550 systemd-logind[1540]: New session 8 of user core. Sep 12 17:45:02.966951 systemd[1]: Started session-8.scope - Session 8 of User core. 
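The kubelet entries above show the csi.tigera.io driver registering itself over /var/lib/kubelet/plugins/csi.tigera.io/csi.sock right after the node-driver-registrar container starts. The registrar side of that handshake is essentially a gRPC server bound to a Unix socket; the sketch below shows only the socket setup (standard library only, no gRPC, path taken from the log):

```go
package main

import (
	"fmt"
	"net"
	"os"
)

func main() {
	// Socket path from the kubelet entries above; on a real node the CSI driver pod owns it.
	const sock = "/var/lib/kubelet/plugins/csi.tigera.io/csi.sock"

	_ = os.Remove(sock) // clear a stale socket from a previous run, if any
	l, err := net.Listen("unix", sock)
	if err != nil {
		fmt.Println("cannot listen:", err)
		return
	}
	defer l.Close()
	fmt.Println("listening on", l.Addr())
	// A real driver would now serve the CSI gRPC services on l; omitted here.
}
```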
Sep 12 17:45:04.195070 sshd[5579]: Connection closed by 139.178.68.195 port 34892 Sep 12 17:45:04.195749 sshd-session[5576]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:04.206378 systemd[1]: sshd@7-95.216.139.29:22-139.178.68.195:34892.service: Deactivated successfully. Sep 12 17:45:04.209582 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 17:45:04.214259 systemd-logind[1540]: Session 8 logged out. Waiting for processes to exit. Sep 12 17:45:04.215610 systemd-logind[1540]: Removed session 8. Sep 12 17:45:04.616456 containerd[1566]: time="2025-09-12T17:45:04.616406076Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"782caac1122a2c2e281e657ebd8de85aa1504c31641c1e3e3b195e403d5ea758\" pid:5606 exited_at:{seconds:1757699104 nanos:592011615}" Sep 12 17:45:09.346036 systemd[1]: Started sshd@8-95.216.139.29:22-139.178.68.195:34908.service - OpenSSH per-connection server daemon (139.178.68.195:34908). Sep 12 17:45:10.359613 sshd[5619]: Accepted publickey for core from 139.178.68.195 port 34908 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:10.363423 sshd-session[5619]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:10.368523 systemd-logind[1540]: New session 9 of user core. Sep 12 17:45:10.373257 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 17:45:11.273480 sshd[5622]: Connection closed by 139.178.68.195 port 34908 Sep 12 17:45:11.274436 sshd-session[5619]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:11.277997 systemd[1]: sshd@8-95.216.139.29:22-139.178.68.195:34908.service: Deactivated successfully. Sep 12 17:45:11.279875 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 17:45:11.280687 systemd-logind[1540]: Session 9 logged out. Waiting for processes to exit. Sep 12 17:45:11.282612 systemd-logind[1540]: Removed session 9. Sep 12 17:45:14.596337 containerd[1566]: time="2025-09-12T17:45:14.590325879Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"6e45332acd6447774a2515e821dabc9778dcb38db42c2ad5e7a9c2b4cd843740\" pid:5647 exited_at:{seconds:1757699114 nanos:589965014}" Sep 12 17:45:16.443094 systemd[1]: Started sshd@9-95.216.139.29:22-139.178.68.195:44476.service - OpenSSH per-connection server daemon (139.178.68.195:44476). Sep 12 17:45:17.454150 sshd[5658]: Accepted publickey for core from 139.178.68.195 port 44476 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:17.458051 sshd-session[5658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:17.465303 systemd-logind[1540]: New session 10 of user core. Sep 12 17:45:17.471017 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 17:45:18.245258 sshd[5661]: Connection closed by 139.178.68.195 port 44476 Sep 12 17:45:18.245944 sshd-session[5658]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:18.249476 systemd-logind[1540]: Session 10 logged out. Waiting for processes to exit. Sep 12 17:45:18.251057 systemd[1]: sshd@9-95.216.139.29:22-139.178.68.195:44476.service: Deactivated successfully. Sep 12 17:45:18.253878 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 17:45:18.256713 systemd-logind[1540]: Removed session 10. 
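The pod_startup_latency_tracker entries earlier in this log (for example csi-node-driver-rjfp6 above) print their pull timestamps in Go's default time format, including a monotonic-clock suffix ("m=+..."). Parsing them back and differencing them reproduces the reported numbers; a small sketch using the csi-node-driver values from the log:

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// parseKubeletTime parses the default Go time format used in the kubelet entries above,
// after dropping the monotonic suffix ("m=+43.448085499") which time.Parse does not accept.
func parseKubeletTime(s string) (time.Time, error) {
	if i := strings.Index(s, " m=+"); i >= 0 {
		s = s[:i]
	}
	return time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
}

func main() {
	// firstStartedPulling / lastFinishedPulling for csi-node-driver-rjfp6, copied from the log.
	started, _ := parseKubeletTime("2025-09-12 17:44:06.463602298 +0000 UTC m=+43.448085499")
	finished, _ := parseKubeletTime("2025-09-12 17:44:23.502278484 +0000 UTC m=+60.486761685")

	// ≈17.04s, which is exactly the gap between podStartE2EDuration (42.52s)
	// and podStartSLOduration (25.48s) reported for that pod.
	fmt.Println("image pulling took", finished.Sub(started))
}
```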
Sep 12 17:45:18.277651 containerd[1566]: time="2025-09-12T17:45:18.271197346Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"65b3ff78a07a31c1cd37d858ca35615137acae1c97d90ecd4fe2fc0e770cce03\" pid:5683 exited_at:{seconds:1757699118 nanos:270748396}" Sep 12 17:45:18.413089 systemd[1]: Started sshd@10-95.216.139.29:22-139.178.68.195:44492.service - OpenSSH per-connection server daemon (139.178.68.195:44492). Sep 12 17:45:19.414362 sshd[5696]: Accepted publickey for core from 139.178.68.195 port 44492 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:19.416200 sshd-session[5696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:19.421502 systemd-logind[1540]: New session 11 of user core. Sep 12 17:45:19.427252 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 17:45:20.297947 sshd[5699]: Connection closed by 139.178.68.195 port 44492 Sep 12 17:45:20.300135 sshd-session[5696]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:20.307828 systemd[1]: sshd@10-95.216.139.29:22-139.178.68.195:44492.service: Deactivated successfully. Sep 12 17:45:20.314194 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 17:45:20.316537 systemd-logind[1540]: Session 11 logged out. Waiting for processes to exit. Sep 12 17:45:20.320831 systemd-logind[1540]: Removed session 11. Sep 12 17:45:20.503927 systemd[1]: Started sshd@11-95.216.139.29:22-139.178.68.195:55792.service - OpenSSH per-connection server daemon (139.178.68.195:55792). Sep 12 17:45:21.518449 containerd[1566]: time="2025-09-12T17:45:21.518402697Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\" id:\"53c18519faff6182796e4e93ba2ee4d53e9532f1ca7fbedc92944c0b8bd4be20\" pid:5729 exited_at:{seconds:1757699121 nanos:518013204}" Sep 12 17:45:21.606894 sshd[5713]: Accepted publickey for core from 139.178.68.195 port 55792 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:21.608720 sshd-session[5713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:21.616532 systemd-logind[1540]: New session 12 of user core. Sep 12 17:45:21.621988 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 17:45:22.467422 sshd[5737]: Connection closed by 139.178.68.195 port 55792 Sep 12 17:45:22.468988 sshd-session[5713]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:22.472353 systemd[1]: sshd@11-95.216.139.29:22-139.178.68.195:55792.service: Deactivated successfully. Sep 12 17:45:22.474346 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 17:45:22.475507 systemd-logind[1540]: Session 12 logged out. Waiting for processes to exit. Sep 12 17:45:22.477209 systemd-logind[1540]: Removed session 12. Sep 12 17:45:27.624507 systemd[1]: Started sshd@12-95.216.139.29:22-139.178.68.195:55806.service - OpenSSH per-connection server daemon (139.178.68.195:55806). Sep 12 17:45:28.620356 sshd[5751]: Accepted publickey for core from 139.178.68.195 port 55806 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:28.624160 sshd-session[5751]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:28.629782 systemd-logind[1540]: New session 13 of user core. Sep 12 17:45:28.634947 systemd[1]: Started session-13.scope - Session 13 of User core. 
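Every "Accepted publickey" line above identifies the key by an OpenSSH-style fingerprint: "SHA256:" followed by the unpadded base64 of the SHA-256 of the wire-format public key blob. A minimal sketch of producing that string; the key bytes below are placeholders, not the real key from this host:

```go
package main

import (
	"crypto/sha256"
	"encoding/base64"
	"fmt"
)

// fingerprint renders an OpenSSH-style SHA256 fingerprint for a public key blob
// (the binary wire-format key that sshd hashes).
func fingerprint(keyBlob []byte) string {
	sum := sha256.Sum256(keyBlob)
	return "SHA256:" + base64.RawStdEncoding.EncodeToString(sum[:])
}

func main() {
	// Placeholder bytes; a real caller would pass the decoded key from authorized_keys.
	fmt.Println(fingerprint([]byte("example-public-key-blob")))
}
```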
Sep 12 17:45:29.479012 sshd[5754]: Connection closed by 139.178.68.195 port 55806 Sep 12 17:45:29.482264 sshd-session[5751]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:29.485827 systemd[1]: sshd@12-95.216.139.29:22-139.178.68.195:55806.service: Deactivated successfully. Sep 12 17:45:29.488553 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 17:45:29.490164 systemd-logind[1540]: Session 13 logged out. Waiting for processes to exit. Sep 12 17:45:29.491744 systemd-logind[1540]: Removed session 13. Sep 12 17:45:29.646267 systemd[1]: Started sshd@13-95.216.139.29:22-139.178.68.195:55822.service - OpenSSH per-connection server daemon (139.178.68.195:55822). Sep 12 17:45:30.632879 sshd[5768]: Accepted publickey for core from 139.178.68.195 port 55822 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:30.634428 sshd-session[5768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:30.639241 systemd-logind[1540]: New session 14 of user core. Sep 12 17:45:30.644951 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 17:45:31.574710 sshd[5771]: Connection closed by 139.178.68.195 port 55822 Sep 12 17:45:31.578939 sshd-session[5768]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:31.592442 systemd[1]: sshd@13-95.216.139.29:22-139.178.68.195:55822.service: Deactivated successfully. Sep 12 17:45:31.595300 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 17:45:31.597983 systemd-logind[1540]: Session 14 logged out. Waiting for processes to exit. Sep 12 17:45:31.599758 systemd-logind[1540]: Removed session 14. Sep 12 17:45:31.742156 systemd[1]: Started sshd@14-95.216.139.29:22-139.178.68.195:54504.service - OpenSSH per-connection server daemon (139.178.68.195:54504). Sep 12 17:45:32.738275 sshd[5781]: Accepted publickey for core from 139.178.68.195 port 54504 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:32.739980 sshd-session[5781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:32.744741 systemd-logind[1540]: New session 15 of user core. Sep 12 17:45:32.750936 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 17:45:33.972033 sshd[5792]: Connection closed by 139.178.68.195 port 54504 Sep 12 17:45:33.972663 sshd-session[5781]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:33.978007 systemd-logind[1540]: Session 15 logged out. Waiting for processes to exit. Sep 12 17:45:33.979100 systemd[1]: sshd@14-95.216.139.29:22-139.178.68.195:54504.service: Deactivated successfully. Sep 12 17:45:33.984462 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 17:45:33.988396 systemd-logind[1540]: Removed session 15. Sep 12 17:45:34.140500 systemd[1]: Started sshd@15-95.216.139.29:22-139.178.68.195:54512.service - OpenSSH per-connection server daemon (139.178.68.195:54512). 
Sep 12 17:45:34.647043 containerd[1566]: time="2025-09-12T17:45:34.647001907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"9937e76fb3f035876700d083fcb908982c203a5747b83b5d03f95571b25b6b26\" pid:5824 exited_at:{seconds:1757699134 nanos:646559940}" Sep 12 17:45:35.131603 sshd[5809]: Accepted publickey for core from 139.178.68.195 port 54512 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:35.133271 sshd-session[5809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:35.138922 systemd-logind[1540]: New session 16 of user core. Sep 12 17:45:35.143004 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 17:45:36.470829 sshd[5835]: Connection closed by 139.178.68.195 port 54512 Sep 12 17:45:36.476626 sshd-session[5809]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:36.485418 systemd[1]: sshd@15-95.216.139.29:22-139.178.68.195:54512.service: Deactivated successfully. Sep 12 17:45:36.487283 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 17:45:36.488516 systemd-logind[1540]: Session 16 logged out. Waiting for processes to exit. Sep 12 17:45:36.491155 systemd-logind[1540]: Removed session 16. Sep 12 17:45:36.637800 systemd[1]: Started sshd@16-95.216.139.29:22-139.178.68.195:54526.service - OpenSSH per-connection server daemon (139.178.68.195:54526). Sep 12 17:45:37.650398 sshd[5845]: Accepted publickey for core from 139.178.68.195 port 54526 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:37.652524 sshd-session[5845]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:37.657258 systemd-logind[1540]: New session 17 of user core. Sep 12 17:45:37.664267 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 17:45:38.494741 sshd[5848]: Connection closed by 139.178.68.195 port 54526 Sep 12 17:45:38.496369 sshd-session[5845]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:38.500561 systemd-logind[1540]: Session 17 logged out. Waiting for processes to exit. Sep 12 17:45:38.501292 systemd[1]: sshd@16-95.216.139.29:22-139.178.68.195:54526.service: Deactivated successfully. Sep 12 17:45:38.503803 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 17:45:38.507610 systemd-logind[1540]: Removed session 17. Sep 12 17:45:41.056281 containerd[1566]: time="2025-09-12T17:45:41.056235473Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\" id:\"a17147346b3b8c4af1022315a12a99dee3e1263af56d793daefcabb1dd9c4f09\" pid:5873 exited_at:{seconds:1757699141 nanos:56070217}" Sep 12 17:45:43.663474 systemd[1]: Started sshd@17-95.216.139.29:22-139.178.68.195:47346.service - OpenSSH per-connection server daemon (139.178.68.195:47346). 
Sep 12 17:45:44.605075 containerd[1566]: time="2025-09-12T17:45:44.605028004Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cacb3aa2643a8577f1d7e3b8dbf5fa8b235b15675ccb3714ee73a71d81e846f6\" id:\"61a7fc89a54b9623950fa1f845f65eeaa755c0f4612d68fa49195427286ae669\" pid:5906 exited_at:{seconds:1757699144 nanos:604456493}" Sep 12 17:45:44.647055 sshd[5891]: Accepted publickey for core from 139.178.68.195 port 47346 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:44.648507 sshd-session[5891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:44.655064 systemd-logind[1540]: New session 18 of user core. Sep 12 17:45:44.659946 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 17:45:45.742011 sshd[5917]: Connection closed by 139.178.68.195 port 47346 Sep 12 17:45:45.743847 sshd-session[5891]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:45.749835 systemd[1]: sshd@17-95.216.139.29:22-139.178.68.195:47346.service: Deactivated successfully. Sep 12 17:45:45.752987 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 17:45:45.755021 systemd-logind[1540]: Session 18 logged out. Waiting for processes to exit. Sep 12 17:45:45.758499 systemd-logind[1540]: Removed session 18. Sep 12 17:45:50.947629 systemd[1]: Started sshd@18-95.216.139.29:22-139.178.68.195:53134.service - OpenSSH per-connection server daemon (139.178.68.195:53134). Sep 12 17:45:51.549418 containerd[1566]: time="2025-09-12T17:45:51.549378058Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d228cd4013ed4eb0bb02e1201899737743f8cf80c7e10123f5dc0a08c590f540\" id:\"c8d58476c3ea2bdd34af6a4d3111e8cee8bc68ffa4012c13572948baf7e4a0e5\" pid:5957 exited_at:{seconds:1757699151 nanos:536751362}" Sep 12 17:45:52.053866 sshd[5943]: Accepted publickey for core from 139.178.68.195 port 53134 ssh2: RSA SHA256:J+Fb4ToNUuR65qrI0AIJj8alHK8ZW8JjKLAg4TKn12I Sep 12 17:45:52.057159 sshd-session[5943]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 17:45:52.063157 systemd-logind[1540]: New session 19 of user core. Sep 12 17:45:52.067938 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 17:45:52.935231 sshd[5965]: Connection closed by 139.178.68.195 port 53134 Sep 12 17:45:52.935954 sshd-session[5943]: pam_unix(sshd:session): session closed for user core Sep 12 17:45:52.940262 systemd-logind[1540]: Session 19 logged out. Waiting for processes to exit. Sep 12 17:45:52.940372 systemd[1]: sshd@18-95.216.139.29:22-139.178.68.195:53134.service: Deactivated successfully. Sep 12 17:45:52.942097 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 17:45:52.943566 systemd-logind[1540]: Removed session 19. Sep 12 17:46:04.517109 containerd[1566]: time="2025-09-12T17:46:04.517038896Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a0d17556685e80ab0caae687541510c429f9fe30a634d23fa7f4988b1ac7efb\" id:\"b238f5e89068d01ab1eadc2345f9c32d42baf58c54da168b2c2155a8388376f2\" pid:5990 exited_at:{seconds:1757699164 nanos:516710422}" Sep 12 17:46:09.334700 systemd[1]: cri-containerd-225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2.scope: Deactivated successfully. Sep 12 17:46:09.335745 systemd[1]: cri-containerd-225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2.scope: Consumed 2.905s CPU time, 86.7M memory peak, 93.5M read from disk. 
Sep 12 17:46:09.453868 containerd[1566]: time="2025-09-12T17:46:09.452786673Z" level=info msg="received exit event container_id:\"225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2\" id:\"225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2\" pid:2603 exit_status:1 exited_at:{seconds:1757699169 nanos:416084816}" Sep 12 17:46:09.458202 containerd[1566]: time="2025-09-12T17:46:09.458140822Z" level=info msg="TaskExit event in podsandbox handler container_id:\"225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2\" id:\"225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2\" pid:2603 exit_status:1 exited_at:{seconds:1757699169 nanos:416084816}" Sep 12 17:46:09.580238 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2-rootfs.mount: Deactivated successfully. Sep 12 17:46:09.609431 systemd[1]: cri-containerd-88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85.scope: Deactivated successfully. Sep 12 17:46:09.610623 systemd[1]: cri-containerd-88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85.scope: Consumed 1.685s CPU time, 42.1M memory peak, 53.1M read from disk. Sep 12 17:46:09.621436 containerd[1566]: time="2025-09-12T17:46:09.621202022Z" level=info msg="received exit event container_id:\"88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85\" id:\"88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85\" pid:2624 exit_status:1 exited_at:{seconds:1757699169 nanos:610593812}" Sep 12 17:46:09.621436 containerd[1566]: time="2025-09-12T17:46:09.621351804Z" level=info msg="TaskExit event in podsandbox handler container_id:\"88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85\" id:\"88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85\" pid:2624 exit_status:1 exited_at:{seconds:1757699169 nanos:610593812}" Sep 12 17:46:09.632954 systemd[1]: cri-containerd-521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9.scope: Deactivated successfully. Sep 12 17:46:09.633224 systemd[1]: cri-containerd-521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9.scope: Consumed 13.874s CPU time, 108.4M memory peak, 65.5M read from disk. Sep 12 17:46:09.637584 containerd[1566]: time="2025-09-12T17:46:09.637544646Z" level=info msg="received exit event container_id:\"521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9\" id:\"521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9\" pid:3113 exit_status:1 exited_at:{seconds:1757699169 nanos:637243531}" Sep 12 17:46:09.639238 containerd[1566]: time="2025-09-12T17:46:09.639182563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9\" id:\"521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9\" pid:3113 exit_status:1 exited_at:{seconds:1757699169 nanos:637243531}" Sep 12 17:46:09.658336 kubelet[2767]: E0912 17:46:09.656656 2767 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:37696->10.0.0.2:2379: read: connection timed out" Sep 12 17:46:09.680740 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9-rootfs.mount: Deactivated successfully. 
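When the cri-containerd-*.scope units above are stopped, systemd reports what each container had consumed (CPU time, memory peak, bytes read from disk) just before the kubelet's lease renewal starts failing against etcd at 10.0.0.2:2379. Here is a minimal sketch for tabulating that accounting from the scope-stop lines, under the same one-entry-per-line assumption; the size suffixes are kept as printed rather than normalised.

    # Tabulate the accounting systemd prints when a cri-containerd scope stops:
    # "Consumed <cpu>s CPU time, <mem> memory peak, <disk> read from disk".
    import re
    import sys

    ACCT = re.compile(
        r'systemd\[1\]: cri-containerd-(?P<cid>[0-9a-f]+)\.scope: '
        r'Consumed (?P<cpu>[\d.]+)s CPU time, (?P<mem>[\d.]+(?:[KMGT])?) memory peak'
        r'(?:, (?P<disk>[\d.]+(?:[KMGT])?) read from disk)?')

    for line in sys.stdin:
        if m := ACCT.search(line):
            print(f"{m['cid'][:12]}  cpu={m['cpu']}s  mem_peak={m['mem']}  read={m['disk'] or '-'}")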
Sep 12 17:46:09.680845 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85-rootfs.mount: Deactivated successfully. Sep 12 17:46:10.056232 kubelet[2767]: I0912 17:46:10.056190 2767 scope.go:117] "RemoveContainer" containerID="225b7989fdc315b06336d2595158198d26b0cfe082225ab67ddaa801770e4fc2" Sep 12 17:46:10.071880 kubelet[2767]: I0912 17:46:10.071512 2767 scope.go:117] "RemoveContainer" containerID="88f53f2733ecce4de711d8d1526c099aa1278a740b2121cc62ace0553fd99a85" Sep 12 17:46:10.104666 kubelet[2767]: I0912 17:46:10.103983 2767 scope.go:117] "RemoveContainer" containerID="521ab8a53ec8d2e0b631687b023851e70d144f1ddca79c7600a7afe6ff5ed0b9" Sep 12 17:46:10.117348 containerd[1566]: time="2025-09-12T17:46:10.117303656Z" level=info msg="CreateContainer within sandbox \"33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 12 17:46:10.117721 containerd[1566]: time="2025-09-12T17:46:10.117303887Z" level=info msg="CreateContainer within sandbox \"d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 12 17:46:10.145548 containerd[1566]: time="2025-09-12T17:46:10.117478984Z" level=info msg="CreateContainer within sandbox \"71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 12 17:46:10.263199 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount848105106.mount: Deactivated successfully. Sep 12 17:46:10.263283 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1211911752.mount: Deactivated successfully. Sep 12 17:46:10.265252 containerd[1566]: time="2025-09-12T17:46:10.265077877Z" level=info msg="Container 40b418d7c1debe3e2b5231e6837e74b4765b57a11ed3828f62e347f7fc03626d: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:46:10.265252 containerd[1566]: time="2025-09-12T17:46:10.265196119Z" level=info msg="Container 2dc08a6e9754d311c22e047c6e2d47003881c353bd459c3827981719e9e4cfc9: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:46:10.270184 containerd[1566]: time="2025-09-12T17:46:10.269435751Z" level=info msg="Container b40e8797446f4c4ce4614ab5cede09c7362bb26cbdc12206e5ba6601f086cd77: CDI devices from CRI Config.CDIDevices: []" Sep 12 17:46:10.283126 containerd[1566]: time="2025-09-12T17:46:10.283094742Z" level=info msg="CreateContainer within sandbox \"33f87d6e7e7024d58e17f88ba976658881d68e5eb184dd00746fedc8df9460ce\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"b40e8797446f4c4ce4614ab5cede09c7362bb26cbdc12206e5ba6601f086cd77\"" Sep 12 17:46:10.284683 containerd[1566]: time="2025-09-12T17:46:10.283892347Z" level=info msg="CreateContainer within sandbox \"71560c80bf7e578dfdb218803e2026d28e514362580bf7aaeb12aebdaf28ebab\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"2dc08a6e9754d311c22e047c6e2d47003881c353bd459c3827981719e9e4cfc9\"" Sep 12 17:46:10.285559 containerd[1566]: time="2025-09-12T17:46:10.285144322Z" level=info msg="StartContainer for \"2dc08a6e9754d311c22e047c6e2d47003881c353bd459c3827981719e9e4cfc9\"" Sep 12 17:46:10.286509 containerd[1566]: time="2025-09-12T17:46:10.286071649Z" level=info msg="connecting to shim 2dc08a6e9754d311c22e047c6e2d47003881c353bd459c3827981719e9e4cfc9" 
address="unix:///run/containerd/s/e9100432818e646424806e69d4eabec5126b8f058f2d7c1c14c44433cf4ef6d1" protocol=ttrpc version=3 Sep 12 17:46:10.287326 containerd[1566]: time="2025-09-12T17:46:10.287304990Z" level=info msg="CreateContainer within sandbox \"d749c97ada9d32cf1e819aa2a748a4b72652d2ba41ebb6a2fa8bdf76716aadc5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"40b418d7c1debe3e2b5231e6837e74b4765b57a11ed3828f62e347f7fc03626d\"" Sep 12 17:46:10.288479 containerd[1566]: time="2025-09-12T17:46:10.288458541Z" level=info msg="StartContainer for \"b40e8797446f4c4ce4614ab5cede09c7362bb26cbdc12206e5ba6601f086cd77\"" Sep 12 17:46:10.289457 containerd[1566]: time="2025-09-12T17:46:10.289435642Z" level=info msg="connecting to shim b40e8797446f4c4ce4614ab5cede09c7362bb26cbdc12206e5ba6601f086cd77" address="unix:///run/containerd/s/e4e4f4e8503b732df4820db8cd37d04080d2a12016dd50f5843ed190a103c710" protocol=ttrpc version=3 Sep 12 17:46:10.292064 containerd[1566]: time="2025-09-12T17:46:10.292028419Z" level=info msg="StartContainer for \"40b418d7c1debe3e2b5231e6837e74b4765b57a11ed3828f62e347f7fc03626d\"" Sep 12 17:46:10.294601 containerd[1566]: time="2025-09-12T17:46:10.294560093Z" level=info msg="connecting to shim 40b418d7c1debe3e2b5231e6837e74b4765b57a11ed3828f62e347f7fc03626d" address="unix:///run/containerd/s/3b94ea257c106b0d94ec53d6deff60afb7805199cf5c97401c8578283753c782" protocol=ttrpc version=3 Sep 12 17:46:10.351029 systemd[1]: Started cri-containerd-b40e8797446f4c4ce4614ab5cede09c7362bb26cbdc12206e5ba6601f086cd77.scope - libcontainer container b40e8797446f4c4ce4614ab5cede09c7362bb26cbdc12206e5ba6601f086cd77. Sep 12 17:46:10.371963 systemd[1]: Started cri-containerd-40b418d7c1debe3e2b5231e6837e74b4765b57a11ed3828f62e347f7fc03626d.scope - libcontainer container 40b418d7c1debe3e2b5231e6837e74b4765b57a11ed3828f62e347f7fc03626d. Sep 12 17:46:10.383619 systemd[1]: Started cri-containerd-2dc08a6e9754d311c22e047c6e2d47003881c353bd459c3827981719e9e4cfc9.scope - libcontainer container 2dc08a6e9754d311c22e047c6e2d47003881c353bd459c3827981719e9e4cfc9. 
Sep 12 17:46:10.440262 containerd[1566]: time="2025-09-12T17:46:10.440225642Z" level=info msg="StartContainer for \"b40e8797446f4c4ce4614ab5cede09c7362bb26cbdc12206e5ba6601f086cd77\" returns successfully" Sep 12 17:46:10.465887 containerd[1566]: time="2025-09-12T17:46:10.465774288Z" level=info msg="StartContainer for \"2dc08a6e9754d311c22e047c6e2d47003881c353bd459c3827981719e9e4cfc9\" returns successfully" Sep 12 17:46:10.507692 containerd[1566]: time="2025-09-12T17:46:10.507656366Z" level=info msg="StartContainer for \"40b418d7c1debe3e2b5231e6837e74b4765b57a11ed3828f62e347f7fc03626d\" returns successfully" Sep 12 17:46:14.290626 kubelet[2767]: E0912 17:46:14.272854 2767 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:37474->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4426-1-0-d-1f6ac31256.18649a13a383ca4d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4426-1-0-d-1f6ac31256,UID:40343e56910b922452cd39e383691eaf,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4426-1-0-d-1f6ac31256,},FirstTimestamp:2025-09-12 17:46:03.719715405 +0000 UTC m=+160.704198635,LastTimestamp:2025-09-12 17:46:03.719715405 +0000 UTC m=+160.704198635,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426-1-0-d-1f6ac31256,}"
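The rejected event carries the same failure signature as the earlier lease error: the kubelet cannot read from etcd at 10.0.0.2:2379, the read times out, and the apiserver readiness-probe event is dropped rather than retried. A minimal sketch, under the same one-entry-per-line assumption and with an illustrative regex, that counts those kubelet-side etcd timeouts by message type and remote endpoint to confirm they point at the same backend:

    # Count kubelet-side etcd read timeouts by message type and remote endpoint.
    import re
    import sys
    from collections import Counter

    ETCD_ERR = re.compile(
        r'kubelet\[\d+\]: .*"(?P<what>Failed to update lease|Server rejected event \(will not retry!\))"'
        r'.*read tcp (?P<local>[\d.]+:\d+)->(?P<remote>[\d.]+:\d+): read: connection timed out')

    hits = Counter()
    for line in sys.stdin:
        if m := ETCD_ERR.search(line):
            hits[(m['what'], m['remote'])] += 1

    for (what, remote), n in sorted(hits.items()):
        print(f"{n}x {what} -> {remote}")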