Sep 12 22:59:49.907895 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Sep 12 20:38:35 -00 2025
Sep 12 22:59:49.907923 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 22:59:49.907931 kernel: BIOS-provided physical RAM map:
Sep 12 22:59:49.907937 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 12 22:59:49.907943 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 12 22:59:49.907949 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 12 22:59:49.907957 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Sep 12 22:59:49.907963 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Sep 12 22:59:49.907975 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 12 22:59:49.907981 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 12 22:59:49.907987 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 12 22:59:49.907993 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 12 22:59:49.907999 kernel: NX (Execute Disable) protection: active
Sep 12 22:59:49.908005 kernel: APIC: Static calls initialized
Sep 12 22:59:49.908013 kernel: SMBIOS 2.8 present.
Sep 12 22:59:49.908020 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Sep 12 22:59:49.908027 kernel: DMI: Memory slots populated: 1/1
Sep 12 22:59:49.908033 kernel: Hypervisor detected: KVM
Sep 12 22:59:49.908050 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 12 22:59:49.908057 kernel: kvm-clock: using sched offset of 4320968908 cycles
Sep 12 22:59:49.908071 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 12 22:59:49.908086 kernel: tsc: Detected 2495.312 MHz processor
Sep 12 22:59:49.908095 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 12 22:59:49.908102 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 12 22:59:49.908109 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Sep 12 22:59:49.908116 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 12 22:59:49.908123 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 12 22:59:49.908129 kernel: Using GB pages for direct mapping
Sep 12 22:59:49.908136 kernel: ACPI: Early table checksum verification disabled
Sep 12 22:59:49.908142 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Sep 12 22:59:49.908149 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:59:49.908157 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:59:49.908164 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:59:49.908170 kernel: ACPI: FACS 0x000000007CFE0000 000040
Sep 12 22:59:49.908177 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:59:49.908183 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:59:49.908190 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:59:49.908197 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:59:49.908203 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Sep 12 22:59:49.908210 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Sep 12 22:59:49.908220 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Sep 12 22:59:49.908226 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Sep 12 22:59:49.908233 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Sep 12 22:59:49.908240 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Sep 12 22:59:49.908246 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Sep 12 22:59:49.908254 kernel: No NUMA configuration found
Sep 12 22:59:49.908261 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Sep 12 22:59:49.908267 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Sep 12 22:59:49.908274 kernel: Zone ranges:
Sep 12 22:59:49.908281 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 12 22:59:49.908288 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Sep 12 22:59:49.908294 kernel: Normal empty
Sep 12 22:59:49.908301 kernel: Device empty
Sep 12 22:59:49.908308 kernel: Movable zone start for each node
Sep 12 22:59:49.908314 kernel: Early memory node ranges
Sep 12 22:59:49.908322 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 12 22:59:49.908329 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Sep 12 22:59:49.908336 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Sep 12 22:59:49.908342 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 12 22:59:49.908349 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 12 22:59:49.908356 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 12 22:59:49.908362 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 12 22:59:49.908369 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 12 22:59:49.908376 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 12 22:59:49.908384 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 12 22:59:49.908391 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 12 22:59:49.908398 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 12 22:59:49.908404 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 12 22:59:49.908411 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 12 22:59:49.908418 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 12 22:59:49.908425 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 12 22:59:49.908431 kernel: CPU topo: Max. logical packages: 1
Sep 12 22:59:49.908438 kernel: CPU topo: Max. logical dies: 1
Sep 12 22:59:49.908446 kernel: CPU topo: Max. dies per package: 1
Sep 12 22:59:49.908466 kernel: CPU topo: Max. threads per core: 1
Sep 12 22:59:49.908474 kernel: CPU topo: Num. cores per package: 2
Sep 12 22:59:49.908483 kernel: CPU topo: Num. threads per package: 2
Sep 12 22:59:49.908491 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 12 22:59:49.908500 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 12 22:59:49.908508 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 12 22:59:49.908517 kernel: Booting paravirtualized kernel on KVM
Sep 12 22:59:49.908527 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 12 22:59:49.908535 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 12 22:59:49.908546 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 12 22:59:49.908553 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 12 22:59:49.908560 kernel: pcpu-alloc: [0] 0 1
Sep 12 22:59:49.908566 kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 12 22:59:49.908574 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40
Sep 12 22:59:49.908582 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 22:59:49.908589 kernel: random: crng init done
Sep 12 22:59:49.908596 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 22:59:49.908604 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 12 22:59:49.908611 kernel: Fallback order for Node 0: 0
Sep 12 22:59:49.908618 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Sep 12 22:59:49.908624 kernel: Policy zone: DMA32
Sep 12 22:59:49.908631 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 22:59:49.908647 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 22:59:49.908662 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 12 22:59:49.908669 kernel: ftrace: allocated 157 pages with 5 groups
Sep 12 22:59:49.908676 kernel: Dynamic Preempt: voluntary
Sep 12 22:59:49.908684 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 22:59:49.908695 kernel: rcu: RCU event tracing is enabled.
Sep 12 22:59:49.908703 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 22:59:49.908709 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 22:59:49.908716 kernel: Rude variant of Tasks RCU enabled.
Sep 12 22:59:49.908723 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 22:59:49.908730 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 22:59:49.908737 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 22:59:49.908743 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:59:49.908752 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:59:49.908759 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:59:49.908766 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 12 22:59:49.908772 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 22:59:49.908779 kernel: Console: colour VGA+ 80x25
Sep 12 22:59:49.908786 kernel: printk: legacy console [tty0] enabled
Sep 12 22:59:49.908792 kernel: printk: legacy console [ttyS0] enabled
Sep 12 22:59:49.908799 kernel: ACPI: Core revision 20240827
Sep 12 22:59:49.908806 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 12 22:59:49.908818 kernel: APIC: Switch to symmetric I/O mode setup
Sep 12 22:59:49.908825 kernel: x2apic enabled
Sep 12 22:59:49.908832 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 12 22:59:49.908840 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 12 22:59:49.908847 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x23f7ed49df2, max_idle_ns: 440795247253 ns
Sep 12 22:59:49.908855 kernel: Calibrating delay loop (skipped) preset value.. 4990.62 BogoMIPS (lpj=2495312)
Sep 12 22:59:49.908862 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 12 22:59:49.908869 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 12 22:59:49.908876 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 12 22:59:49.908884 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 12 22:59:49.908891 kernel: Spectre V2 : Mitigation: Retpolines
Sep 12 22:59:49.908899 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 12 22:59:49.908906 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 12 22:59:49.908913 kernel: active return thunk: retbleed_return_thunk
Sep 12 22:59:49.908920 kernel: RETBleed: Mitigation: untrained return thunk
Sep 12 22:59:49.908927 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 12 22:59:49.908935 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 12 22:59:49.908942 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 12 22:59:49.908950 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 12 22:59:49.908957 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 12 22:59:49.908964 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 12 22:59:49.908971 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 12 22:59:49.908978 kernel: Freeing SMP alternatives memory: 32K
Sep 12 22:59:49.908985 kernel: pid_max: default: 32768 minimum: 301
Sep 12 22:59:49.908992 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 22:59:49.909000 kernel: landlock: Up and running.
Sep 12 22:59:49.909007 kernel: SELinux: Initializing.
Sep 12 22:59:49.909014 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 22:59:49.909021 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 12 22:59:49.909028 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 12 22:59:49.909036 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 12 22:59:49.911073 kernel: ... version:                0
Sep 12 22:59:49.911083 kernel: ... bit width:              48
Sep 12 22:59:49.911091 kernel: ... generic registers:      6
Sep 12 22:59:49.911101 kernel: ... value mask:             0000ffffffffffff
Sep 12 22:59:49.911108 kernel: ... max period:             00007fffffffffff
Sep 12 22:59:49.911115 kernel: ... fixed-purpose events:   0
Sep 12 22:59:49.911122 kernel: ... event mask:             000000000000003f
Sep 12 22:59:49.911130 kernel: signal: max sigframe size: 1776
Sep 12 22:59:49.911137 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 22:59:49.911144 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 22:59:49.911152 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 22:59:49.911159 kernel: smp: Bringing up secondary CPUs ...
Sep 12 22:59:49.911168 kernel: smpboot: x86: Booting SMP configuration:
Sep 12 22:59:49.911175 kernel: .... node #0, CPUs: #1
Sep 12 22:59:49.911182 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 22:59:49.911189 kernel: smpboot: Total of 2 processors activated (9981.24 BogoMIPS)
Sep 12 22:59:49.911197 kernel: Memory: 1917788K/2047464K available (14336K kernel code, 2432K rwdata, 9992K rodata, 54084K init, 2880K bss, 125140K reserved, 0K cma-reserved)
Sep 12 22:59:49.911205 kernel: devtmpfs: initialized
Sep 12 22:59:49.911212 kernel: x86/mm: Memory block size: 128MB
Sep 12 22:59:49.911220 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 22:59:49.911227 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 22:59:49.911236 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 22:59:49.911243 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 22:59:49.911250 kernel: audit: initializing netlink subsys (disabled)
Sep 12 22:59:49.911257 kernel: audit: type=2000 audit(1757717986.453:1): state=initialized audit_enabled=0 res=1
Sep 12 22:59:49.911264 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 22:59:49.911272 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 12 22:59:49.911279 kernel: cpuidle: using governor menu
Sep 12 22:59:49.911286 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 22:59:49.911293 kernel: dca service started, version 1.12.1
Sep 12 22:59:49.911302 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 12 22:59:49.911309 kernel: PCI: Using configuration type 1 for base access
Sep 12 22:59:49.911317 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 12 22:59:49.911324 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 22:59:49.911331 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 22:59:49.911339 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 22:59:49.911346 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 22:59:49.911353 kernel: ACPI: Added _OSI(Module Device)
Sep 12 22:59:49.911361 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 22:59:49.911369 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 22:59:49.911376 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 22:59:49.911384 kernel: ACPI: Interpreter enabled
Sep 12 22:59:49.911391 kernel: ACPI: PM: (supports S0 S5)
Sep 12 22:59:49.911398 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 12 22:59:49.911405 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 12 22:59:49.911413 kernel: PCI: Using E820 reservations for host bridge windows
Sep 12 22:59:49.911420 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 12 22:59:49.911427 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 22:59:49.911562 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 22:59:49.911634 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 12 22:59:49.911698 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 12 22:59:49.911707 kernel: PCI host bridge to bus 0000:00
Sep 12 22:59:49.911781 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 12 22:59:49.911840 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 12 22:59:49.911901 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 12 22:59:49.911959 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Sep 12 22:59:49.912015 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 12 22:59:49.912097 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 12 22:59:49.912156 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 22:59:49.912236 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 12 22:59:49.912314 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 12 22:59:49.912385 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Sep 12 22:59:49.912461 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Sep 12 22:59:49.912528 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Sep 12 22:59:49.912594 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Sep 12 22:59:49.912659 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 12 22:59:49.912734 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.912800 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Sep 12 22:59:49.912870 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 22:59:49.912936 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 12 22:59:49.913004 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 22:59:49.914129 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.914205 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Sep 12 22:59:49.914272 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 22:59:49.914342 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 12 22:59:49.914408 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 22:59:49.914494 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.914562 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Sep 12 22:59:49.914628 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 22:59:49.914693 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 12 22:59:49.914758 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 22:59:49.914830 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.914899 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Sep 12 22:59:49.914965 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 22:59:49.915034 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 12 22:59:49.915680 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 22:59:49.915764 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.915832 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Sep 12 22:59:49.915905 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 22:59:49.915983 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 12 22:59:49.916734 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 22:59:49.916821 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.916891 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Sep 12 22:59:49.916957 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 22:59:49.917023 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 12 22:59:49.918126 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 22:59:49.918208 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.918275 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Sep 12 22:59:49.918340 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 22:59:49.918412 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 12 22:59:49.918563 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 22:59:49.918678 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.918771 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Sep 12 22:59:49.918857 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 22:59:49.918938 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 12 22:59:49.919026 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 22:59:49.921169 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:59:49.921267 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Sep 12 22:59:49.921366 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 22:59:49.921476 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 12 22:59:49.921573 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 22:59:49.921680 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 12 22:59:49.921774 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 12 22:59:49.921883 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 12 22:59:49.921975 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Sep 12 22:59:49.922096 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Sep 12 22:59:49.922215 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 12 22:59:49.922310 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 12 22:59:49.922391 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 12 22:59:49.922483 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Sep 12 22:59:49.922587 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 12 22:59:49.922688 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Sep 12 22:59:49.922793 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 22:59:49.922909 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 12 22:59:49.923010 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Sep 12 22:59:49.923127 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 22:59:49.923238 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 12 22:59:49.923332 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Sep 12 22:59:49.923427 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Sep 12 22:59:49.923544 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 22:59:49.923657 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 12 22:59:49.923760 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 12 22:59:49.923855 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 22:59:49.923975 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 12 22:59:49.925787 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Sep 12 22:59:49.925889 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 22:59:49.925999 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 12 22:59:49.926126 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Sep 12 22:59:49.926218 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Sep 12 22:59:49.926305 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 22:59:49.926319 kernel: acpiphp: Slot [0] registered
Sep 12 22:59:49.926421 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 12 22:59:49.926547 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Sep 12 22:59:49.926654 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Sep 12 22:59:49.926799 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Sep 12 22:59:49.926891 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 22:59:49.926906 kernel: acpiphp: Slot [0-2] registered
Sep 12 22:59:49.926996 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 22:59:49.927011 kernel: acpiphp: Slot [0-3] registered
Sep 12 22:59:49.927126 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 22:59:49.927146 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 12 22:59:49.927157 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 12 22:59:49.927166 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 12 22:59:49.927176 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 12 22:59:49.927185 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 12 22:59:49.927194 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 12 22:59:49.927203 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 12 22:59:49.927213 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 12 22:59:49.927222 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 12 22:59:49.927233 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 12 22:59:49.927242 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 12 22:59:49.927251 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 12 22:59:49.927261 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 12 22:59:49.927270 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 12 22:59:49.927279 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 12 22:59:49.927289 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 12 22:59:49.927298 kernel: iommu: Default domain type: Translated
Sep 12 22:59:49.927307 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 12 22:59:49.927317 kernel: PCI: Using ACPI for IRQ routing
Sep 12 22:59:49.927327 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 12 22:59:49.927336 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 12 22:59:49.927345 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Sep 12 22:59:49.927440 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 12 22:59:49.927557 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 12 22:59:49.927654 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 12 22:59:49.927669 kernel: vgaarb: loaded
Sep 12 22:59:49.927680 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 12 22:59:49.927695 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 12 22:59:49.927705 kernel: clocksource: Switched to clocksource kvm-clock
Sep 12 22:59:49.927715 kernel: VFS: Disk quotas dquot_6.6.0
Sep 12 22:59:49.927726 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 12 22:59:49.927736 kernel: pnp: PnP ACPI init
Sep 12 22:59:49.927892 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 12 22:59:49.927907 kernel: pnp: PnP ACPI: found 5 devices
Sep 12 22:59:49.927915 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 12 22:59:49.927926 kernel: NET: Registered PF_INET protocol family
Sep 12 22:59:49.927933 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 12 22:59:49.927941 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 12 22:59:49.927949 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 12 22:59:49.927956 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 12 22:59:49.927964 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 12 22:59:49.927974 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 12 22:59:49.927985 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 12 22:59:49.927995 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 12 22:59:49.928006 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 12 22:59:49.928016 kernel: NET: Registered PF_XDP protocol family
Sep 12 22:59:49.930361 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 12 22:59:49.930445 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 12 22:59:49.930559 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 12 22:59:49.930642 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Sep 12 22:59:49.930713 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Sep 12 22:59:49.930782 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Sep 12 22:59:49.930859 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 22:59:49.930942 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 12 22:59:49.931016 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 22:59:49.931110 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 22:59:49.931182 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 12 22:59:49.931252 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 22:59:49.931327 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 22:59:49.931396 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 12 22:59:49.931497 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 22:59:49.931600 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 22:59:49.934499 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 12 22:59:49.934613 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 22:59:49.934716 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 22:59:49.934790 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 12 22:59:49.934895 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 22:59:49.934988 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 22:59:49.935421 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 12 22:59:49.935548 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 22:59:49.935647 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 22:59:49.935728 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Sep 12 22:59:49.935800 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 12 22:59:49.935874 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 22:59:49.935948 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 22:59:49.936018 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Sep 12 22:59:49.937157 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 12 22:59:49.937236 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 22:59:49.937306 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 22:59:49.937375 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Sep 12 22:59:49.937470 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 12 22:59:49.937571 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 22:59:49.937675 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 12 22:59:49.937767 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 12 22:59:49.937845 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 12 22:59:49.937911 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Sep 12 22:59:49.937984 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 12 22:59:49.939129 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 12 22:59:49.939214 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 12 22:59:49.939278 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 12 22:59:49.939356 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 12 22:59:49.939420 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 12 22:59:49.939529 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 12 22:59:49.939617 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 12 22:59:49.939711 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 12 22:59:49.939777 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 12 22:59:49.939874 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 12 22:59:49.939940 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 12 22:59:49.940008 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 12 22:59:49.940125 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 12 22:59:49.940228 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Sep 12 22:59:49.940321 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 12 22:59:49.940427 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 12 22:59:49.940547 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Sep 12 22:59:49.940633 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Sep 12 22:59:49.940697 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 12 22:59:49.940782 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Sep 12 22:59:49.940868 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 12 22:59:49.940955 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 12 22:59:49.940976 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 12 22:59:49.940987 kernel: PCI: CLS
0 bytes, default 64 Sep 12 22:59:49.940998 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x23f7ed49df2, max_idle_ns: 440795247253 ns Sep 12 22:59:49.941009 kernel: Initialise system trusted keyrings Sep 12 22:59:49.941021 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 12 22:59:49.941032 kernel: Key type asymmetric registered Sep 12 22:59:49.941078 kernel: Asymmetric key parser 'x509' registered Sep 12 22:59:49.941086 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 12 22:59:49.941094 kernel: io scheduler mq-deadline registered Sep 12 22:59:49.941104 kernel: io scheduler kyber registered Sep 12 22:59:49.941113 kernel: io scheduler bfq registered Sep 12 22:59:49.941217 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 12 22:59:49.941312 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 12 22:59:49.941400 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 12 22:59:49.941499 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 12 22:59:49.941597 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 12 22:59:49.941688 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 12 22:59:49.941777 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 12 22:59:49.941864 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Sep 12 22:59:49.941949 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 12 22:59:49.942032 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 12 22:59:49.942165 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 12 22:59:49.942245 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 12 22:59:49.942316 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 12 22:59:49.942385 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 12 22:59:49.942494 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 12 22:59:49.942594 kernel: pcieport 0000:00:02.7: AER: enabled with 
IRQ 31 Sep 12 22:59:49.942607 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 12 22:59:49.942678 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Sep 12 22:59:49.942748 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Sep 12 22:59:49.942760 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 12 22:59:49.942771 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Sep 12 22:59:49.942779 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 22:59:49.942787 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 12 22:59:49.942795 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 12 22:59:49.942804 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 12 22:59:49.942811 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 12 22:59:49.942819 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 12 22:59:49.942896 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 12 22:59:49.942965 kernel: rtc_cmos 00:03: registered as rtc0 Sep 12 22:59:49.943037 kernel: rtc_cmos 00:03: setting system clock to 2025-09-12T22:59:49 UTC (1757717989) Sep 12 22:59:49.943119 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Sep 12 22:59:49.943130 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 12 22:59:49.943138 kernel: NET: Registered PF_INET6 protocol family Sep 12 22:59:49.943146 kernel: Segment Routing with IPv6 Sep 12 22:59:49.943153 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 22:59:49.943161 kernel: NET: Registered PF_PACKET protocol family Sep 12 22:59:49.943169 kernel: Key type dns_resolver registered Sep 12 22:59:49.943179 kernel: IPI shorthand broadcast: enabled Sep 12 22:59:49.943187 kernel: sched_clock: Marking stable (3333012576, 158865345)->(3497073703, -5195782) Sep 12 22:59:49.943195 kernel: registered taskstats version 1 Sep 12 22:59:49.943203 kernel: Loading compiled-in X.509 
certificates Sep 12 22:59:49.943211 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: c3297a5801573420030c321362a802da1fd49c4e' Sep 12 22:59:49.943219 kernel: Demotion targets for Node 0: null Sep 12 22:59:49.943227 kernel: Key type .fscrypt registered Sep 12 22:59:49.943234 kernel: Key type fscrypt-provisioning registered Sep 12 22:59:49.943242 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 22:59:49.943252 kernel: ima: Allocated hash algorithm: sha1 Sep 12 22:59:49.943260 kernel: ima: No architecture policies found Sep 12 22:59:49.943267 kernel: clk: Disabling unused clocks Sep 12 22:59:49.943275 kernel: Warning: unable to open an initial console. Sep 12 22:59:49.943283 kernel: Freeing unused kernel image (initmem) memory: 54084K Sep 12 22:59:49.943291 kernel: Write protecting the kernel read-only data: 24576k Sep 12 22:59:49.943298 kernel: Freeing unused kernel image (rodata/data gap) memory: 248K Sep 12 22:59:49.943306 kernel: Run /init as init process Sep 12 22:59:49.943315 kernel: with arguments: Sep 12 22:59:49.943323 kernel: /init Sep 12 22:59:49.943330 kernel: with environment: Sep 12 22:59:49.943338 kernel: HOME=/ Sep 12 22:59:49.943345 kernel: TERM=linux Sep 12 22:59:49.943353 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 22:59:49.943361 systemd[1]: Successfully made /usr/ read-only. Sep 12 22:59:49.943373 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:59:49.943383 systemd[1]: Detected virtualization kvm. Sep 12 22:59:49.943391 systemd[1]: Detected architecture x86-64. Sep 12 22:59:49.943399 systemd[1]: Running in initrd. 
Sep 12 22:59:49.943406 systemd[1]: No hostname configured, using default hostname. Sep 12 22:59:49.943417 systemd[1]: Hostname set to . Sep 12 22:59:49.943428 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:59:49.943440 systemd[1]: Queued start job for default target initrd.target. Sep 12 22:59:49.943461 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:59:49.943476 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:59:49.943489 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 12 22:59:49.943500 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 12 22:59:49.943511 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 12 22:59:49.943523 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 12 22:59:49.943535 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 12 22:59:49.943547 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 12 22:59:49.943561 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:59:49.943573 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:59:49.943583 systemd[1]: Reached target paths.target - Path Units. Sep 12 22:59:49.943594 systemd[1]: Reached target slices.target - Slice Units. Sep 12 22:59:49.943603 systemd[1]: Reached target swap.target - Swaps. Sep 12 22:59:49.943611 systemd[1]: Reached target timers.target - Timer Units. Sep 12 22:59:49.943620 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Sep 12 22:59:49.943628 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:59:49.943638 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 12 22:59:49.943646 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 12 22:59:49.943655 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:59:49.943663 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 12 22:59:49.943672 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:59:49.943680 systemd[1]: Reached target sockets.target - Socket Units. Sep 12 22:59:49.943689 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 12 22:59:49.943697 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 12 22:59:49.943706 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 12 22:59:49.943715 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 12 22:59:49.943723 systemd[1]: Starting systemd-fsck-usr.service... Sep 12 22:59:49.943732 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 12 22:59:49.943740 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 12 22:59:49.943748 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:49.943756 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 12 22:59:49.943766 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:59:49.943775 systemd[1]: Finished systemd-fsck-usr.service. Sep 12 22:59:49.943805 systemd-journald[215]: Collecting audit messages is disabled. 
Sep 12 22:59:49.943830 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 12 22:59:49.943839 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 12 22:59:49.943847 kernel: Bridge firewalling registered Sep 12 22:59:49.943855 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 12 22:59:49.943864 systemd-journald[215]: Journal started Sep 12 22:59:49.943884 systemd-journald[215]: Runtime Journal (/run/log/journal/52f104f02c4c4e1e85b0642dd438c206) is 4.8M, max 38.6M, 33.7M free. Sep 12 22:59:49.906808 systemd-modules-load[217]: Inserted module 'overlay' Sep 12 22:59:49.934975 systemd-modules-load[217]: Inserted module 'br_netfilter' Sep 12 22:59:49.976967 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:49.978362 systemd[1]: Started systemd-journald.service - Journal Service. Sep 12 22:59:49.977792 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 12 22:59:49.982269 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 12 22:59:49.985186 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 12 22:59:49.994147 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 12 22:59:50.000985 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 12 22:59:50.009098 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:59:50.011013 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:59:50.020167 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Sep 12 22:59:50.022519 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:59:50.025465 systemd-tmpfiles[239]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 12 22:59:50.031409 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:59:50.035144 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 12 22:59:50.043436 dracut-cmdline[251]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=8e60d6befc710e967d67e9a1d87ced7416895090c99a765b3a00e66a62f49e40 Sep 12 22:59:50.074645 systemd-resolved[256]: Positive Trust Anchors: Sep 12 22:59:50.074658 systemd-resolved[256]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 12 22:59:50.074688 systemd-resolved[256]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 12 22:59:50.080471 systemd-resolved[256]: Defaulting to hostname 'linux'. Sep 12 22:59:50.081265 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Sep 12 22:59:50.081935 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:59:50.118154 kernel: SCSI subsystem initialized Sep 12 22:59:50.127073 kernel: Loading iSCSI transport class v2.0-870. Sep 12 22:59:50.143176 kernel: iscsi: registered transport (tcp) Sep 12 22:59:50.162136 kernel: iscsi: registered transport (qla4xxx) Sep 12 22:59:50.162215 kernel: QLogic iSCSI HBA Driver Sep 12 22:59:50.180810 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 12 22:59:50.192723 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:59:50.197241 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 12 22:59:50.243092 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 12 22:59:50.245850 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 12 22:59:50.307110 kernel: raid6: avx2x4 gen() 29033 MB/s Sep 12 22:59:50.324129 kernel: raid6: avx2x2 gen() 29254 MB/s Sep 12 22:59:50.341218 kernel: raid6: avx2x1 gen() 25694 MB/s Sep 12 22:59:50.341277 kernel: raid6: using algorithm avx2x2 gen() 29254 MB/s Sep 12 22:59:50.360098 kernel: raid6: .... xor() 19866 MB/s, rmw enabled Sep 12 22:59:50.360171 kernel: raid6: using avx2x2 recovery algorithm Sep 12 22:59:50.401117 kernel: xor: automatically using best checksumming function avx Sep 12 22:59:50.570077 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 12 22:59:50.577657 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:59:50.580040 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:59:50.602963 systemd-udevd[465]: Using default interface naming scheme 'v255'. Sep 12 22:59:50.607768 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Sep 12 22:59:50.612623 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 12 22:59:50.635141 dracut-pre-trigger[475]: rd.md=0: removing MD RAID activation Sep 12 22:59:50.659439 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:59:50.662757 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 12 22:59:50.714888 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 12 22:59:50.720416 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 12 22:59:50.828071 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Sep 12 22:59:50.836058 kernel: scsi host0: Virtio SCSI HBA Sep 12 22:59:50.841087 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Sep 12 22:59:50.867063 kernel: cryptd: max_cpu_qlen set to 1000 Sep 12 22:59:50.875663 kernel: ACPI: bus type USB registered Sep 12 22:59:50.875706 kernel: usbcore: registered new interface driver usbfs Sep 12 22:59:50.876186 kernel: usbcore: registered new interface driver hub Sep 12 22:59:50.882064 kernel: usbcore: registered new device driver usb Sep 12 22:59:50.900734 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:59:50.900831 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:50.905705 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 12 22:59:50.904684 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:50.906549 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:50.909594 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:59:50.912065 kernel: AES CTR mode by8 optimization enabled Sep 12 22:59:50.914063 kernel: libata version 3.00 loaded. 
Sep 12 22:59:50.935690 kernel: sd 0:0:0:0: Power-on or device reset occurred Sep 12 22:59:50.938072 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Sep 12 22:59:50.940062 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 12 22:59:50.940173 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Sep 12 22:59:50.943118 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 12 22:59:50.951263 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 12 22:59:50.951307 kernel: GPT:17805311 != 80003071 Sep 12 22:59:50.951317 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 12 22:59:50.951331 kernel: GPT:17805311 != 80003071 Sep 12 22:59:50.951340 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 12 22:59:50.951348 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 22:59:50.951358 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 12 22:59:50.955073 kernel: ahci 0000:00:1f.2: version 3.0 Sep 12 22:59:50.956074 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 12 22:59:50.956094 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 12 22:59:50.956195 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 12 22:59:50.956282 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 12 22:59:50.961169 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 12 22:59:50.961292 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Sep 12 22:59:50.961383 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 12 22:59:50.962082 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 12 22:59:50.962192 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Sep 12 22:59:50.962277 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Sep 12 22:59:50.962359 kernel: hub 1-0:1.0: USB hub found Sep 12 22:59:50.962464 kernel: hub 
1-0:1.0: 4 ports detected Sep 12 22:59:50.963057 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 12 22:59:50.964073 kernel: hub 2-0:1.0: USB hub found Sep 12 22:59:50.964187 kernel: hub 2-0:1.0: 4 ports detected Sep 12 22:59:50.971065 kernel: scsi host1: ahci Sep 12 22:59:50.974110 kernel: scsi host2: ahci Sep 12 22:59:50.974253 kernel: scsi host3: ahci Sep 12 22:59:50.975122 kernel: scsi host4: ahci Sep 12 22:59:50.975222 kernel: scsi host5: ahci Sep 12 22:59:50.976174 kernel: scsi host6: ahci Sep 12 22:59:50.976309 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46 lpm-pol 1 Sep 12 22:59:50.976324 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46 lpm-pol 1 Sep 12 22:59:50.976339 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 46 lpm-pol 1 Sep 12 22:59:50.976348 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46 lpm-pol 1 Sep 12 22:59:50.976357 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46 lpm-pol 1 Sep 12 22:59:50.976366 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46 lpm-pol 1 Sep 12 22:59:51.032964 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Sep 12 22:59:51.074934 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Sep 12 22:59:51.075734 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:51.086183 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 22:59:51.093438 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Sep 12 22:59:51.093989 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Sep 12 22:59:51.096979 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Sep 12 22:59:51.133083 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 22:59:51.133819 disk-uuid[631]: Primary Header is updated. Sep 12 22:59:51.133819 disk-uuid[631]: Secondary Entries is updated. Sep 12 22:59:51.133819 disk-uuid[631]: Secondary Header is updated. Sep 12 22:59:51.205389 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 12 22:59:51.285979 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 12 22:59:51.286030 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 12 22:59:51.288324 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 12 22:59:51.288358 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 12 22:59:51.289050 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 12 22:59:51.292495 kernel: ata1.00: LPM support broken, forcing max_power Sep 12 22:59:51.292513 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 12 22:59:51.292523 kernel: ata1.00: applying bridge limits Sep 12 22:59:51.293050 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 12 22:59:51.294061 kernel: ata1.00: LPM support broken, forcing max_power Sep 12 22:59:51.295095 kernel: ata1.00: configured for UDMA/100 Sep 12 22:59:51.296137 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 12 22:59:51.349205 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 22:59:51.352780 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 12 22:59:51.353107 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 12 22:59:51.356612 kernel: usbcore: registered new interface driver usbhid Sep 12 22:59:51.356664 kernel: usbhid: USB HID core driver Sep 12 22:59:51.364491 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Sep 12 22:59:51.364524 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Sep 12 22:59:51.370116 
kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Sep 12 22:59:51.711784 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 12 22:59:51.715695 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:59:51.716870 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:59:51.719487 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 12 22:59:51.724271 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 12 22:59:51.755441 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:59:52.167099 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 12 22:59:52.167993 disk-uuid[632]: The operation has completed successfully. Sep 12 22:59:52.251659 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 12 22:59:52.251783 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 12 22:59:52.297736 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 12 22:59:52.315094 sh[665]: Success Sep 12 22:59:52.333729 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 12 22:59:52.334401 kernel: device-mapper: uevent: version 1.0.3 Sep 12 22:59:52.334430 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 12 22:59:52.346085 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 12 22:59:52.397437 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 12 22:59:52.402149 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 12 22:59:52.424559 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 12 22:59:52.447080 kernel: BTRFS: device fsid 5d2ab445-1154-4e47-9d7e-ff4b81d84474 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (677) Sep 12 22:59:52.447134 kernel: BTRFS info (device dm-0): first mount of filesystem 5d2ab445-1154-4e47-9d7e-ff4b81d84474 Sep 12 22:59:52.451216 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:59:52.465655 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 12 22:59:52.465748 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 12 22:59:52.469477 kernel: BTRFS info (device dm-0): enabling free space tree Sep 12 22:59:52.472384 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 12 22:59:52.474277 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:59:52.476091 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 12 22:59:52.478265 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 12 22:59:52.485267 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 12 22:59:52.521148 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (708) Sep 12 22:59:52.527504 kernel: BTRFS info (device sda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:52.527564 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:59:52.539956 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 22:59:52.540075 kernel: BTRFS info (device sda6): turning on async discard Sep 12 22:59:52.540104 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 22:59:52.550094 kernel: BTRFS info (device sda6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:52.551693 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Sep 12 22:59:52.555316 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 12 22:59:52.607729 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:59:52.612182 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 12 22:59:52.662469 systemd-networkd[846]: lo: Link UP Sep 12 22:59:52.664205 systemd-networkd[846]: lo: Gained carrier Sep 12 22:59:52.676747 systemd-networkd[846]: Enumeration completed Sep 12 22:59:52.676890 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 12 22:59:52.680685 systemd[1]: Reached target network.target - Network. Sep 12 22:59:52.681510 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:59:52.681515 systemd-networkd[846]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:59:52.683667 systemd-networkd[846]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:59:52.683672 systemd-networkd[846]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 12 22:59:52.686121 systemd-networkd[846]: eth0: Link UP Sep 12 22:59:52.686285 systemd-networkd[846]: eth1: Link UP Sep 12 22:59:52.686484 systemd-networkd[846]: eth0: Gained carrier Sep 12 22:59:52.686500 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 12 22:59:52.692884 systemd-networkd[846]: eth1: Gained carrier Sep 12 22:59:52.692899 systemd-networkd[846]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 12 22:59:52.697252 ignition[787]: Ignition 2.22.0 Sep 12 22:59:52.697265 ignition[787]: Stage: fetch-offline Sep 12 22:59:52.697293 ignition[787]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:52.697300 ignition[787]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 22:59:52.699006 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:59:52.697377 ignition[787]: parsed url from cmdline: "" Sep 12 22:59:52.697380 ignition[787]: no config URL provided Sep 12 22:59:52.697384 ignition[787]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:59:52.697389 ignition[787]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:59:52.701558 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 12 22:59:52.697393 ignition[787]: failed to fetch config: resource requires networking Sep 12 22:59:52.697649 ignition[787]: Ignition finished successfully Sep 12 22:59:52.727134 ignition[855]: Ignition 2.22.0 Sep 12 22:59:52.727806 ignition[855]: Stage: fetch Sep 12 22:59:52.727155 systemd-networkd[846]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 12 22:59:52.727935 ignition[855]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:52.727942 ignition[855]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 22:59:52.728010 ignition[855]: parsed url from cmdline: "" Sep 12 22:59:52.728014 ignition[855]: no config URL provided Sep 12 22:59:52.728020 ignition[855]: reading system config file "/usr/lib/ignition/user.ign" Sep 12 22:59:52.728025 ignition[855]: no config at "/usr/lib/ignition/user.ign" Sep 12 22:59:52.730086 ignition[855]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Sep 12 22:59:52.730228 ignition[855]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Sep 12 22:59:52.760098 systemd-networkd[846]: eth0: DHCPv4 address 46.62.198.104/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 12 22:59:52.930609 ignition[855]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Sep 12 22:59:52.938943 ignition[855]: GET result: OK Sep 12 22:59:52.939287 ignition[855]: parsing config with SHA512: 22cae5254e47e7c6dbb3fa5201236d88393b8e8431a4ff9f8fcd8c7c7866c717c54cf1814794fa4c7d98224d042c253049808419600ff91d0c8be7fde03bc708 Sep 12 22:59:52.946264 unknown[855]: fetched base config from "system" Sep 12 22:59:52.946297 unknown[855]: fetched base config from "system" Sep 12 22:59:52.947188 ignition[855]: fetch: fetch complete Sep 12 22:59:52.946307 unknown[855]: fetched user config from "hetzner" Sep 12 22:59:52.947196 ignition[855]: fetch: fetch passed Sep 12 22:59:52.947260 ignition[855]: Ignition finished successfully Sep 12 22:59:52.951623 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 12 22:59:52.955119 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 12 22:59:52.999379 ignition[862]: Ignition 2.22.0 Sep 12 22:59:52.999392 ignition[862]: Stage: kargs Sep 12 22:59:52.999563 ignition[862]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:52.999574 ignition[862]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 22:59:53.003832 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 12 22:59:53.000309 ignition[862]: kargs: kargs passed Sep 12 22:59:53.000356 ignition[862]: Ignition finished successfully Sep 12 22:59:53.007318 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 12 22:59:53.044546 ignition[869]: Ignition 2.22.0 Sep 12 22:59:53.044561 ignition[869]: Stage: disks Sep 12 22:59:53.044713 ignition[869]: no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:53.044722 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 22:59:53.047386 systemd[1]: Finished ignition-disks.service - Ignition (disks).
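The fetch stage above shows the retry pattern plainly: attempt #1 against `http://169.254.169.254/hetzner/v1/userdata` fails with `network is unreachable` because DHCP has not completed, attempt #2 succeeds once eth0 has an address, and the body is then identified by its SHA512 digest. A hedged sketch of that retry-then-hash loop — the function and its injected `get` callable are illustrative, not Ignition's actual implementation:

```python
import hashlib
import time

def fetch_userdata(get, url="http://169.254.169.254/hetzner/v1/userdata",
                   max_attempts=5, backoff=0.2):
    """Retry a metadata GET until the network is up, then hash the body.

    `get` is any callable mapping a URL to response bytes; it is assumed
    to raise OSError while the interface has no address, as in the log.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            body = get(url)
        except OSError:
            if attempt == max_attempts:
                raise
            time.sleep(backoff)
            continue
        # Log the config by digest, as Ignition does with SHA512.
        return attempt, hashlib.sha512(body).hexdigest(), body
```

With a flaky transport that fails once and then returns, the loop reports success on attempt #2, matching the sequence in the log.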
Sep 12 22:59:53.045790 ignition[869]: disks: disks passed Sep 12 22:59:53.049219 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 12 22:59:53.045832 ignition[869]: Ignition finished successfully Sep 12 22:59:53.050726 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 12 22:59:53.052256 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 12 22:59:53.053696 systemd[1]: Reached target sysinit.target - System Initialization. Sep 12 22:59:53.055506 systemd[1]: Reached target basic.target - Basic System. Sep 12 22:59:53.057869 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Sep 12 22:59:53.084688 systemd-fsck[878]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 12 22:59:53.087422 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 12 22:59:53.089567 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 12 22:59:53.198071 kernel: EXT4-fs (sda9): mounted filesystem d027afc5-396a-49bf-a5be-60ddd42cb089 r/w with ordered data mode. Quota mode: none. Sep 12 22:59:53.198627 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 12 22:59:53.199528 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 12 22:59:53.201746 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:59:53.204131 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 12 22:59:53.226175 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 12 22:59:53.228220 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 12 22:59:53.229671 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:59:53.235559 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Sep 12 22:59:53.243806 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (886) Sep 12 22:59:53.243827 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 12 22:59:53.254093 kernel: BTRFS info (device sda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:53.254130 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:59:53.264394 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 22:59:53.264828 kernel: BTRFS info (device sda6): turning on async discard Sep 12 22:59:53.267904 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 22:59:53.278769 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 12 22:59:53.318238 coreos-metadata[888]: Sep 12 22:59:53.318 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Sep 12 22:59:53.319642 coreos-metadata[888]: Sep 12 22:59:53.319 INFO Fetch successful Sep 12 22:59:53.319642 coreos-metadata[888]: Sep 12 22:59:53.319 INFO wrote hostname ci-4459-0-0-5-bd33a49fb7 to /sysroot/etc/hostname Sep 12 22:59:53.320852 initrd-setup-root[913]: cut: /sysroot/etc/passwd: No such file or directory Sep 12 22:59:53.324765 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 22:59:53.329272 initrd-setup-root[921]: cut: /sysroot/etc/group: No such file or directory Sep 12 22:59:53.333916 initrd-setup-root[928]: cut: /sysroot/etc/shadow: No such file or directory Sep 12 22:59:53.339279 initrd-setup-root[935]: cut: /sysroot/etc/gshadow: No such file or directory Sep 12 22:59:53.438185 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 12 22:59:53.441432 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 12 22:59:53.444980 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 12 22:59:53.456198 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
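Per the log above, `flatcar-metadata-hostname` fetched the hostname from `http://169.254.169.254/hetzner/v1/metadata/hostname` and wrote `ci-4459-0-0-5-bd33a49fb7` into the not-yet-pivoted root at `/sysroot/etc/hostname`, while initrd-setup-root found `/etc/passwd` and friends still absent. A minimal sketch of the write step only; the function name and sysroot handling are assumptions for illustration:

```python
import pathlib

def write_hostname(hostname: str, sysroot: str = "/sysroot") -> pathlib.Path:
    """Write a fetched hostname under <sysroot>/etc/hostname.

    Creates /etc if needed, since at this point in the boot the target
    root may not be fully populated yet (as the cut errors above show).
    """
    target = pathlib.Path(sysroot, "etc", "hostname")
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(hostname.strip() + "\n")
    return target
```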
Sep 12 22:59:53.462170 kernel: BTRFS info (device sda6): last unmount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:53.484532 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Sep 12 22:59:53.493702 ignition[1003]: INFO : Ignition 2.22.0 Sep 12 22:59:53.493702 ignition[1003]: INFO : Stage: mount Sep 12 22:59:53.496280 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:53.496280 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 22:59:53.496280 ignition[1003]: INFO : mount: mount passed Sep 12 22:59:53.496280 ignition[1003]: INFO : Ignition finished successfully Sep 12 22:59:53.496743 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 12 22:59:53.498270 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 12 22:59:53.518353 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 12 22:59:53.540097 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1015) Sep 12 22:59:53.540159 kernel: BTRFS info (device sda6): first mount of filesystem fd5cdc72-255e-4ed2-8d25-c5e581a08827 Sep 12 22:59:53.543083 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 12 22:59:53.549365 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 12 22:59:53.549392 kernel: BTRFS info (device sda6): turning on async discard Sep 12 22:59:53.549405 kernel: BTRFS info (device sda6): enabling free space tree Sep 12 22:59:53.552870 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Sep 12 22:59:53.591232 ignition[1031]: INFO : Ignition 2.22.0 Sep 12 22:59:53.591232 ignition[1031]: INFO : Stage: files Sep 12 22:59:53.592364 ignition[1031]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:53.592364 ignition[1031]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 22:59:53.594248 ignition[1031]: DEBUG : files: compiled without relabeling support, skipping Sep 12 22:59:53.596909 ignition[1031]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 12 22:59:53.596909 ignition[1031]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 12 22:59:53.600226 ignition[1031]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 12 22:59:53.601863 ignition[1031]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 12 22:59:53.601863 ignition[1031]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 12 22:59:53.600705 unknown[1031]: wrote ssh authorized keys file for user: core Sep 12 22:59:53.605514 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 22:59:53.605514 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Sep 12 22:59:53.738443 systemd-networkd[846]: eth0: Gained IPv6LL Sep 12 22:59:53.930328 systemd-networkd[846]: eth1: Gained IPv6LL Sep 12 22:59:53.981852 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 12 22:59:54.661293 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Sep 12 22:59:54.661293 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 12 22:59:54.666574 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 22:59:54.685133 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 22:59:54.685133 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 12 22:59:54.685133 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Sep 12 22:59:55.106914 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 12 22:59:55.345872 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Sep 12 22:59:55.345872 ignition[1031]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 12 22:59:55.348723 ignition[1031]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:59:55.351995 ignition[1031]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 12 22:59:55.351995 ignition[1031]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 12 22:59:55.351995 ignition[1031]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 12 22:59:55.356499 ignition[1031]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 12 22:59:55.356499 ignition[1031]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 12 22:59:55.356499 ignition[1031]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Sep 12 22:59:55.356499 ignition[1031]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Sep 12 22:59:55.356499 ignition[1031]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 22:59:55.356499 ignition[1031]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:59:55.356499 ignition[1031]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 12 22:59:55.356499 ignition[1031]: INFO : files: files passed Sep 12 22:59:55.356499 ignition[1031]: INFO : Ignition finished successfully Sep 12 22:59:55.353803 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 12 22:59:55.356551 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 12 22:59:55.360251 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 12 22:59:55.375338 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 12 22:59:55.375477 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 12 22:59:55.383492 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:59:55.385615 initrd-setup-root-after-ignition[1062]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:59:55.387206 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 12 22:59:55.388322 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:59:55.389736 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 12 22:59:55.391990 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 12 22:59:55.443084 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 12 22:59:55.443198 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 12 22:59:55.445850 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Sep 12 22:59:55.447712 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 12 22:59:55.449401 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 12 22:59:55.450438 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 12 22:59:55.475229 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:59:55.479017 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 12 22:59:55.529915 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 12 22:59:55.531488 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 12 22:59:55.533991 systemd[1]: Stopped target timers.target - Timer Units. Sep 12 22:59:55.536432 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 12 22:59:55.536698 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 12 22:59:55.539307 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 12 22:59:55.540864 systemd[1]: Stopped target basic.target - Basic System. Sep 12 22:59:55.543136 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Sep 12 22:59:55.545303 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 12 22:59:55.547393 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 12 22:59:55.549884 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 12 22:59:55.553216 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 12 22:59:55.555737 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 12 22:59:55.558543 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 12 22:59:55.560874 systemd[1]: Stopped target local-fs.target - Local File Systems. 
Sep 12 22:59:55.563279 systemd[1]: Stopped target swap.target - Swaps. Sep 12 22:59:55.565871 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 12 22:59:55.566268 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 12 22:59:55.568383 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 12 22:59:55.570971 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 12 22:59:55.573186 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 12 22:59:55.574142 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 12 22:59:55.575617 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 12 22:59:55.575900 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 12 22:59:55.579077 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 12 22:59:55.579402 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 12 22:59:55.590360 systemd[1]: ignition-files.service: Deactivated successfully. Sep 12 22:59:55.590595 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 12 22:59:55.592831 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 12 22:59:55.593103 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 12 22:59:55.598348 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 12 22:59:55.600279 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Sep 12 22:59:55.602266 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 12 22:59:55.606250 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 12 22:59:55.608363 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 12 22:59:55.608608 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Sep 12 22:59:55.610932 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 12 22:59:55.611127 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 12 22:59:55.631741 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 12 22:59:55.631842 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 12 22:59:55.640899 ignition[1086]: INFO : Ignition 2.22.0 Sep 12 22:59:55.643095 ignition[1086]: INFO : Stage: umount Sep 12 22:59:55.643757 ignition[1086]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 12 22:59:55.643757 ignition[1086]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 12 22:59:55.645785 ignition[1086]: INFO : umount: umount passed Sep 12 22:59:55.645785 ignition[1086]: INFO : Ignition finished successfully Sep 12 22:59:55.649070 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 12 22:59:55.650277 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 12 22:59:55.652039 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 12 22:59:55.653024 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 12 22:59:55.654538 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 12 22:59:55.655108 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 12 22:59:55.656812 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 12 22:59:55.656844 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 12 22:59:55.658897 systemd[1]: Stopped target network.target - Network. Sep 12 22:59:55.665697 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 12 22:59:55.665737 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 12 22:59:55.666996 systemd[1]: Stopped target paths.target - Path Units. Sep 12 22:59:55.668212 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
Sep 12 22:59:55.672477 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 12 22:59:55.673001 systemd[1]: Stopped target slices.target - Slice Units. Sep 12 22:59:55.674973 systemd[1]: Stopped target sockets.target - Socket Units. Sep 12 22:59:55.683485 systemd[1]: iscsid.socket: Deactivated successfully. Sep 12 22:59:55.683533 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 12 22:59:55.684894 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 12 22:59:55.684960 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 12 22:59:55.686970 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 12 22:59:55.687066 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 12 22:59:55.688636 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 12 22:59:55.688685 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 12 22:59:55.690305 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 12 22:59:55.691573 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 12 22:59:55.702519 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 12 22:59:55.703305 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 12 22:59:55.703416 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 12 22:59:55.707026 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 12 22:59:55.707352 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 12 22:59:55.707450 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Sep 12 22:59:55.709678 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 12 22:59:55.709772 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 12 22:59:55.711415 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Sep 12 22:59:55.711485 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 12 22:59:55.714391 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:59:55.714685 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 12 22:59:55.714803 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 12 22:59:55.717439 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Sep 12 22:59:55.717658 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 12 22:59:55.718709 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 12 22:59:55.718746 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 12 22:59:55.720991 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 12 22:59:55.724212 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 12 22:59:55.724274 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 12 22:59:55.727208 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 12 22:59:55.727260 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 12 22:59:55.729184 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 12 22:59:55.729231 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 12 22:59:55.730560 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 12 22:59:55.736661 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 12 22:59:55.744283 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 12 22:59:55.744392 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 12 22:59:55.746014 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
Sep 12 22:59:55.746074 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 12 22:59:55.747654 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 12 22:59:55.747678 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 12 22:59:55.749341 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 12 22:59:55.749406 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 12 22:59:55.749878 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 12 22:59:55.749906 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 12 22:59:55.751059 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 12 22:59:55.751093 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 12 22:59:55.755159 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 12 22:59:55.755792 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 12 22:59:55.755835 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 12 22:59:55.757680 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 12 22:59:55.757712 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 12 22:59:55.759867 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:59:55.759899 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:55.761652 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 12 22:59:55.762141 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 12 22:59:55.770227 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 12 22:59:55.770303 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
Sep 12 22:59:55.771603 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 12 22:59:55.774140 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 12 22:59:55.791718 systemd[1]: Switching root. Sep 12 22:59:55.835151 systemd-journald[215]: Received SIGTERM from PID 1 (systemd). Sep 12 22:59:55.835249 systemd-journald[215]: Journal stopped Sep 12 22:59:57.041833 kernel: SELinux: policy capability network_peer_controls=1 Sep 12 22:59:57.041879 kernel: SELinux: policy capability open_perms=1 Sep 12 22:59:57.041889 kernel: SELinux: policy capability extended_socket_class=1 Sep 12 22:59:57.041901 kernel: SELinux: policy capability always_check_network=0 Sep 12 22:59:57.041909 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 12 22:59:57.041918 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 12 22:59:57.041929 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 12 22:59:57.041938 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 12 22:59:57.041946 kernel: SELinux: policy capability userspace_initial_context=0 Sep 12 22:59:57.041957 kernel: audit: type=1403 audit(1757717996.019:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 12 22:59:57.041971 systemd[1]: Successfully loaded SELinux policy in 70.586ms. Sep 12 22:59:57.041988 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.636ms. Sep 12 22:59:57.041999 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 12 22:59:57.042009 systemd[1]: Detected virtualization kvm. Sep 12 22:59:57.042020 systemd[1]: Detected architecture x86-64. Sep 12 22:59:57.042030 systemd[1]: Detected first boot. 
Sep 12 22:59:57.042771 systemd[1]: Hostname set to . Sep 12 22:59:57.042796 systemd[1]: Initializing machine ID from VM UUID. Sep 12 22:59:57.042806 zram_generator::config[1130]: No configuration found. Sep 12 22:59:57.042821 kernel: Guest personality initialized and is inactive Sep 12 22:59:57.042830 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Sep 12 22:59:57.042839 kernel: Initialized host personality Sep 12 22:59:57.042850 kernel: NET: Registered PF_VSOCK protocol family Sep 12 22:59:57.042859 systemd[1]: Populated /etc with preset unit settings. Sep 12 22:59:57.042870 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 12 22:59:57.042880 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 12 22:59:57.042891 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 12 22:59:57.042900 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 12 22:59:57.042910 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 12 22:59:57.042921 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 12 22:59:57.042930 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 12 22:59:57.042941 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 12 22:59:57.042954 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 12 22:59:57.042966 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 12 22:59:57.042977 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 12 22:59:57.042987 systemd[1]: Created slice user.slice - User and Session Slice. Sep 12 22:59:57.042998 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Sep 12 22:59:57.043008 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:59:57.043018 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 22:59:57.043027 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 22:59:57.043037 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 22:59:57.043068 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 22:59:57.043078 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 12 22:59:57.043089 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:59:57.043100 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:59:57.043110 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 22:59:57.043120 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 22:59:57.043129 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 22:59:57.043139 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 22:59:57.043149 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 22:59:57.043159 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 22:59:57.043168 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 22:59:57.043179 systemd[1]: Reached target swap.target - Swaps.
Sep 12 22:59:57.043189 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 22:59:57.043199 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 22:59:57.043209 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 12 22:59:57.043219 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:59:57.043228 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:59:57.043238 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:59:57.043247 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 22:59:57.043257 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 22:59:57.043268 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 22:59:57.043278 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 22:59:57.043287 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:57.043297 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 22:59:57.043307 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 22:59:57.043316 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 22:59:57.043326 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 22:59:57.043336 systemd[1]: Reached target machines.target - Containers.
Sep 12 22:59:57.043347 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 22:59:57.043357 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:59:57.043367 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 22:59:57.043376 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 22:59:57.043386 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:59:57.043396 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 22:59:57.043406 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:59:57.043415 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 22:59:57.043426 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:59:57.043437 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 22:59:57.043447 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 22:59:57.043456 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 22:59:57.043477 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 22:59:57.043486 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 22:59:57.043497 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:59:57.043507 kernel: loop: module loaded
Sep 12 22:59:57.043516 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 22:59:57.043527 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 22:59:57.043537 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 22:59:57.043548 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 22:59:57.043558 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 12 22:59:57.043568 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 22:59:57.043579 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 22:59:57.043588 systemd[1]: Stopped verity-setup.service.
Sep 12 22:59:57.043599 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:57.043609 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 22:59:57.043619 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 22:59:57.043629 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 22:59:57.043641 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 22:59:57.043651 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 22:59:57.043661 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 22:59:57.043671 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:59:57.043680 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 22:59:57.043690 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 22:59:57.043699 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:59:57.043710 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:59:57.043720 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:59:57.043730 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:59:57.043740 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:59:57.043749 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:59:57.043759 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:59:57.043768 kernel: fuse: init (API version 7.41)
Sep 12 22:59:57.043778 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 22:59:57.043788 kernel: ACPI: bus type drm_connector registered
Sep 12 22:59:57.043798 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 22:59:57.043808 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 22:59:57.043817 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 22:59:57.043827 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 22:59:57.043837 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 22:59:57.043848 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 22:59:57.043873 systemd-journald[1204]: Collecting audit messages is disabled.
Sep 12 22:59:57.043898 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 22:59:57.043909 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 22:59:57.043920 systemd-journald[1204]: Journal started
Sep 12 22:59:57.043939 systemd-journald[1204]: Runtime Journal (/run/log/journal/52f104f02c4c4e1e85b0642dd438c206) is 4.8M, max 38.6M, 33.7M free.
Sep 12 22:59:56.670121 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 22:59:56.695233 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 12 22:59:56.695849 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 22:59:57.048089 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 22:59:57.053144 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 22:59:57.058709 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 22:59:57.058759 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 22:59:57.062068 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 22:59:57.065062 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 22:59:57.069060 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:59:57.074057 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 22:59:57.074103 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 22:59:57.092060 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 22:59:57.095061 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 22:59:57.109762 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 22:59:57.109833 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 22:59:57.113060 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 22:59:57.118127 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 22:59:57.124097 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 22:59:57.125287 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 22:59:57.127527 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 22:59:57.136065 kernel: loop0: detected capacity change from 0 to 8
Sep 12 22:59:57.152067 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 22:59:57.155754 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 22:59:57.159527 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 22:59:57.167125 kernel: loop1: detected capacity change from 0 to 221472
Sep 12 22:59:57.168265 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 22:59:57.175141 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 22:59:57.176650 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 12 22:59:57.192989 systemd-journald[1204]: Time spent on flushing to /var/log/journal/52f104f02c4c4e1e85b0642dd438c206 is 29.487ms for 1174 entries.
Sep 12 22:59:57.192989 systemd-journald[1204]: System Journal (/var/log/journal/52f104f02c4c4e1e85b0642dd438c206) is 8M, max 584.8M, 576.8M free.
Sep 12 22:59:57.238516 systemd-journald[1204]: Received client request to flush runtime journal.
Sep 12 22:59:57.238553 kernel: loop2: detected capacity change from 0 to 110984
Sep 12 22:59:57.195335 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 22:59:57.197432 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 22:59:57.221960 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 12 22:59:57.232880 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Sep 12 22:59:57.232891 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Sep 12 22:59:57.240477 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 22:59:57.243826 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 22:59:57.270409 kernel: loop3: detected capacity change from 0 to 128016
Sep 12 22:59:57.314079 kernel: loop4: detected capacity change from 0 to 8
Sep 12 22:59:57.317297 kernel: loop5: detected capacity change from 0 to 221472
Sep 12 22:59:57.344851 kernel: loop6: detected capacity change from 0 to 110984
Sep 12 22:59:57.368107 kernel: loop7: detected capacity change from 0 to 128016
Sep 12 22:59:57.388363 (sd-merge)[1278]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 12 22:59:57.388791 (sd-merge)[1278]: Merged extensions into '/usr'.
Sep 12 22:59:57.398582 systemd[1]: Reload requested from client PID 1236 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 22:59:57.398602 systemd[1]: Reloading...
Sep 12 22:59:57.504081 zram_generator::config[1304]: No configuration found.
Sep 12 22:59:57.650482 ldconfig[1232]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 22:59:57.732364 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 22:59:57.732724 systemd[1]: Reloading finished in 333 ms.
Sep 12 22:59:57.748301 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 22:59:57.749293 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 22:59:57.760208 systemd[1]: Starting ensure-sysext.service...
Sep 12 22:59:57.764195 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 22:59:57.798360 systemd[1]: Reload requested from client PID 1347 ('systemctl') (unit ensure-sysext.service)...
Sep 12 22:59:57.798500 systemd[1]: Reloading...
Sep 12 22:59:57.799943 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 12 22:59:57.801820 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 12 22:59:57.803361 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 22:59:57.803779 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 22:59:57.806709 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 22:59:57.807070 systemd-tmpfiles[1348]: ACLs are not supported, ignoring.
Sep 12 22:59:57.807140 systemd-tmpfiles[1348]: ACLs are not supported, ignoring.
Sep 12 22:59:57.811775 systemd-tmpfiles[1348]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 22:59:57.811789 systemd-tmpfiles[1348]: Skipping /boot
Sep 12 22:59:57.820704 systemd-tmpfiles[1348]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 22:59:57.820832 systemd-tmpfiles[1348]: Skipping /boot
Sep 12 22:59:57.864115 zram_generator::config[1378]: No configuration found.
Sep 12 22:59:58.031008 systemd[1]: Reloading finished in 231 ms.
Sep 12 22:59:58.051645 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 22:59:58.052560 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 22:59:58.063214 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 22:59:58.071145 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 22:59:58.075452 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 22:59:58.080177 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 22:59:58.083528 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 22:59:58.087679 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 22:59:58.093940 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:58.094390 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:59:58.096174 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:59:58.101000 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:59:58.112352 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:59:58.113163 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:59:58.113283 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:59:58.113394 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:58.119095 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 22:59:58.123675 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:58.123837 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:59:58.123973 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:59:58.124064 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:59:58.124145 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:58.133940 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:59:58.136488 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:59:58.137594 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 22:59:58.139611 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:58.139916 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:59:58.143218 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 22:59:58.143900 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:59:58.143997 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:59:58.144154 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 12 22:59:58.149153 systemd[1]: Finished ensure-sysext.service.
Sep 12 22:59:58.153324 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 12 22:59:58.155328 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:59:58.156552 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:59:58.158442 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 22:59:58.160866 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 22:59:58.161067 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 22:59:58.171103 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:59:58.171274 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:59:58.173320 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 22:59:58.180895 systemd-udevd[1425]: Using default interface naming scheme 'v255'.
Sep 12 22:59:58.185141 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 22:59:58.186778 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 22:59:58.203251 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 12 22:59:58.204058 augenrules[1459]: No rules
Sep 12 22:59:58.205060 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 22:59:58.205307 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 22:59:58.220238 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 22:59:58.223501 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 22:59:58.225705 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 22:59:58.234806 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 12 22:59:58.237217 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 12 22:59:58.328599 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 12 22:59:58.369113 kernel: mousedev: PS/2 mouse device common for all mice
Sep 12 22:59:58.400203 systemd-networkd[1470]: lo: Link UP
Sep 12 22:59:58.400226 systemd-networkd[1470]: lo: Gained carrier
Sep 12 22:59:58.401117 systemd-networkd[1470]: Enumeration completed
Sep 12 22:59:58.401192 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 22:59:58.406265 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 12 22:59:58.408823 systemd-resolved[1423]: Positive Trust Anchors:
Sep 12 22:59:58.408831 systemd-resolved[1423]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 22:59:58.408861 systemd-resolved[1423]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 22:59:58.410197 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 12 22:59:58.414399 systemd-resolved[1423]: Using system hostname 'ci-4459-0-0-5-bd33a49fb7'.
Sep 12 22:59:58.415589 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 22:59:58.416382 systemd[1]: Reached target network.target - Network.
Sep 12 22:59:58.417265 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 22:59:58.439794 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 12 22:59:58.440948 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 22:59:58.442177 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 12 22:59:58.443155 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 12 22:59:58.444103 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 12 22:59:58.445117 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 12 22:59:58.446131 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 12 22:59:58.446168 systemd[1]: Reached target paths.target - Path Units.
Sep 12 22:59:58.447101 systemd[1]: Reached target time-set.target - System Time Set.
Sep 12 22:59:58.448221 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 12 22:59:58.448364 systemd-networkd[1470]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:59:58.448376 systemd-networkd[1470]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 22:59:58.449060 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 12 22:59:58.449623 systemd-networkd[1470]: eth0: Link UP
Sep 12 22:59:58.449762 systemd-networkd[1470]: eth0: Gained carrier
Sep 12 22:59:58.449781 systemd-networkd[1470]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:59:58.450034 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 22:59:58.452203 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 12 22:59:58.455380 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 12 22:59:58.459850 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 12 22:59:58.461770 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 12 22:59:58.462479 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 12 22:59:58.470075 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Sep 12 22:59:58.470874 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 12 22:59:58.472350 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 12 22:59:58.474881 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 12 22:59:58.475548 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 12 22:59:58.483527 systemd-networkd[1470]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:59:58.483534 systemd-networkd[1470]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 22:59:58.483927 systemd-networkd[1470]: eth1: Link UP
Sep 12 22:59:58.485114 systemd-networkd[1470]: eth1: Gained carrier
Sep 12 22:59:58.485128 systemd-networkd[1470]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:59:58.486493 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 22:59:58.488106 systemd[1]: Reached target basic.target - Basic System.
Sep 12 22:59:58.488589 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 12 22:59:58.488611 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 12 22:59:58.489652 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 12 22:59:58.491485 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 12 22:59:58.493803 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 12 22:59:58.499090 kernel: ACPI: button: Power Button [PWRF]
Sep 12 22:59:58.497060 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 22:59:58.501051 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 12 22:59:58.505117 systemd-networkd[1470]: eth0: DHCPv4 address 46.62.198.104/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 12 22:59:58.507529 systemd-timesyncd[1443]: Network configuration changed, trying to establish connection.
Sep 12 22:59:58.508030 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 12 22:59:58.509125 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 12 22:59:58.512482 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 12 22:59:58.517557 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 12 22:59:58.520165 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 12 22:59:58.531133 jq[1523]: false
Sep 12 22:59:58.531621 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 12 22:59:58.533389 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 12 22:59:58.536476 systemd-networkd[1470]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 12 22:59:58.542259 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 12 22:59:58.546265 google_oslogin_nss_cache[1527]: oslogin_cache_refresh[1527]: Refreshing passwd entry cache
Sep 12 22:59:58.546121 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 12 22:59:58.544296 oslogin_cache_refresh[1527]: Refreshing passwd entry cache
Sep 12 22:59:58.547347 google_oslogin_nss_cache[1527]: oslogin_cache_refresh[1527]: Failure getting users, quitting
Sep 12 22:59:58.547424 oslogin_cache_refresh[1527]: Failure getting users, quitting
Sep 12 22:59:58.547497 google_oslogin_nss_cache[1527]: oslogin_cache_refresh[1527]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 22:59:58.547530 oslogin_cache_refresh[1527]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 12 22:59:58.547608 google_oslogin_nss_cache[1527]: oslogin_cache_refresh[1527]: Refreshing group entry cache
Sep 12 22:59:58.547634 oslogin_cache_refresh[1527]: Refreshing group entry cache
Sep 12 22:59:58.550021 google_oslogin_nss_cache[1527]: oslogin_cache_refresh[1527]: Failure getting groups, quitting
Sep 12 22:59:58.550021 google_oslogin_nss_cache[1527]: oslogin_cache_refresh[1527]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 22:59:58.549156 oslogin_cache_refresh[1527]: Failure getting groups, quitting
Sep 12 22:59:58.549163 oslogin_cache_refresh[1527]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 12 22:59:58.552863 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 12 22:59:58.557270 systemd[1]: Starting update-engine.service - Update Engine...
Sep 12 22:59:58.564222 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 12 22:59:58.571257 extend-filesystems[1524]: Found /dev/sda6
Sep 12 22:59:58.572591 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 22:59:58.573510 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 12 22:59:58.574323 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 12 22:59:58.574607 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 12 22:59:58.575377 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 12 22:59:58.582580 extend-filesystems[1524]: Found /dev/sda9 Sep 12 22:59:58.582580 extend-filesystems[1524]: Checking size of /dev/sda9 Sep 12 22:59:58.582430 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 12 22:59:58.583478 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 12 22:59:58.590059 jq[1538]: true Sep 12 22:59:58.626539 extend-filesystems[1524]: Resized partition /dev/sda9 Sep 12 22:59:58.628670 extend-filesystems[1567]: resize2fs 1.47.3 (8-Jul-2025) Sep 12 22:59:58.648163 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 12 22:59:58.648187 coreos-metadata[1520]: Sep 12 22:59:58.642 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 12 22:59:58.648187 coreos-metadata[1520]: Sep 12 22:59:58.644 INFO Fetch successful Sep 12 22:59:58.648187 coreos-metadata[1520]: Sep 12 22:59:58.644 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 12 22:59:58.648721 systemd-timesyncd[1443]: Contacted time server 152.70.19.169:123 (0.flatcar.pool.ntp.org). Sep 12 22:59:58.651323 jq[1554]: true Sep 12 22:59:58.651463 coreos-metadata[1520]: Sep 12 22:59:58.648 INFO Fetch successful Sep 12 22:59:58.648762 systemd-timesyncd[1443]: Initial clock synchronization to Fri 2025-09-12 22:59:58.858078 UTC. 
Sep 12 22:59:58.655082 update_engine[1537]: I20250912 22:59:58.654558 1537 main.cc:92] Flatcar Update Engine starting Sep 12 22:59:58.652877 (ntainerd)[1563]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 12 22:59:58.656739 tar[1544]: linux-amd64/helm Sep 12 22:59:58.659280 systemd[1]: motdgen.service: Deactivated successfully. Sep 12 22:59:58.659450 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 12 22:59:58.689021 dbus-daemon[1521]: [system] SELinux support is enabled Sep 12 22:59:58.689189 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 12 22:59:58.693998 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 12 22:59:58.694024 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 12 22:59:58.697776 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 12 22:59:58.697799 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 12 22:59:58.712801 systemd[1]: Started update-engine.service - Update Engine. Sep 12 22:59:58.712921 update_engine[1537]: I20250912 22:59:58.712853 1537 update_check_scheduler.cc:74] Next update check in 4m21s Sep 12 22:59:58.714161 systemd-logind[1533]: New seat seat0. Sep 12 22:59:58.727389 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 12 22:59:58.728463 systemd[1]: Started systemd-logind.service - User Login Management. Sep 12 22:59:58.845988 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. 
Sep 12 22:59:58.848778 bash[1592]: Updated "/home/core/.ssh/authorized_keys" Sep 12 22:59:58.851804 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 12 22:59:58.858771 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 12 22:59:58.862281 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 12 22:59:58.867257 systemd[1]: Starting sshkeys.service... Sep 12 22:59:58.868740 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 12 22:59:58.877092 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 12 22:59:58.905902 extend-filesystems[1567]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 12 22:59:58.905902 extend-filesystems[1567]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 12 22:59:58.905902 extend-filesystems[1567]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 12 22:59:58.905154 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 12 22:59:58.911688 extend-filesystems[1524]: Resized filesystem in /dev/sda9 Sep 12 22:59:58.905363 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 12 22:59:58.912439 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 12 22:59:58.920894 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 12 22:59:58.922995 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 12 22:59:58.950231 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 12 22:59:58.956115 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Sep 12 22:59:58.981314 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 12 22:59:58.981546 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 12 22:59:59.011988 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Sep 12 22:59:59.014250 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Sep 12 22:59:59.027093 coreos-metadata[1611]: Sep 12 22:59:59.026 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 12 22:59:59.037471 coreos-metadata[1611]: Sep 12 22:59:59.035 INFO Fetch successful Sep 12 22:59:59.039359 unknown[1611]: wrote ssh authorized keys file for user: core Sep 12 22:59:59.072590 kernel: Console: switching to colour dummy device 80x25 Sep 12 22:59:59.072675 locksmithd[1576]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 12 22:59:59.081106 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 12 22:59:59.081177 kernel: [drm] features: -context_init Sep 12 22:59:59.090197 kernel: [drm] number of scanouts: 1 Sep 12 22:59:59.091857 containerd[1563]: time="2025-09-12T22:59:59Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 12 22:59:59.096164 containerd[1563]: time="2025-09-12T22:59:59.095248391Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 12 22:59:59.100615 update-ssh-keys[1627]: Updated "/home/core/.ssh/authorized_keys" Sep 12 22:59:59.101111 kernel: [drm] number of cap sets: 0 Sep 12 22:59:59.103381 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 12 22:59:59.105203 systemd[1]: Finished sshkeys.service. 
Sep 12 22:59:59.113653 containerd[1563]: time="2025-09-12T22:59:59.113615004Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.904µs" Sep 12 22:59:59.114109 containerd[1563]: time="2025-09-12T22:59:59.114089917Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 12 22:59:59.114168 containerd[1563]: time="2025-09-12T22:59:59.114157557Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 12 22:59:59.114342 containerd[1563]: time="2025-09-12T22:59:59.114327543Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 12 22:59:59.114405 containerd[1563]: time="2025-09-12T22:59:59.114382013Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 12 22:59:59.114462 containerd[1563]: time="2025-09-12T22:59:59.114452263Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:59:59.114550 containerd[1563]: time="2025-09-12T22:59:59.114537215Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 12 22:59:59.114593 containerd[1563]: time="2025-09-12T22:59:59.114583942Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:59:59.114844 containerd[1563]: time="2025-09-12T22:59:59.114824005Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116088309Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116108964Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116116581Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116176479Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116331970Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116353642Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116361497Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116396062Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116654857Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 12 22:59:59.116790 containerd[1563]: time="2025-09-12T22:59:59.116701585Z" level=info msg="metadata content store policy set" policy=shared Sep 12 22:59:59.120808 containerd[1563]: time="2025-09-12T22:59:59.120787524Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler 
type=io.containerd.gc.v1 Sep 12 22:59:59.120883 containerd[1563]: time="2025-09-12T22:59:59.120873608Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 12 22:59:59.120943 containerd[1563]: time="2025-09-12T22:59:59.120934339Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 12 22:59:59.120997 containerd[1563]: time="2025-09-12T22:59:59.120987605Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 12 22:59:59.121038 containerd[1563]: time="2025-09-12T22:59:59.121029962Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 12 22:59:59.121093 containerd[1563]: time="2025-09-12T22:59:59.121084904Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 12 22:59:59.121135 containerd[1563]: time="2025-09-12T22:59:59.121127602Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 12 22:59:59.121175 containerd[1563]: time="2025-09-12T22:59:59.121167626Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 12 22:59:59.121212 containerd[1563]: time="2025-09-12T22:59:59.121204946Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 12 22:59:59.121264 containerd[1563]: time="2025-09-12T22:59:59.121254727Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 12 22:59:59.121391 containerd[1563]: time="2025-09-12T22:59:59.121380866Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 12 22:59:59.121436 containerd[1563]: time="2025-09-12T22:59:59.121426420Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task 
type=io.containerd.runtime.v2 Sep 12 22:59:59.121560 containerd[1563]: time="2025-09-12T22:59:59.121546771Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122105331Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122127723Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122137839Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122147360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122156335Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122165270Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122173371Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122182644Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122191404Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122200605Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 12 22:59:59.122419 containerd[1563]: 
time="2025-09-12T22:59:59.122259064Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122275862Z" level=info msg="Start snapshots syncer" Sep 12 22:59:59.122419 containerd[1563]: time="2025-09-12T22:59:59.122300075Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 12 22:59:59.122945 containerd[1563]: time="2025-09-12T22:59:59.122913412Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\
":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.123080767Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124143808Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124234179Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124251677Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124260838Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124272322Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124282171Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124292071Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124301613Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124321002Z" level=info 
msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124329895Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124339313Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124361201Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:59:59.124706 containerd[1563]: time="2025-09-12T22:59:59.124371616Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124379039Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124387501Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124395448Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124403898Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124412493Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124425426Z" level=info msg="runtime interface created" Sep 12 22:59:59.124962 containerd[1563]: 
time="2025-09-12T22:59:59.124429817Z" level=info msg="created NRI interface" Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124436119Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124445825Z" level=info msg="Connect containerd service" Sep 12 22:59:59.124962 containerd[1563]: time="2025-09-12T22:59:59.124463704Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 12 22:59:59.129949 containerd[1563]: time="2025-09-12T22:59:59.129630637Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 22:59:59.133079 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Sep 12 22:59:59.172550 kernel: EDAC MC: Ver: 3.0.0 Sep 12 22:59:59.206464 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Sep 12 22:59:59.206548 kernel: Console: switching to colour frame buffer device 160x50 Sep 12 22:59:59.256852 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Sep 12 22:59:59.279739 containerd[1563]: time="2025-09-12T22:59:59.279703805Z" level=info msg="Start subscribing containerd event" Sep 12 22:59:59.279904 containerd[1563]: time="2025-09-12T22:59:59.279892812Z" level=info msg="Start recovering state" Sep 12 22:59:59.280039 containerd[1563]: time="2025-09-12T22:59:59.280017851Z" level=info msg="Start event monitor" Sep 12 22:59:59.280111 containerd[1563]: time="2025-09-12T22:59:59.280093601Z" level=info msg="Start cni network conf syncer for default" Sep 12 22:59:59.280155 containerd[1563]: time="2025-09-12T22:59:59.280147885Z" level=info msg="Start streaming server" Sep 12 22:59:59.280194 containerd[1563]: time="2025-09-12T22:59:59.280186255Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 12 22:59:59.280243 containerd[1563]: time="2025-09-12T22:59:59.280235860Z" level=info msg="runtime interface starting up..." Sep 12 22:59:59.280278 containerd[1563]: time="2025-09-12T22:59:59.280271896Z" level=info msg="starting plugins..." Sep 12 22:59:59.280346 containerd[1563]: time="2025-09-12T22:59:59.280337190Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 12 22:59:59.284086 containerd[1563]: time="2025-09-12T22:59:59.282931473Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 12 22:59:59.284086 containerd[1563]: time="2025-09-12T22:59:59.283121662Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 12 22:59:59.286526 containerd[1563]: time="2025-09-12T22:59:59.283183657Z" level=info msg="containerd successfully booted in 0.191633s" Sep 12 22:59:59.320583 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 12 22:59:59.319288 systemd[1]: Started containerd.service - containerd container runtime. 
Sep 12 22:59:59.370706 systemd-logind[1533]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 12 22:59:59.382922 sshd_keygen[1542]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 12 22:59:59.410318 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 12 22:59:59.416734 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 12 22:59:59.428967 systemd-logind[1533]: Watching system buttons on /dev/input/event3 (Power Button) Sep 12 22:59:59.448475 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:59.450948 systemd[1]: issuegen.service: Deactivated successfully. Sep 12 22:59:59.451493 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 12 22:59:59.456274 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 12 22:59:59.460250 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:59:59.460551 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:59.462239 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:59.465214 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:59.467709 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 12 22:59:59.476747 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 12 22:59:59.476931 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 22:59:59.480416 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 12 22:59:59.483674 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 12 22:59:59.487949 tar[1544]: linux-amd64/LICENSE Sep 12 22:59:59.487949 tar[1544]: linux-amd64/README.md Sep 12 22:59:59.490325 systemd[1]: Started getty@tty1.service - Getty on tty1. 
Sep 12 22:59:59.493315 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 12 22:59:59.494219 systemd[1]: Reached target getty.target - Login Prompts. Sep 12 22:59:59.519998 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 12 22:59:59.559585 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 12 23:00:00.203403 systemd-networkd[1470]: eth0: Gained IPv6LL Sep 12 23:00:00.205437 systemd-networkd[1470]: eth1: Gained IPv6LL Sep 12 23:00:00.208920 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 12 23:00:00.211504 systemd[1]: Reached target network-online.target - Network is Online. Sep 12 23:00:00.216979 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:00.220428 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 12 23:00:00.275057 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 12 23:00:01.440224 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:01.444901 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 12 23:00:01.447988 systemd[1]: Startup finished in 3.459s (kernel) + 6.298s (initrd) + 5.498s (userspace) = 15.256s. 
Sep 12 23:00:01.449565 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 12 23:00:02.224844 kubelet[1711]: E0912 23:00:02.224744 1711 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 12 23:00:02.227359 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 12 23:00:02.227538 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 12 23:00:02.227902 systemd[1]: kubelet.service: Consumed 1.306s CPU time, 266.6M memory peak. Sep 12 23:00:04.385313 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 12 23:00:04.388488 systemd[1]: Started sshd@0-46.62.198.104:22-139.178.89.65:36802.service - OpenSSH per-connection server daemon (139.178.89.65:36802). Sep 12 23:00:05.428150 sshd[1724]: Accepted publickey for core from 139.178.89.65 port 36802 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:00:05.430715 sshd-session[1724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:05.443765 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 12 23:00:05.446533 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 12 23:00:05.470975 systemd-logind[1533]: New session 1 of user core. Sep 12 23:00:05.482293 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 12 23:00:05.487052 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Sep 12 23:00:05.503531 (systemd)[1729]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 12 23:00:05.508640 systemd-logind[1533]: New session c1 of user core. Sep 12 23:00:05.701962 systemd[1729]: Queued start job for default target default.target. Sep 12 23:00:05.714145 systemd[1729]: Created slice app.slice - User Application Slice. Sep 12 23:00:05.714408 systemd[1729]: Reached target paths.target - Paths. Sep 12 23:00:05.714468 systemd[1729]: Reached target timers.target - Timers. Sep 12 23:00:05.715624 systemd[1729]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 12 23:00:05.739754 systemd[1729]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 12 23:00:05.739849 systemd[1729]: Reached target sockets.target - Sockets. Sep 12 23:00:05.739936 systemd[1729]: Reached target basic.target - Basic System. Sep 12 23:00:05.739987 systemd[1729]: Reached target default.target - Main User Target. Sep 12 23:00:05.740030 systemd[1729]: Startup finished in 219ms. Sep 12 23:00:05.740240 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 12 23:00:05.750396 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 12 23:00:06.449704 systemd[1]: Started sshd@1-46.62.198.104:22-139.178.89.65:36818.service - OpenSSH per-connection server daemon (139.178.89.65:36818). Sep 12 23:00:07.466560 sshd[1740]: Accepted publickey for core from 139.178.89.65 port 36818 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:00:07.469419 sshd-session[1740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:07.479153 systemd-logind[1533]: New session 2 of user core. Sep 12 23:00:07.486373 systemd[1]: Started session-2.scope - Session 2 of User core. 
Sep 12 23:00:08.150648 sshd[1743]: Connection closed by 139.178.89.65 port 36818 Sep 12 23:00:08.151289 sshd-session[1740]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:08.154617 systemd-logind[1533]: Session 2 logged out. Waiting for processes to exit. Sep 12 23:00:08.155215 systemd[1]: sshd@1-46.62.198.104:22-139.178.89.65:36818.service: Deactivated successfully. Sep 12 23:00:08.156645 systemd[1]: session-2.scope: Deactivated successfully. Sep 12 23:00:08.157820 systemd-logind[1533]: Removed session 2. Sep 12 23:00:08.358944 systemd[1]: Started sshd@2-46.62.198.104:22-139.178.89.65:36832.service - OpenSSH per-connection server daemon (139.178.89.65:36832). Sep 12 23:00:09.476414 sshd[1749]: Accepted publickey for core from 139.178.89.65 port 36832 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:00:09.478278 sshd-session[1749]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:00:09.485132 systemd-logind[1533]: New session 3 of user core. Sep 12 23:00:09.495380 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 12 23:00:10.223258 sshd[1752]: Connection closed by 139.178.89.65 port 36832 Sep 12 23:00:10.224027 sshd-session[1749]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:10.229116 systemd[1]: sshd@2-46.62.198.104:22-139.178.89.65:36832.service: Deactivated successfully. Sep 12 23:00:10.231194 systemd[1]: session-3.scope: Deactivated successfully. Sep 12 23:00:10.232392 systemd-logind[1533]: Session 3 logged out. Waiting for processes to exit. Sep 12 23:00:10.233722 systemd-logind[1533]: Removed session 3. Sep 12 23:00:10.377599 systemd[1]: Started sshd@3-46.62.198.104:22-139.178.89.65:48252.service - OpenSSH per-connection server daemon (139.178.89.65:48252). 
Sep 12 23:00:11.376924 sshd[1758]: Accepted publickey for core from 139.178.89.65 port 48252 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0
Sep 12 23:00:11.379185 sshd-session[1758]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:00:11.388129 systemd-logind[1533]: New session 4 of user core.
Sep 12 23:00:11.397350 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 12 23:00:12.055837 sshd[1761]: Connection closed by 139.178.89.65 port 48252
Sep 12 23:00:12.056503 sshd-session[1758]: pam_unix(sshd:session): session closed for user core
Sep 12 23:00:12.060639 systemd[1]: sshd@3-46.62.198.104:22-139.178.89.65:48252.service: Deactivated successfully.
Sep 12 23:00:12.063244 systemd[1]: session-4.scope: Deactivated successfully.
Sep 12 23:00:12.064872 systemd-logind[1533]: Session 4 logged out. Waiting for processes to exit.
Sep 12 23:00:12.066371 systemd-logind[1533]: Removed session 4.
Sep 12 23:00:12.225304 systemd[1]: Started sshd@4-46.62.198.104:22-139.178.89.65:48264.service - OpenSSH per-connection server daemon (139.178.89.65:48264).
Sep 12 23:00:12.232856 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 12 23:00:12.237333 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 23:00:12.377377 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:00:12.387362 (kubelet)[1778]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 23:00:12.442373 kubelet[1778]: E0912 23:00:12.442321 1778 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 23:00:12.446096 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 23:00:12.446225 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 23:00:12.446460 systemd[1]: kubelet.service: Consumed 164ms CPU time, 108.3M memory peak.
Sep 12 23:00:13.213488 sshd[1767]: Accepted publickey for core from 139.178.89.65 port 48264 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0
Sep 12 23:00:13.215770 sshd-session[1767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:00:13.226147 systemd-logind[1533]: New session 5 of user core.
Sep 12 23:00:13.237290 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 12 23:00:13.750436 sudo[1786]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 23:00:13.750890 sudo[1786]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 23:00:13.768961 sudo[1786]: pam_unix(sudo:session): session closed for user root
Sep 12 23:00:13.928525 sshd[1785]: Connection closed by 139.178.89.65 port 48264
Sep 12 23:00:13.929586 sshd-session[1767]: pam_unix(sshd:session): session closed for user core
Sep 12 23:00:13.935864 systemd[1]: sshd@4-46.62.198.104:22-139.178.89.65:48264.service: Deactivated successfully.
Sep 12 23:00:13.939011 systemd[1]: session-5.scope: Deactivated successfully.
Sep 12 23:00:13.940355 systemd-logind[1533]: Session 5 logged out. Waiting for processes to exit.
Sep 12 23:00:13.942742 systemd-logind[1533]: Removed session 5.
Sep 12 23:00:14.142509 systemd[1]: Started sshd@5-46.62.198.104:22-139.178.89.65:48280.service - OpenSSH per-connection server daemon (139.178.89.65:48280).
Sep 12 23:00:15.273630 sshd[1792]: Accepted publickey for core from 139.178.89.65 port 48280 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0
Sep 12 23:00:15.275433 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:00:15.282020 systemd-logind[1533]: New session 6 of user core.
Sep 12 23:00:15.289289 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 12 23:00:15.853275 sudo[1797]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 23:00:15.853726 sudo[1797]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 23:00:15.862570 sudo[1797]: pam_unix(sudo:session): session closed for user root
Sep 12 23:00:15.871627 sudo[1796]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 23:00:15.872137 sudo[1796]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 23:00:15.891419 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 23:00:15.947696 augenrules[1819]: No rules
Sep 12 23:00:15.948848 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 23:00:15.949014 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 23:00:15.951985 sudo[1796]: pam_unix(sudo:session): session closed for user root
Sep 12 23:00:16.128525 sshd[1795]: Connection closed by 139.178.89.65 port 48280
Sep 12 23:00:16.129576 sshd-session[1792]: pam_unix(sshd:session): session closed for user core
Sep 12 23:00:16.135922 systemd[1]: sshd@5-46.62.198.104:22-139.178.89.65:48280.service: Deactivated successfully.
Sep 12 23:00:16.138539 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 23:00:16.139720 systemd-logind[1533]: Session 6 logged out. Waiting for processes to exit.
Sep 12 23:00:16.142196 systemd-logind[1533]: Removed session 6.
Sep 12 23:00:16.321947 systemd[1]: Started sshd@6-46.62.198.104:22-139.178.89.65:48286.service - OpenSSH per-connection server daemon (139.178.89.65:48286).
Sep 12 23:00:17.434464 sshd[1828]: Accepted publickey for core from 139.178.89.65 port 48286 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0
Sep 12 23:00:17.436789 sshd-session[1828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 23:00:17.444573 systemd-logind[1533]: New session 7 of user core.
Sep 12 23:00:17.453356 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 23:00:18.007952 sudo[1832]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 23:00:18.008470 sudo[1832]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 23:00:18.522501 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 23:00:18.537502 (dockerd)[1849]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 23:00:18.902734 dockerd[1849]: time="2025-09-12T23:00:18.902605797Z" level=info msg="Starting up"
Sep 12 23:00:18.904686 dockerd[1849]: time="2025-09-12T23:00:18.904082371Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 23:00:18.917272 dockerd[1849]: time="2025-09-12T23:00:18.917228832Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 12 23:00:18.934752 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1308547939-merged.mount: Deactivated successfully.
Sep 12 23:00:18.982244 dockerd[1849]: time="2025-09-12T23:00:18.982182522Z" level=info msg="Loading containers: start."
Sep 12 23:00:18.997147 kernel: Initializing XFRM netlink socket
Sep 12 23:00:19.272085 systemd-networkd[1470]: docker0: Link UP
Sep 12 23:00:19.277287 dockerd[1849]: time="2025-09-12T23:00:19.277221724Z" level=info msg="Loading containers: done."
Sep 12 23:00:19.294458 dockerd[1849]: time="2025-09-12T23:00:19.294399758Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 23:00:19.294641 dockerd[1849]: time="2025-09-12T23:00:19.294488796Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 12 23:00:19.294641 dockerd[1849]: time="2025-09-12T23:00:19.294566431Z" level=info msg="Initializing buildkit"
Sep 12 23:00:19.322518 dockerd[1849]: time="2025-09-12T23:00:19.322444873Z" level=info msg="Completed buildkit initialization"
Sep 12 23:00:19.331306 dockerd[1849]: time="2025-09-12T23:00:19.331248640Z" level=info msg="Daemon has completed initialization"
Sep 12 23:00:19.332496 dockerd[1849]: time="2025-09-12T23:00:19.331504982Z" level=info msg="API listen on /run/docker.sock"
Sep 12 23:00:19.331576 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 23:00:19.931465 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1238260830-merged.mount: Deactivated successfully.
Sep 12 23:00:20.696583 containerd[1563]: time="2025-09-12T23:00:20.696491347Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 12 23:00:21.310355 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3046472948.mount: Deactivated successfully.
Sep 12 23:00:22.576780 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 12 23:00:22.578468 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 23:00:22.582375 containerd[1563]: time="2025-09-12T23:00:22.582191107Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:22.583534 containerd[1563]: time="2025-09-12T23:00:22.583484426Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117224"
Sep 12 23:00:22.584938 containerd[1563]: time="2025-09-12T23:00:22.584839867Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:22.588829 containerd[1563]: time="2025-09-12T23:00:22.588788508Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:22.590436 containerd[1563]: time="2025-09-12T23:00:22.589615991Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 1.893071514s"
Sep 12 23:00:22.590436 containerd[1563]: time="2025-09-12T23:00:22.589657710Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\""
Sep 12 23:00:22.591144 containerd[1563]: time="2025-09-12T23:00:22.591126320Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 12 23:00:22.717191 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:00:22.729459 (kubelet)[2126]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 23:00:22.782192 kubelet[2126]: E0912 23:00:22.782079 2126 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 23:00:22.786602 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 23:00:22.786852 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 23:00:22.787906 systemd[1]: kubelet.service: Consumed 159ms CPU time, 108.4M memory peak.
Sep 12 23:00:23.898091 containerd[1563]: time="2025-09-12T23:00:23.897975827Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:23.899362 containerd[1563]: time="2025-09-12T23:00:23.899280214Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716654"
Sep 12 23:00:23.900409 containerd[1563]: time="2025-09-12T23:00:23.900388595Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:23.903800 containerd[1563]: time="2025-09-12T23:00:23.902914676Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:23.903800 containerd[1563]: time="2025-09-12T23:00:23.903665458Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 1.312464278s"
Sep 12 23:00:23.903800 containerd[1563]: time="2025-09-12T23:00:23.903690512Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\""
Sep 12 23:00:23.904140 containerd[1563]: time="2025-09-12T23:00:23.904114887Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 12 23:00:25.144431 containerd[1563]: time="2025-09-12T23:00:25.144367972Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:25.145854 containerd[1563]: time="2025-09-12T23:00:25.145624517Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787720"
Sep 12 23:00:25.146894 containerd[1563]: time="2025-09-12T23:00:25.146866933Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:25.149799 containerd[1563]: time="2025-09-12T23:00:25.149778574Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:25.150688 containerd[1563]: time="2025-09-12T23:00:25.150657715Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 1.246518819s"
Sep 12 23:00:25.150737 containerd[1563]: time="2025-09-12T23:00:25.150693781Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\""
Sep 12 23:00:25.151423 containerd[1563]: time="2025-09-12T23:00:25.151398634Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 12 23:00:26.193162 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2232564231.mount: Deactivated successfully.
Sep 12 23:00:26.532189 containerd[1563]: time="2025-09-12T23:00:26.532071311Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:26.533430 containerd[1563]: time="2025-09-12T23:00:26.533303767Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410280"
Sep 12 23:00:26.534375 containerd[1563]: time="2025-09-12T23:00:26.534346752Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:26.536127 containerd[1563]: time="2025-09-12T23:00:26.536100134Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:26.536634 containerd[1563]: time="2025-09-12T23:00:26.536604768Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 1.38517426s"
Sep 12 23:00:26.536694 containerd[1563]: time="2025-09-12T23:00:26.536683591Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\""
Sep 12 23:00:26.537198 containerd[1563]: time="2025-09-12T23:00:26.537183604Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 12 23:00:27.084709 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4097798073.mount: Deactivated successfully.
Sep 12 23:00:28.092295 containerd[1563]: time="2025-09-12T23:00:28.092221835Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:28.093525 containerd[1563]: time="2025-09-12T23:00:28.093489581Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335"
Sep 12 23:00:28.095219 containerd[1563]: time="2025-09-12T23:00:28.095170415Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:28.099528 containerd[1563]: time="2025-09-12T23:00:28.099466965Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:28.100209 containerd[1563]: time="2025-09-12T23:00:28.100169083Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.562913956s"
Sep 12 23:00:28.100209 containerd[1563]: time="2025-09-12T23:00:28.100210172Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 12 23:00:28.101058 containerd[1563]: time="2025-09-12T23:00:28.100930183Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 23:00:28.602397 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1169574644.mount: Deactivated successfully.
Sep 12 23:00:28.613333 containerd[1563]: time="2025-09-12T23:00:28.613213944Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 23:00:28.614703 containerd[1563]: time="2025-09-12T23:00:28.614598021Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Sep 12 23:00:28.616173 containerd[1563]: time="2025-09-12T23:00:28.615899258Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 23:00:28.622086 containerd[1563]: time="2025-09-12T23:00:28.620578263Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 23:00:28.622304 containerd[1563]: time="2025-09-12T23:00:28.622228593Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 521.255535ms"
Sep 12 23:00:28.622304 containerd[1563]: time="2025-09-12T23:00:28.622284428Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 12 23:00:28.623471 containerd[1563]: time="2025-09-12T23:00:28.623400578Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 12 23:00:29.146777 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2934656680.mount: Deactivated successfully.
Sep 12 23:00:30.667569 containerd[1563]: time="2025-09-12T23:00:30.667508505Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:30.669101 containerd[1563]: time="2025-09-12T23:00:30.668847903Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910785"
Sep 12 23:00:30.670252 containerd[1563]: time="2025-09-12T23:00:30.670229378Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:30.674434 containerd[1563]: time="2025-09-12T23:00:30.674409044Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:00:30.675751 containerd[1563]: time="2025-09-12T23:00:30.675713843Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.052268469s"
Sep 12 23:00:30.675805 containerd[1563]: time="2025-09-12T23:00:30.675760510Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 12 23:00:32.826777 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 12 23:00:32.830189 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 23:00:32.969983 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:00:32.977465 (kubelet)[2288]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 23:00:33.021458 kubelet[2288]: E0912 23:00:33.021415 2288 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 23:00:33.024697 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 23:00:33.025145 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 23:00:33.025700 systemd[1]: kubelet.service: Consumed 129ms CPU time, 107.6M memory peak.
Sep 12 23:00:33.386507 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:00:33.386747 systemd[1]: kubelet.service: Consumed 129ms CPU time, 107.6M memory peak.
Sep 12 23:00:33.390610 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 23:00:33.423946 systemd[1]: Reload requested from client PID 2302 ('systemctl') (unit session-7.scope)...
Sep 12 23:00:33.424114 systemd[1]: Reloading...
Sep 12 23:00:33.518066 zram_generator::config[2343]: No configuration found.
Sep 12 23:00:33.725210 systemd[1]: Reloading finished in 300 ms.
Sep 12 23:00:33.768206 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 23:00:33.768269 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 23:00:33.768506 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:00:33.768542 systemd[1]: kubelet.service: Consumed 102ms CPU time, 98.3M memory peak.
Sep 12 23:00:33.769820 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 23:00:33.914266 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 23:00:33.918906 (kubelet)[2398]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 23:00:33.962187 kubelet[2398]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 23:00:33.962553 kubelet[2398]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 12 23:00:33.962602 kubelet[2398]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 23:00:33.964622 kubelet[2398]: I0912 23:00:33.964589 2398 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 23:00:34.244240 kubelet[2398]: I0912 23:00:34.244181 2398 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 12 23:00:34.244240 kubelet[2398]: I0912 23:00:34.244234 2398 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 23:00:34.244745 kubelet[2398]: I0912 23:00:34.244720 2398 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 12 23:00:34.290612 kubelet[2398]: I0912 23:00:34.290254 2398 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 23:00:34.291006 kubelet[2398]: E0912 23:00:34.290919 2398 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://46.62.198.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError"
Sep 12 23:00:34.303311 kubelet[2398]: I0912 23:00:34.303270 2398 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 23:00:34.310169 kubelet[2398]: I0912 23:00:34.310123 2398 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 23:00:34.310967 kubelet[2398]: I0912 23:00:34.310929 2398 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 12 23:00:34.311155 kubelet[2398]: I0912 23:00:34.311116 2398 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 23:00:34.311337 kubelet[2398]: I0912 23:00:34.311145 2398 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-5-bd33a49fb7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 23:00:34.311337 kubelet[2398]: I0912 23:00:34.311335 2398 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 23:00:34.311498 kubelet[2398]: I0912 23:00:34.311347 2398 container_manager_linux.go:300] "Creating device plugin manager"
Sep 12 23:00:34.311498 kubelet[2398]: I0912 23:00:34.311460 2398 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 23:00:34.314755 kubelet[2398]: I0912 23:00:34.314437 2398 kubelet.go:408] "Attempting to sync node with API server"
Sep 12 23:00:34.314755 kubelet[2398]: I0912 23:00:34.314462 2398 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 23:00:34.314755 kubelet[2398]: I0912 23:00:34.314493 2398 kubelet.go:314] "Adding apiserver pod source"
Sep 12 23:00:34.314755 kubelet[2398]: I0912 23:00:34.314507 2398 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 23:00:34.321851 kubelet[2398]: I0912 23:00:34.321818 2398 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 23:00:34.327491 kubelet[2398]: I0912 23:00:34.327461 2398 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 23:00:34.327702 kubelet[2398]: W0912 23:00:34.327686 2398 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 23:00:34.328702 kubelet[2398]: I0912 23:00:34.328682 2398 server.go:1274] "Started kubelet"
Sep 12 23:00:34.329115 kubelet[2398]: W0912 23:00:34.329017 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://46.62.198.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-5-bd33a49fb7&limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused
Sep 12 23:00:34.329276 kubelet[2398]: E0912 23:00:34.329246 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://46.62.198.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-5-bd33a49fb7&limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError"
Sep 12 23:00:34.331094 kubelet[2398]: I0912 23:00:34.330422 2398 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 23:00:34.332417 kubelet[2398]: W0912 23:00:34.332287 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://46.62.198.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused
Sep 12 23:00:34.332417 kubelet[2398]: E0912 23:00:34.332390 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://46.62.198.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError"
Sep 12 23:00:34.332831 kubelet[2398]: I0912 23:00:34.332694 2398 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 23:00:34.333151 kubelet[2398]: I0912 23:00:34.333120 2398 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 12 23:00:34.333918 kubelet[2398]: I0912 23:00:34.333900 2398 server.go:449] "Adding debug handlers to kubelet server"
Sep 12 23:00:34.339551 kubelet[2398]: I0912 23:00:34.339525 2398 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 12 23:00:34.349891 kubelet[2398]: E0912 23:00:34.344920 2398 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://46.62.198.104:6443/api/v1/namespaces/default/events\": dial tcp 46.62.198.104:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-0-0-5-bd33a49fb7.1864ab3d4b7505cf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-0-0-5-bd33a49fb7,UID:ci-4459-0-0-5-bd33a49fb7,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-5-bd33a49fb7,},FirstTimestamp:2025-09-12 23:00:34.328651215 +0000 UTC m=+0.405723357,LastTimestamp:2025-09-12 23:00:34.328651215 +0000 UTC m=+0.405723357,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-5-bd33a49fb7,}"
Sep 12 23:00:34.349891 kubelet[2398]: I0912 23:00:34.347985 2398 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 12 23:00:34.352136 kubelet[2398]: I0912 23:00:34.351155 2398 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 12 23:00:34.352136 kubelet[2398]: E0912 23:00:34.351450 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-5-bd33a49fb7\" not found"
Sep 12 23:00:34.352744 kubelet[2398]: E0912 23:00:34.352696 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.198.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-5-bd33a49fb7?timeout=10s\": dial tcp 46.62.198.104:6443: connect: connection refused" interval="200ms"
Sep 12 23:00:34.353084 kubelet[2398]: I0912 23:00:34.353057 2398 factory.go:221] Registration of the systemd container factory successfully
Sep 12 23:00:34.353157 kubelet[2398]: I0912 23:00:34.353137 2398 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 12 23:00:34.354813 kubelet[2398]: I0912 23:00:34.354781 2398 reconciler.go:26] "Reconciler: start to sync state"
Sep 12 23:00:34.354887 kubelet[2398]: I0912 23:00:34.354836 2398 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 12 23:00:34.355336 kubelet[2398]: W0912 23:00:34.355286 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://46.62.198.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused
Sep 12 23:00:34.355426 kubelet[2398]: E0912 23:00:34.355333 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://46.62.198.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError"
Sep 12 23:00:34.356203 kubelet[2398]: I0912 23:00:34.356174 2398 factory.go:221] Registration of the containerd container factory successfully
Sep 12 23:00:34.357871 kubelet[2398]: E0912 23:00:34.357834 2398 kubelet.go:1478] "Image garbage collection failed once.
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 23:00:34.376437 kubelet[2398]: I0912 23:00:34.376380 2398 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 23:00:34.378013 kubelet[2398]: I0912 23:00:34.377987 2398 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 23:00:34.378179 kubelet[2398]: I0912 23:00:34.378159 2398 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 23:00:34.378293 kubelet[2398]: I0912 23:00:34.378282 2398 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 23:00:34.378433 kubelet[2398]: E0912 23:00:34.378411 2398 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 23:00:34.379541 kubelet[2398]: I0912 23:00:34.379513 2398 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 23:00:34.379541 kubelet[2398]: I0912 23:00:34.379531 2398 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 23:00:34.379639 kubelet[2398]: I0912 23:00:34.379549 2398 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:00:34.383630 kubelet[2398]: I0912 23:00:34.383596 2398 policy_none.go:49] "None policy: Start" Sep 12 23:00:34.384444 kubelet[2398]: W0912 23:00:34.384390 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://46.62.198.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused Sep 12 23:00:34.384519 kubelet[2398]: E0912 23:00:34.384445 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://46.62.198.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: 
connection refused" logger="UnhandledError" Sep 12 23:00:34.384814 kubelet[2398]: I0912 23:00:34.384779 2398 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 23:00:34.384814 kubelet[2398]: I0912 23:00:34.384806 2398 state_mem.go:35] "Initializing new in-memory state store" Sep 12 23:00:34.396558 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 23:00:34.411597 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 23:00:34.415071 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 23:00:34.429115 kubelet[2398]: I0912 23:00:34.429027 2398 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 23:00:34.429396 kubelet[2398]: I0912 23:00:34.429358 2398 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 23:00:34.429443 kubelet[2398]: I0912 23:00:34.429390 2398 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 23:00:34.430263 kubelet[2398]: I0912 23:00:34.430089 2398 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 23:00:34.432023 kubelet[2398]: E0912 23:00:34.431980 2398 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-0-0-5-bd33a49fb7\" not found" Sep 12 23:00:34.494714 systemd[1]: Created slice kubepods-burstable-pod4911ce518e34ace21e2d4f5064fa6e92.slice - libcontainer container kubepods-burstable-pod4911ce518e34ace21e2d4f5064fa6e92.slice. Sep 12 23:00:34.529437 systemd[1]: Created slice kubepods-burstable-podec0deae82b4ac3f1e1285b8502252048.slice - libcontainer container kubepods-burstable-podec0deae82b4ac3f1e1285b8502252048.slice. 
Sep 12 23:00:34.533694 kubelet[2398]: I0912 23:00:34.533630 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.534757 kubelet[2398]: E0912 23:00:34.534700 2398 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://46.62.198.104:6443/api/v1/nodes\": dial tcp 46.62.198.104:6443: connect: connection refused" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.536071 systemd[1]: Created slice kubepods-burstable-podac3e2ae3cf5e560a9e257c3f1fc75e01.slice - libcontainer container kubepods-burstable-podac3e2ae3cf5e560a9e257c3f1fc75e01.slice. Sep 12 23:00:34.553597 kubelet[2398]: E0912 23:00:34.553542 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.198.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-5-bd33a49fb7?timeout=10s\": dial tcp 46.62.198.104:6443: connect: connection refused" interval="400ms" Sep 12 23:00:34.656464 kubelet[2398]: I0912 23:00:34.656316 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4911ce518e34ace21e2d4f5064fa6e92-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-5-bd33a49fb7\" (UID: \"4911ce518e34ace21e2d4f5064fa6e92\") " pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.656464 kubelet[2398]: I0912 23:00:34.656421 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4911ce518e34ace21e2d4f5064fa6e92-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-5-bd33a49fb7\" (UID: \"4911ce518e34ace21e2d4f5064fa6e92\") " pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.656464 kubelet[2398]: I0912 23:00:34.656469 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.656828 kubelet[2398]: I0912 23:00:34.656506 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.656828 kubelet[2398]: I0912 23:00:34.656549 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-kubeconfig\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.656828 kubelet[2398]: I0912 23:00:34.656582 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4911ce518e34ace21e2d4f5064fa6e92-k8s-certs\") pod \"kube-apiserver-ci-4459-0-0-5-bd33a49fb7\" (UID: \"4911ce518e34ace21e2d4f5064fa6e92\") " pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.656828 kubelet[2398]: I0912 23:00:34.656617 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.656828 
kubelet[2398]: I0912 23:00:34.656655 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.657185 kubelet[2398]: I0912 23:00:34.656694 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ac3e2ae3cf5e560a9e257c3f1fc75e01-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ac3e2ae3cf5e560a9e257c3f1fc75e01\") " pod="kube-system/kube-scheduler-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.738926 kubelet[2398]: I0912 23:00:34.738880 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.739396 kubelet[2398]: E0912 23:00:34.739328 2398 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://46.62.198.104:6443/api/v1/nodes\": dial tcp 46.62.198.104:6443: connect: connection refused" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:34.824664 containerd[1563]: time="2025-09-12T23:00:34.824477305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-5-bd33a49fb7,Uid:4911ce518e34ace21e2d4f5064fa6e92,Namespace:kube-system,Attempt:0,}" Sep 12 23:00:34.839582 containerd[1563]: time="2025-09-12T23:00:34.839521413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-5-bd33a49fb7,Uid:ec0deae82b4ac3f1e1285b8502252048,Namespace:kube-system,Attempt:0,}" Sep 12 23:00:34.841783 containerd[1563]: time="2025-09-12T23:00:34.841693396Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-5-bd33a49fb7,Uid:ac3e2ae3cf5e560a9e257c3f1fc75e01,Namespace:kube-system,Attempt:0,}" Sep 12 23:00:34.955233 kubelet[2398]: E0912 23:00:34.955132 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.198.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-5-bd33a49fb7?timeout=10s\": dial tcp 46.62.198.104:6443: connect: connection refused" interval="800ms" Sep 12 23:00:34.988392 containerd[1563]: time="2025-09-12T23:00:34.988270337Z" level=info msg="connecting to shim 09ed218eae56404898232043adb802ea97fb508f47016896390603b2cd7a92e5" address="unix:///run/containerd/s/1752c538c7f5da4930c646e14036359e0163a6f892dc20d144a1ff870c50cabd" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:35.008975 containerd[1563]: time="2025-09-12T23:00:35.007266279Z" level=info msg="connecting to shim 4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729" address="unix:///run/containerd/s/aca000276a9f8873857c49c201e16eb47224c1c103e0fcf7d84fb3f9c43a0442" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:35.022288 containerd[1563]: time="2025-09-12T23:00:35.022246212Z" level=info msg="connecting to shim c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49" address="unix:///run/containerd/s/f957353839717f1a503b73ae4e621e8945afbade56f3648175cef0f197e38304" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:35.100420 systemd[1]: Started cri-containerd-c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49.scope - libcontainer container c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49. Sep 12 23:00:35.107182 systemd[1]: Started cri-containerd-09ed218eae56404898232043adb802ea97fb508f47016896390603b2cd7a92e5.scope - libcontainer container 09ed218eae56404898232043adb802ea97fb508f47016896390603b2cd7a92e5. 
Sep 12 23:00:35.109099 systemd[1]: Started cri-containerd-4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729.scope - libcontainer container 4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729. Sep 12 23:00:35.143510 kubelet[2398]: I0912 23:00:35.143483 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:35.143915 kubelet[2398]: E0912 23:00:35.143805 2398 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://46.62.198.104:6443/api/v1/nodes\": dial tcp 46.62.198.104:6443: connect: connection refused" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:35.171012 containerd[1563]: time="2025-09-12T23:00:35.170973337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-5-bd33a49fb7,Uid:ec0deae82b4ac3f1e1285b8502252048,Namespace:kube-system,Attempt:0,} returns sandbox id \"4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729\"" Sep 12 23:00:35.172386 containerd[1563]: time="2025-09-12T23:00:35.172333214Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-5-bd33a49fb7,Uid:ac3e2ae3cf5e560a9e257c3f1fc75e01,Namespace:kube-system,Attempt:0,} returns sandbox id \"c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49\"" Sep 12 23:00:35.175945 containerd[1563]: time="2025-09-12T23:00:35.175900162Z" level=info msg="CreateContainer within sandbox \"c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 23:00:35.176172 containerd[1563]: time="2025-09-12T23:00:35.175934656Z" level=info msg="CreateContainer within sandbox \"4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 23:00:35.177691 kubelet[2398]: W0912 23:00:35.177592 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.CSIDriver: Get "https://46.62.198.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused Sep 12 23:00:35.177691 kubelet[2398]: E0912 23:00:35.177667 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://46.62.198.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:00:35.194382 containerd[1563]: time="2025-09-12T23:00:35.194326597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-5-bd33a49fb7,Uid:4911ce518e34ace21e2d4f5064fa6e92,Namespace:kube-system,Attempt:0,} returns sandbox id \"09ed218eae56404898232043adb802ea97fb508f47016896390603b2cd7a92e5\"" Sep 12 23:00:35.195124 kubelet[2398]: W0912 23:00:35.194889 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://46.62.198.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused Sep 12 23:00:35.195124 kubelet[2398]: E0912 23:00:35.194977 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://46.62.198.104:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:00:35.196746 containerd[1563]: time="2025-09-12T23:00:35.196703425Z" level=info msg="CreateContainer within sandbox \"09ed218eae56404898232043adb802ea97fb508f47016896390603b2cd7a92e5\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 23:00:35.205146 containerd[1563]: time="2025-09-12T23:00:35.205031017Z" level=info 
msg="Container a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:35.209003 containerd[1563]: time="2025-09-12T23:00:35.208430313Z" level=info msg="Container 760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:35.213635 containerd[1563]: time="2025-09-12T23:00:35.213595069Z" level=info msg="Container 64ad0477256a33560c5f75564cd4922ea089c603b67355015064e1014dc82b57: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:35.221535 containerd[1563]: time="2025-09-12T23:00:35.221485670Z" level=info msg="CreateContainer within sandbox \"4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b\"" Sep 12 23:00:35.227383 containerd[1563]: time="2025-09-12T23:00:35.222145096Z" level=info msg="StartContainer for \"a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b\"" Sep 12 23:00:35.228571 containerd[1563]: time="2025-09-12T23:00:35.228517605Z" level=info msg="connecting to shim a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b" address="unix:///run/containerd/s/aca000276a9f8873857c49c201e16eb47224c1c103e0fcf7d84fb3f9c43a0442" protocol=ttrpc version=3 Sep 12 23:00:35.238032 containerd[1563]: time="2025-09-12T23:00:35.237421668Z" level=info msg="CreateContainer within sandbox \"09ed218eae56404898232043adb802ea97fb508f47016896390603b2cd7a92e5\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"64ad0477256a33560c5f75564cd4922ea089c603b67355015064e1014dc82b57\"" Sep 12 23:00:35.238154 containerd[1563]: time="2025-09-12T23:00:35.238085502Z" level=info msg="CreateContainer within sandbox \"c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns 
container id \"760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5\"" Sep 12 23:00:35.238651 containerd[1563]: time="2025-09-12T23:00:35.238619404Z" level=info msg="StartContainer for \"760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5\"" Sep 12 23:00:35.239686 containerd[1563]: time="2025-09-12T23:00:35.239650981Z" level=info msg="connecting to shim 760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5" address="unix:///run/containerd/s/f957353839717f1a503b73ae4e621e8945afbade56f3648175cef0f197e38304" protocol=ttrpc version=3 Sep 12 23:00:35.240853 containerd[1563]: time="2025-09-12T23:00:35.240095067Z" level=info msg="StartContainer for \"64ad0477256a33560c5f75564cd4922ea089c603b67355015064e1014dc82b57\"" Sep 12 23:00:35.241915 containerd[1563]: time="2025-09-12T23:00:35.241449864Z" level=info msg="connecting to shim 64ad0477256a33560c5f75564cd4922ea089c603b67355015064e1014dc82b57" address="unix:///run/containerd/s/1752c538c7f5da4930c646e14036359e0163a6f892dc20d144a1ff870c50cabd" protocol=ttrpc version=3 Sep 12 23:00:35.246897 systemd[1]: Started cri-containerd-a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b.scope - libcontainer container a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b. 
Sep 12 23:00:35.259636 kubelet[2398]: W0912 23:00:35.259439 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://46.62.198.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-5-bd33a49fb7&limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused Sep 12 23:00:35.259636 kubelet[2398]: E0912 23:00:35.259614 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://46.62.198.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-5-bd33a49fb7&limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:00:35.272271 systemd[1]: Started cri-containerd-760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5.scope - libcontainer container 760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5. Sep 12 23:00:35.276577 systemd[1]: Started cri-containerd-64ad0477256a33560c5f75564cd4922ea089c603b67355015064e1014dc82b57.scope - libcontainer container 64ad0477256a33560c5f75564cd4922ea089c603b67355015064e1014dc82b57. 
Sep 12 23:00:35.322777 containerd[1563]: time="2025-09-12T23:00:35.322695158Z" level=info msg="StartContainer for \"a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b\" returns successfully" Sep 12 23:00:35.349783 containerd[1563]: time="2025-09-12T23:00:35.349726312Z" level=info msg="StartContainer for \"64ad0477256a33560c5f75564cd4922ea089c603b67355015064e1014dc82b57\" returns successfully" Sep 12 23:00:35.364567 kubelet[2398]: W0912 23:00:35.363656 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://46.62.198.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 46.62.198.104:6443: connect: connection refused Sep 12 23:00:35.364567 kubelet[2398]: E0912 23:00:35.364496 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://46.62.198.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.62.198.104:6443: connect: connection refused" logger="UnhandledError" Sep 12 23:00:35.374604 containerd[1563]: time="2025-09-12T23:00:35.374540717Z" level=info msg="StartContainer for \"760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5\" returns successfully" Sep 12 23:00:35.946407 kubelet[2398]: I0912 23:00:35.945950 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:37.127416 kubelet[2398]: E0912 23:00:37.127336 2398 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-0-0-5-bd33a49fb7\" not found" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:37.234982 kubelet[2398]: I0912 23:00:37.234914 2398 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:37.234982 kubelet[2398]: E0912 23:00:37.234966 2398 kubelet_node_status.go:535] "Error updating node status, will 
retry" err="error getting node \"ci-4459-0-0-5-bd33a49fb7\": node \"ci-4459-0-0-5-bd33a49fb7\" not found" Sep 12 23:00:37.248179 kubelet[2398]: E0912 23:00:37.248033 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-5-bd33a49fb7\" not found" Sep 12 23:00:37.349230 kubelet[2398]: E0912 23:00:37.349180 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-5-bd33a49fb7\" not found" Sep 12 23:00:37.449901 kubelet[2398]: E0912 23:00:37.449749 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-5-bd33a49fb7\" not found" Sep 12 23:00:38.333202 kubelet[2398]: I0912 23:00:38.333122 2398 apiserver.go:52] "Watching apiserver" Sep 12 23:00:38.355088 kubelet[2398]: I0912 23:00:38.355013 2398 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 23:00:39.425457 systemd[1]: Reload requested from client PID 2668 ('systemctl') (unit session-7.scope)... Sep 12 23:00:39.425482 systemd[1]: Reloading... Sep 12 23:00:39.560155 zram_generator::config[2712]: No configuration found. Sep 12 23:00:39.818217 systemd[1]: Reloading finished in 392 ms. Sep 12 23:00:39.851332 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:39.872812 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 23:00:39.873093 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 23:00:39.873144 systemd[1]: kubelet.service: Consumed 805ms CPU time, 126.1M memory peak. Sep 12 23:00:39.875357 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 23:00:40.013406 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 12 23:00:40.022375 (kubelet)[2763]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 23:00:40.080068 kubelet[2763]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:00:40.080068 kubelet[2763]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 23:00:40.080068 kubelet[2763]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 23:00:40.080068 kubelet[2763]: I0912 23:00:40.079709 2763 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 23:00:40.088575 kubelet[2763]: I0912 23:00:40.088547 2763 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 23:00:40.088721 kubelet[2763]: I0912 23:00:40.088714 2763 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 23:00:40.088990 kubelet[2763]: I0912 23:00:40.088981 2763 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 23:00:40.090329 kubelet[2763]: I0912 23:00:40.090315 2763 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 12 23:00:40.092183 kubelet[2763]: I0912 23:00:40.092170 2763 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 23:00:40.097677 kubelet[2763]: I0912 23:00:40.097654 2763 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 23:00:40.103825 kubelet[2763]: I0912 23:00:40.103146 2763 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 12 23:00:40.103825 kubelet[2763]: I0912 23:00:40.103326 2763 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 23:00:40.103825 kubelet[2763]: I0912 23:00:40.103431 2763 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 23:00:40.103825 kubelet[2763]: I0912 23:00:40.103459 2763 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-5-bd33a49fb7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{
"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 23:00:40.104093 kubelet[2763]: I0912 23:00:40.103688 2763 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 23:00:40.104093 kubelet[2763]: I0912 23:00:40.103699 2763 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 23:00:40.104093 kubelet[2763]: I0912 23:00:40.103729 2763 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:00:40.104187 kubelet[2763]: I0912 23:00:40.104178 2763 kubelet.go:408] "Attempting to sync node with API server" Sep 12 23:00:40.104422 kubelet[2763]: I0912 23:00:40.104412 2763 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 23:00:40.104491 kubelet[2763]: I0912 23:00:40.104486 2763 kubelet.go:314] "Adding apiserver pod source" Sep 12 23:00:40.104539 kubelet[2763]: I0912 23:00:40.104530 2763 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 23:00:40.108155 kubelet[2763]: I0912 23:00:40.108142 2763 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 23:00:40.108575 kubelet[2763]: I0912 23:00:40.108563 2763 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 23:00:40.110064 kubelet[2763]: I0912 23:00:40.108948 2763 server.go:1274] "Started kubelet" Sep 12 23:00:40.114902 kubelet[2763]: I0912 23:00:40.114874 2763 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 
12 23:00:40.127119 kubelet[2763]: I0912 23:00:40.125359 2763 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 23:00:40.130566 kubelet[2763]: I0912 23:00:40.130546 2763 server.go:449] "Adding debug handlers to kubelet server" Sep 12 23:00:40.132172 kubelet[2763]: I0912 23:00:40.132117 2763 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 23:00:40.132376 kubelet[2763]: I0912 23:00:40.132319 2763 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 23:00:40.132698 kubelet[2763]: I0912 23:00:40.132569 2763 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 23:00:40.138147 kubelet[2763]: I0912 23:00:40.138125 2763 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 23:00:40.138384 kubelet[2763]: I0912 23:00:40.138373 2763 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 23:00:40.138529 kubelet[2763]: I0912 23:00:40.138518 2763 reconciler.go:26] "Reconciler: start to sync state" Sep 12 23:00:40.140395 kubelet[2763]: E0912 23:00:40.140374 2763 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 23:00:40.141346 kubelet[2763]: I0912 23:00:40.141321 2763 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 23:00:40.141570 kubelet[2763]: I0912 23:00:40.141492 2763 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 23:00:40.142513 kubelet[2763]: I0912 23:00:40.142498 2763 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 23:00:40.142590 kubelet[2763]: I0912 23:00:40.142582 2763 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 23:00:40.142993 kubelet[2763]: I0912 23:00:40.142745 2763 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 23:00:40.142993 kubelet[2763]: E0912 23:00:40.142789 2763 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 23:00:40.144737 kubelet[2763]: I0912 23:00:40.144712 2763 factory.go:221] Registration of the containerd container factory successfully Sep 12 23:00:40.144737 kubelet[2763]: I0912 23:00:40.144729 2763 factory.go:221] Registration of the systemd container factory successfully Sep 12 23:00:40.193656 kubelet[2763]: I0912 23:00:40.193589 2763 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 23:00:40.193656 kubelet[2763]: I0912 23:00:40.193641 2763 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 23:00:40.193656 kubelet[2763]: I0912 23:00:40.193662 2763 state_mem.go:36] "Initialized new in-memory state store" Sep 12 23:00:40.193917 kubelet[2763]: I0912 23:00:40.193882 2763 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 23:00:40.193961 kubelet[2763]: I0912 23:00:40.193908 2763 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 23:00:40.193961 kubelet[2763]: I0912 23:00:40.193941 2763 policy_none.go:49] "None policy: Start" Sep 12 23:00:40.194825 kubelet[2763]: I0912 23:00:40.194791 2763 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 23:00:40.194825 kubelet[2763]: I0912 23:00:40.194818 2763 state_mem.go:35] "Initializing new in-memory state store" Sep 12 23:00:40.195098 kubelet[2763]: I0912 23:00:40.195083 2763 state_mem.go:75] "Updated machine memory state" Sep 12 23:00:40.200217 kubelet[2763]: I0912 23:00:40.199584 2763 manager.go:513] "Failed to read data from checkpoint" 
checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 23:00:40.200217 kubelet[2763]: I0912 23:00:40.199720 2763 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 23:00:40.200217 kubelet[2763]: I0912 23:00:40.199737 2763 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 23:00:40.200350 kubelet[2763]: I0912 23:00:40.200273 2763 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 23:00:40.315818 kubelet[2763]: I0912 23:00:40.315728 2763 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.327734 kubelet[2763]: I0912 23:00:40.327669 2763 kubelet_node_status.go:111] "Node was previously registered" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.328376 kubelet[2763]: I0912 23:00:40.328252 2763 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340616 kubelet[2763]: I0912 23:00:40.339407 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4911ce518e34ace21e2d4f5064fa6e92-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-5-bd33a49fb7\" (UID: \"4911ce518e34ace21e2d4f5064fa6e92\") " pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340616 kubelet[2763]: I0912 23:00:40.339466 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4911ce518e34ace21e2d4f5064fa6e92-k8s-certs\") pod \"kube-apiserver-ci-4459-0-0-5-bd33a49fb7\" (UID: \"4911ce518e34ace21e2d4f5064fa6e92\") " pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340616 kubelet[2763]: I0912 23:00:40.339500 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340616 kubelet[2763]: I0912 23:00:40.339529 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-kubeconfig\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340616 kubelet[2763]: I0912 23:00:40.339558 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ac3e2ae3cf5e560a9e257c3f1fc75e01-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ac3e2ae3cf5e560a9e257c3f1fc75e01\") " pod="kube-system/kube-scheduler-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340925 kubelet[2763]: I0912 23:00:40.339586 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4911ce518e34ace21e2d4f5064fa6e92-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-5-bd33a49fb7\" (UID: \"4911ce518e34ace21e2d4f5064fa6e92\") " pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340925 kubelet[2763]: I0912 23:00:40.339619 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340925 kubelet[2763]: I0912 
23:00:40.339652 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:40.340925 kubelet[2763]: I0912 23:00:40.339682 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ec0deae82b4ac3f1e1285b8502252048-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" (UID: \"ec0deae82b4ac3f1e1285b8502252048\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:41.105995 kubelet[2763]: I0912 23:00:41.105908 2763 apiserver.go:52] "Watching apiserver" Sep 12 23:00:41.138741 kubelet[2763]: I0912 23:00:41.138674 2763 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 23:00:41.192714 kubelet[2763]: E0912 23:00:41.192304 2763 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4459-0-0-5-bd33a49fb7\" already exists" pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:41.193508 kubelet[2763]: E0912 23:00:41.193370 2763 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4459-0-0-5-bd33a49fb7\" already exists" pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" Sep 12 23:00:41.241536 kubelet[2763]: I0912 23:00:41.241477 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-0-0-5-bd33a49fb7" podStartSLOduration=1.24143632 podStartE2EDuration="1.24143632s" podCreationTimestamp="2025-09-12 23:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:00:41.227987469 +0000 UTC m=+1.201141918" watchObservedRunningTime="2025-09-12 23:00:41.24143632 +0000 UTC m=+1.214590760" Sep 12 23:00:41.256038 kubelet[2763]: I0912 23:00:41.255865 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-0-0-5-bd33a49fb7" podStartSLOduration=1.255844938 podStartE2EDuration="1.255844938s" podCreationTimestamp="2025-09-12 23:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:00:41.253598895 +0000 UTC m=+1.226753335" watchObservedRunningTime="2025-09-12 23:00:41.255844938 +0000 UTC m=+1.228999377" Sep 12 23:00:41.256038 kubelet[2763]: I0912 23:00:41.255948 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-0-0-5-bd33a49fb7" podStartSLOduration=1.255942991 podStartE2EDuration="1.255942991s" podCreationTimestamp="2025-09-12 23:00:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:00:41.242213648 +0000 UTC m=+1.215368087" watchObservedRunningTime="2025-09-12 23:00:41.255942991 +0000 UTC m=+1.229097440" Sep 12 23:00:44.194303 update_engine[1537]: I20250912 23:00:44.194164 1537 update_attempter.cc:509] Updating boot flags... Sep 12 23:00:45.868370 kubelet[2763]: I0912 23:00:45.868321 2763 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 23:00:45.869202 kubelet[2763]: I0912 23:00:45.868770 2763 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 23:00:45.869265 containerd[1563]: time="2025-09-12T23:00:45.868603940Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Sep 12 23:00:46.802008 systemd[1]: Created slice kubepods-besteffort-pod24b92c39_35eb_4a8e_be63_75ecf760556b.slice - libcontainer container kubepods-besteffort-pod24b92c39_35eb_4a8e_be63_75ecf760556b.slice. Sep 12 23:00:46.885361 kubelet[2763]: I0912 23:00:46.885245 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/24b92c39-35eb-4a8e-be63-75ecf760556b-kube-proxy\") pod \"kube-proxy-4fpbk\" (UID: \"24b92c39-35eb-4a8e-be63-75ecf760556b\") " pod="kube-system/kube-proxy-4fpbk" Sep 12 23:00:46.885361 kubelet[2763]: I0912 23:00:46.885277 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/24b92c39-35eb-4a8e-be63-75ecf760556b-xtables-lock\") pod \"kube-proxy-4fpbk\" (UID: \"24b92c39-35eb-4a8e-be63-75ecf760556b\") " pod="kube-system/kube-proxy-4fpbk" Sep 12 23:00:46.885361 kubelet[2763]: I0912 23:00:46.885294 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/24b92c39-35eb-4a8e-be63-75ecf760556b-lib-modules\") pod \"kube-proxy-4fpbk\" (UID: \"24b92c39-35eb-4a8e-be63-75ecf760556b\") " pod="kube-system/kube-proxy-4fpbk" Sep 12 23:00:46.885361 kubelet[2763]: I0912 23:00:46.885308 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tv9c\" (UniqueName: \"kubernetes.io/projected/24b92c39-35eb-4a8e-be63-75ecf760556b-kube-api-access-4tv9c\") pod \"kube-proxy-4fpbk\" (UID: \"24b92c39-35eb-4a8e-be63-75ecf760556b\") " pod="kube-system/kube-proxy-4fpbk" Sep 12 23:00:46.969340 systemd[1]: Created slice kubepods-besteffort-podf4f6e8a3_1388_4edf_9c6a_167733fd1f77.slice - libcontainer container kubepods-besteffort-podf4f6e8a3_1388_4edf_9c6a_167733fd1f77.slice. 
Sep 12 23:00:46.986345 kubelet[2763]: I0912 23:00:46.986264 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f4f6e8a3-1388-4edf-9c6a-167733fd1f77-var-lib-calico\") pod \"tigera-operator-58fc44c59b-v76mm\" (UID: \"f4f6e8a3-1388-4edf-9c6a-167733fd1f77\") " pod="tigera-operator/tigera-operator-58fc44c59b-v76mm" Sep 12 23:00:46.986345 kubelet[2763]: I0912 23:00:46.986349 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hcd7\" (UniqueName: \"kubernetes.io/projected/f4f6e8a3-1388-4edf-9c6a-167733fd1f77-kube-api-access-7hcd7\") pod \"tigera-operator-58fc44c59b-v76mm\" (UID: \"f4f6e8a3-1388-4edf-9c6a-167733fd1f77\") " pod="tigera-operator/tigera-operator-58fc44c59b-v76mm" Sep 12 23:00:47.111543 containerd[1563]: time="2025-09-12T23:00:47.111459522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4fpbk,Uid:24b92c39-35eb-4a8e-be63-75ecf760556b,Namespace:kube-system,Attempt:0,}" Sep 12 23:00:47.144839 containerd[1563]: time="2025-09-12T23:00:47.144580148Z" level=info msg="connecting to shim 0cc950c51da9f3feb34f4ba62b57e5f06bb8fd82c94163d108db52794c3d6783" address="unix:///run/containerd/s/0437d00e110b68f74be96f3a36528fd6913bf5ab2fd746249c0a8ce3c6bfcb09" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:47.177523 systemd[1]: Started cri-containerd-0cc950c51da9f3feb34f4ba62b57e5f06bb8fd82c94163d108db52794c3d6783.scope - libcontainer container 0cc950c51da9f3feb34f4ba62b57e5f06bb8fd82c94163d108db52794c3d6783. 
Sep 12 23:00:47.225054 containerd[1563]: time="2025-09-12T23:00:47.224886654Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4fpbk,Uid:24b92c39-35eb-4a8e-be63-75ecf760556b,Namespace:kube-system,Attempt:0,} returns sandbox id \"0cc950c51da9f3feb34f4ba62b57e5f06bb8fd82c94163d108db52794c3d6783\"" Sep 12 23:00:47.228312 containerd[1563]: time="2025-09-12T23:00:47.228270619Z" level=info msg="CreateContainer within sandbox \"0cc950c51da9f3feb34f4ba62b57e5f06bb8fd82c94163d108db52794c3d6783\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 23:00:47.242001 containerd[1563]: time="2025-09-12T23:00:47.241938249Z" level=info msg="Container ce5f454461cca631121998cfe8b10d46df19b381d69794fb973a540e9bd9827e: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:47.255378 containerd[1563]: time="2025-09-12T23:00:47.255307824Z" level=info msg="CreateContainer within sandbox \"0cc950c51da9f3feb34f4ba62b57e5f06bb8fd82c94163d108db52794c3d6783\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ce5f454461cca631121998cfe8b10d46df19b381d69794fb973a540e9bd9827e\"" Sep 12 23:00:47.256241 containerd[1563]: time="2025-09-12T23:00:47.256153662Z" level=info msg="StartContainer for \"ce5f454461cca631121998cfe8b10d46df19b381d69794fb973a540e9bd9827e\"" Sep 12 23:00:47.258189 containerd[1563]: time="2025-09-12T23:00:47.258158773Z" level=info msg="connecting to shim ce5f454461cca631121998cfe8b10d46df19b381d69794fb973a540e9bd9827e" address="unix:///run/containerd/s/0437d00e110b68f74be96f3a36528fd6913bf5ab2fd746249c0a8ce3c6bfcb09" protocol=ttrpc version=3 Sep 12 23:00:47.275005 containerd[1563]: time="2025-09-12T23:00:47.274931829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-v76mm,Uid:f4f6e8a3-1388-4edf-9c6a-167733fd1f77,Namespace:tigera-operator,Attempt:0,}" Sep 12 23:00:47.290397 systemd[1]: Started cri-containerd-ce5f454461cca631121998cfe8b10d46df19b381d69794fb973a540e9bd9827e.scope - 
libcontainer container ce5f454461cca631121998cfe8b10d46df19b381d69794fb973a540e9bd9827e. Sep 12 23:00:47.301251 containerd[1563]: time="2025-09-12T23:00:47.300593548Z" level=info msg="connecting to shim bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88" address="unix:///run/containerd/s/602b9e110756cc263263b6d42721e3ca3ddabe4e278c3ed97aed497669c883df" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:00:47.332614 systemd[1]: Started cri-containerd-bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88.scope - libcontainer container bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88. Sep 12 23:00:47.358637 containerd[1563]: time="2025-09-12T23:00:47.358546837Z" level=info msg="StartContainer for \"ce5f454461cca631121998cfe8b10d46df19b381d69794fb973a540e9bd9827e\" returns successfully" Sep 12 23:00:47.389537 containerd[1563]: time="2025-09-12T23:00:47.389374616Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-v76mm,Uid:f4f6e8a3-1388-4edf-9c6a-167733fd1f77,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88\"" Sep 12 23:00:47.393722 containerd[1563]: time="2025-09-12T23:00:47.393134110Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 23:00:48.217321 kubelet[2763]: I0912 23:00:48.216926 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4fpbk" podStartSLOduration=2.216898358 podStartE2EDuration="2.216898358s" podCreationTimestamp="2025-09-12 23:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:00:48.216580565 +0000 UTC m=+8.189735025" watchObservedRunningTime="2025-09-12 23:00:48.216898358 +0000 UTC m=+8.190052818" Sep 12 23:00:50.984058 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1966749457.mount: Deactivated successfully. 
Sep 12 23:00:51.427153 containerd[1563]: time="2025-09-12T23:00:51.427030992Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:51.428368 containerd[1563]: time="2025-09-12T23:00:51.428186471Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 12 23:00:51.429364 containerd[1563]: time="2025-09-12T23:00:51.429327761Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:51.432020 containerd[1563]: time="2025-09-12T23:00:51.431975536Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:00:51.432828 containerd[1563]: time="2025-09-12T23:00:51.432787592Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 4.039611833s" Sep 12 23:00:51.432929 containerd[1563]: time="2025-09-12T23:00:51.432912285Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 12 23:00:51.435851 containerd[1563]: time="2025-09-12T23:00:51.435804205Z" level=info msg="CreateContainer within sandbox \"bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 23:00:51.447929 containerd[1563]: time="2025-09-12T23:00:51.446446548Z" level=info msg="Container 
45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:00:51.453484 containerd[1563]: time="2025-09-12T23:00:51.453425279Z" level=info msg="CreateContainer within sandbox \"bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772\"" Sep 12 23:00:51.454399 containerd[1563]: time="2025-09-12T23:00:51.454367749Z" level=info msg="StartContainer for \"45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772\"" Sep 12 23:00:51.456134 containerd[1563]: time="2025-09-12T23:00:51.456008172Z" level=info msg="connecting to shim 45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772" address="unix:///run/containerd/s/602b9e110756cc263263b6d42721e3ca3ddabe4e278c3ed97aed497669c883df" protocol=ttrpc version=3 Sep 12 23:00:51.478276 systemd[1]: Started cri-containerd-45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772.scope - libcontainer container 45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772. Sep 12 23:00:51.508616 containerd[1563]: time="2025-09-12T23:00:51.508512274Z" level=info msg="StartContainer for \"45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772\" returns successfully" Sep 12 23:00:57.642598 sudo[1832]: pam_unix(sudo:session): session closed for user root Sep 12 23:00:57.820087 sshd[1831]: Connection closed by 139.178.89.65 port 48286 Sep 12 23:00:57.820481 sshd-session[1828]: pam_unix(sshd:session): session closed for user core Sep 12 23:00:57.824827 systemd-logind[1533]: Session 7 logged out. Waiting for processes to exit. Sep 12 23:00:57.825782 systemd[1]: sshd@6-46.62.198.104:22-139.178.89.65:48286.service: Deactivated successfully. Sep 12 23:00:57.830070 systemd[1]: session-7.scope: Deactivated successfully. 
Sep 12 23:00:57.830452 systemd[1]: session-7.scope: Consumed 5.004s CPU time, 163.6M memory peak. Sep 12 23:00:57.833190 systemd-logind[1533]: Removed session 7. Sep 12 23:01:00.425063 kubelet[2763]: I0912 23:01:00.424221 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-v76mm" podStartSLOduration=10.382181831 podStartE2EDuration="14.424203823s" podCreationTimestamp="2025-09-12 23:00:46 +0000 UTC" firstStartedPulling="2025-09-12 23:00:47.391648348 +0000 UTC m=+7.364802787" lastFinishedPulling="2025-09-12 23:00:51.43367034 +0000 UTC m=+11.406824779" observedRunningTime="2025-09-12 23:00:52.235807197 +0000 UTC m=+12.208961667" watchObservedRunningTime="2025-09-12 23:01:00.424203823 +0000 UTC m=+20.397358252" Sep 12 23:01:00.431333 systemd[1]: Created slice kubepods-besteffort-poda001132d_dbcf_4eb0_8709_6505d1bd86b6.slice - libcontainer container kubepods-besteffort-poda001132d_dbcf_4eb0_8709_6505d1bd86b6.slice. Sep 12 23:01:00.478633 kubelet[2763]: I0912 23:01:00.478584 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/a001132d-dbcf-4eb0-8709-6505d1bd86b6-typha-certs\") pod \"calico-typha-b6b56df74-gs7nw\" (UID: \"a001132d-dbcf-4eb0-8709-6505d1bd86b6\") " pod="calico-system/calico-typha-b6b56df74-gs7nw" Sep 12 23:01:00.478780 kubelet[2763]: I0912 23:01:00.478643 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh8nz\" (UniqueName: \"kubernetes.io/projected/a001132d-dbcf-4eb0-8709-6505d1bd86b6-kube-api-access-jh8nz\") pod \"calico-typha-b6b56df74-gs7nw\" (UID: \"a001132d-dbcf-4eb0-8709-6505d1bd86b6\") " pod="calico-system/calico-typha-b6b56df74-gs7nw" Sep 12 23:01:00.478780 kubelet[2763]: I0912 23:01:00.478663 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/a001132d-dbcf-4eb0-8709-6505d1bd86b6-tigera-ca-bundle\") pod \"calico-typha-b6b56df74-gs7nw\" (UID: \"a001132d-dbcf-4eb0-8709-6505d1bd86b6\") " pod="calico-system/calico-typha-b6b56df74-gs7nw" Sep 12 23:01:00.711344 systemd[1]: Created slice kubepods-besteffort-podf640cd7d_ab5a_49b2_8ed5_41c6590121c4.slice - libcontainer container kubepods-besteffort-podf640cd7d_ab5a_49b2_8ed5_41c6590121c4.slice. Sep 12 23:01:00.743063 containerd[1563]: time="2025-09-12T23:01:00.742862642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b6b56df74-gs7nw,Uid:a001132d-dbcf-4eb0-8709-6505d1bd86b6,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:00.778833 containerd[1563]: time="2025-09-12T23:01:00.778705464Z" level=info msg="connecting to shim c1ffe1429b3888490641b47799070b6c29d06665350f3d9d60aadc8fe98309ae" address="unix:///run/containerd/s/966134c7bc07545836de2f3e908742b67018574ccdd93d05b102ba78102cd782" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:00.781316 kubelet[2763]: I0912 23:01:00.781274 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-node-certs\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781431 kubelet[2763]: I0912 23:01:00.781325 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-xtables-lock\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781431 kubelet[2763]: I0912 23:01:00.781344 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-var-lib-calico\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781431 kubelet[2763]: I0912 23:01:00.781362 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-var-run-calico\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781431 kubelet[2763]: I0912 23:01:00.781381 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-cni-bin-dir\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781431 kubelet[2763]: I0912 23:01:00.781398 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-lib-modules\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781583 kubelet[2763]: I0912 23:01:00.781413 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-cni-log-dir\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781583 kubelet[2763]: I0912 23:01:00.781432 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-cni-net-dir\") pod 
\"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781583 kubelet[2763]: I0912 23:01:00.781450 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-flexvol-driver-host\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781583 kubelet[2763]: I0912 23:01:00.781467 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-policysync\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781583 kubelet[2763]: I0912 23:01:00.781484 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-tigera-ca-bundle\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.781714 kubelet[2763]: I0912 23:01:00.781508 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnxj4\" (UniqueName: \"kubernetes.io/projected/f640cd7d-ab5a-49b2-8ed5-41c6590121c4-kube-api-access-fnxj4\") pod \"calico-node-cj6rj\" (UID: \"f640cd7d-ab5a-49b2-8ed5-41c6590121c4\") " pod="calico-system/calico-node-cj6rj" Sep 12 23:01:00.814262 systemd[1]: Started cri-containerd-c1ffe1429b3888490641b47799070b6c29d06665350f3d9d60aadc8fe98309ae.scope - libcontainer container c1ffe1429b3888490641b47799070b6c29d06665350f3d9d60aadc8fe98309ae. 
Sep 12 23:01:00.876644 containerd[1563]: time="2025-09-12T23:01:00.876472537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-b6b56df74-gs7nw,Uid:a001132d-dbcf-4eb0-8709-6505d1bd86b6,Namespace:calico-system,Attempt:0,} returns sandbox id \"c1ffe1429b3888490641b47799070b6c29d06665350f3d9d60aadc8fe98309ae\"" Sep 12 23:01:00.879382 containerd[1563]: time="2025-09-12T23:01:00.879179438Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 23:01:00.885701 kubelet[2763]: E0912 23:01:00.885677 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:00.885995 kubelet[2763]: W0912 23:01:00.885815 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:00.885995 kubelet[2763]: E0912 23:01:00.885845 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:00.896203 kubelet[2763]: E0912 23:01:00.896176 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:00.897096 kubelet[2763]: W0912 23:01:00.896331 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:00.897096 kubelet[2763]: E0912 23:01:00.896396 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:00.902090 kubelet[2763]: E0912 23:01:00.902008 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:00.902090 kubelet[2763]: W0912 23:01:00.902034 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:00.902324 kubelet[2763]: E0912 23:01:00.902105 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.000952 kubelet[2763]: E0912 23:01:01.000832 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb" Sep 12 23:01:01.016977 containerd[1563]: time="2025-09-12T23:01:01.016926600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cj6rj,Uid:f640cd7d-ab5a-49b2-8ed5-41c6590121c4,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:01.042734 containerd[1563]: time="2025-09-12T23:01:01.042587981Z" level=info msg="connecting to shim 48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac" address="unix:///run/containerd/s/6be7c9720bca30298a62d56055e65f0e9273cb742612e377348c50f201cf379e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:01.066610 kubelet[2763]: E0912 23:01:01.066564 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.066610 kubelet[2763]: W0912 23:01:01.066590 2763 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.066901 kubelet[2763]: E0912 23:01:01.066633 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.066901 kubelet[2763]: E0912 23:01:01.066814 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.066901 kubelet[2763]: W0912 23:01:01.066822 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.066901 kubelet[2763]: E0912 23:01:01.066829 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.067008 kubelet[2763]: E0912 23:01:01.066928 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.067008 kubelet[2763]: W0912 23:01:01.066934 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.067008 kubelet[2763]: E0912 23:01:01.066941 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.067418 kubelet[2763]: E0912 23:01:01.067079 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.067418 kubelet[2763]: W0912 23:01:01.067086 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.067418 kubelet[2763]: E0912 23:01:01.067093 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.067418 kubelet[2763]: E0912 23:01:01.067223 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.067418 kubelet[2763]: W0912 23:01:01.067229 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.067418 kubelet[2763]: E0912 23:01:01.067237 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.067418 kubelet[2763]: E0912 23:01:01.067363 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.067418 kubelet[2763]: W0912 23:01:01.067369 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.067418 kubelet[2763]: E0912 23:01:01.067376 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.067798 kubelet[2763]: E0912 23:01:01.067738 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.067798 kubelet[2763]: W0912 23:01:01.067755 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.067798 kubelet[2763]: E0912 23:01:01.067771 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.067968 kubelet[2763]: E0912 23:01:01.067936 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.067968 kubelet[2763]: W0912 23:01:01.067944 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.067968 kubelet[2763]: E0912 23:01:01.067967 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.068105 kubelet[2763]: E0912 23:01:01.068088 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.068105 kubelet[2763]: W0912 23:01:01.068094 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.068105 kubelet[2763]: E0912 23:01:01.068100 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.068300 kubelet[2763]: E0912 23:01:01.068242 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.068300 kubelet[2763]: W0912 23:01:01.068248 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.068300 kubelet[2763]: E0912 23:01:01.068254 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.068398 kubelet[2763]: E0912 23:01:01.068352 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.068398 kubelet[2763]: W0912 23:01:01.068357 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.068398 kubelet[2763]: E0912 23:01:01.068363 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.068566 kubelet[2763]: E0912 23:01:01.068485 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.068566 kubelet[2763]: W0912 23:01:01.068491 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.068566 kubelet[2763]: E0912 23:01:01.068497 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.068704 kubelet[2763]: E0912 23:01:01.068655 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.068704 kubelet[2763]: W0912 23:01:01.068664 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.068704 kubelet[2763]: E0912 23:01:01.068671 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.068822 kubelet[2763]: E0912 23:01:01.068797 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.068822 kubelet[2763]: W0912 23:01:01.068820 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.068860 kubelet[2763]: E0912 23:01:01.068827 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.069099 kubelet[2763]: E0912 23:01:01.068924 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.069099 kubelet[2763]: W0912 23:01:01.068931 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.069099 kubelet[2763]: E0912 23:01:01.068937 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.069099 kubelet[2763]: E0912 23:01:01.069036 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.069099 kubelet[2763]: W0912 23:01:01.069066 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.069099 kubelet[2763]: E0912 23:01:01.069074 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.069244 kubelet[2763]: E0912 23:01:01.069186 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.069244 kubelet[2763]: W0912 23:01:01.069191 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.069244 kubelet[2763]: E0912 23:01:01.069198 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.069384 kubelet[2763]: E0912 23:01:01.069306 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.069384 kubelet[2763]: W0912 23:01:01.069313 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.069384 kubelet[2763]: E0912 23:01:01.069320 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.069451 kubelet[2763]: E0912 23:01:01.069417 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.069451 kubelet[2763]: W0912 23:01:01.069423 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.069451 kubelet[2763]: E0912 23:01:01.069430 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.069685 kubelet[2763]: E0912 23:01:01.069569 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.069685 kubelet[2763]: W0912 23:01:01.069580 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.069685 kubelet[2763]: E0912 23:01:01.069588 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.070257 systemd[1]: Started cri-containerd-48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac.scope - libcontainer container 48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac. Sep 12 23:01:01.082958 kubelet[2763]: E0912 23:01:01.082928 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.082958 kubelet[2763]: W0912 23:01:01.082945 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.083128 kubelet[2763]: E0912 23:01:01.082969 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.083327 kubelet[2763]: I0912 23:01:01.083306 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hmq8\" (UniqueName: \"kubernetes.io/projected/bf244edb-3fa8-4ebb-a022-bf4d9953e4eb-kube-api-access-4hmq8\") pod \"csi-node-driver-mbsv9\" (UID: \"bf244edb-3fa8-4ebb-a022-bf4d9953e4eb\") " pod="calico-system/csi-node-driver-mbsv9" Sep 12 23:01:01.083674 kubelet[2763]: E0912 23:01:01.083653 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.083674 kubelet[2763]: W0912 23:01:01.083665 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.083767 kubelet[2763]: E0912 23:01:01.083750 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.083790 kubelet[2763]: I0912 23:01:01.083768 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/bf244edb-3fa8-4ebb-a022-bf4d9953e4eb-varrun\") pod \"csi-node-driver-mbsv9\" (UID: \"bf244edb-3fa8-4ebb-a022-bf4d9953e4eb\") " pod="calico-system/csi-node-driver-mbsv9" Sep 12 23:01:01.084118 kubelet[2763]: E0912 23:01:01.084101 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.084118 kubelet[2763]: W0912 23:01:01.084111 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.084234 kubelet[2763]: E0912 23:01:01.084119 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.084234 kubelet[2763]: I0912 23:01:01.084134 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bf244edb-3fa8-4ebb-a022-bf4d9953e4eb-kubelet-dir\") pod \"csi-node-driver-mbsv9\" (UID: \"bf244edb-3fa8-4ebb-a022-bf4d9953e4eb\") " pod="calico-system/csi-node-driver-mbsv9" Sep 12 23:01:01.084542 kubelet[2763]: E0912 23:01:01.084524 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.084542 kubelet[2763]: W0912 23:01:01.084535 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.084600 kubelet[2763]: E0912 23:01:01.084549 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.084835 kubelet[2763]: E0912 23:01:01.084687 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.084835 kubelet[2763]: W0912 23:01:01.084828 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.084889 kubelet[2763]: E0912 23:01:01.084837 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.085170 kubelet[2763]: E0912 23:01:01.085130 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.085170 kubelet[2763]: W0912 23:01:01.085147 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.085251 kubelet[2763]: E0912 23:01:01.085214 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.085529 kubelet[2763]: I0912 23:01:01.085466 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bf244edb-3fa8-4ebb-a022-bf4d9953e4eb-registration-dir\") pod \"csi-node-driver-mbsv9\" (UID: \"bf244edb-3fa8-4ebb-a022-bf4d9953e4eb\") " pod="calico-system/csi-node-driver-mbsv9" Sep 12 23:01:01.085625 kubelet[2763]: E0912 23:01:01.085603 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.085625 kubelet[2763]: W0912 23:01:01.085616 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.085722 kubelet[2763]: E0912 23:01:01.085627 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.086113 kubelet[2763]: E0912 23:01:01.086096 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.086113 kubelet[2763]: W0912 23:01:01.086108 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.086197 kubelet[2763]: E0912 23:01:01.086116 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.086312 kubelet[2763]: E0912 23:01:01.086279 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.086312 kubelet[2763]: W0912 23:01:01.086291 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.086312 kubelet[2763]: E0912 23:01:01.086298 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.087865 kubelet[2763]: E0912 23:01:01.087838 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.087865 kubelet[2763]: W0912 23:01:01.087852 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.087865 kubelet[2763]: E0912 23:01:01.087864 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.088027 kubelet[2763]: I0912 23:01:01.087886 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bf244edb-3fa8-4ebb-a022-bf4d9953e4eb-socket-dir\") pod \"csi-node-driver-mbsv9\" (UID: \"bf244edb-3fa8-4ebb-a022-bf4d9953e4eb\") " pod="calico-system/csi-node-driver-mbsv9" Sep 12 23:01:01.088311 kubelet[2763]: E0912 23:01:01.088286 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.088311 kubelet[2763]: W0912 23:01:01.088300 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.088311 kubelet[2763]: E0912 23:01:01.088310 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.090429 kubelet[2763]: E0912 23:01:01.090406 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.090429 kubelet[2763]: W0912 23:01:01.090417 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.090429 kubelet[2763]: E0912 23:01:01.090427 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.090672 kubelet[2763]: E0912 23:01:01.090655 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.090672 kubelet[2763]: W0912 23:01:01.090666 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.090739 kubelet[2763]: E0912 23:01:01.090673 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.090867 kubelet[2763]: E0912 23:01:01.090847 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.090867 kubelet[2763]: W0912 23:01:01.090858 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.090867 kubelet[2763]: E0912 23:01:01.090866 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.091075 kubelet[2763]: E0912 23:01:01.091028 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.091075 kubelet[2763]: W0912 23:01:01.091055 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.091075 kubelet[2763]: E0912 23:01:01.091064 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.104205 containerd[1563]: time="2025-09-12T23:01:01.104030093Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-cj6rj,Uid:f640cd7d-ab5a-49b2-8ed5-41c6590121c4,Namespace:calico-system,Attempt:0,} returns sandbox id \"48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac\"" Sep 12 23:01:01.190999 kubelet[2763]: E0912 23:01:01.190944 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.190999 kubelet[2763]: W0912 23:01:01.190991 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.191233 kubelet[2763]: E0912 23:01:01.191014 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.191380 kubelet[2763]: E0912 23:01:01.191324 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.191380 kubelet[2763]: W0912 23:01:01.191375 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.191487 kubelet[2763]: E0912 23:01:01.191392 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.191712 kubelet[2763]: E0912 23:01:01.191597 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.191712 kubelet[2763]: W0912 23:01:01.191607 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.191712 kubelet[2763]: E0912 23:01:01.191632 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.191859 kubelet[2763]: E0912 23:01:01.191832 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.191859 kubelet[2763]: W0912 23:01:01.191844 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.191859 kubelet[2763]: E0912 23:01:01.191853 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.192347 kubelet[2763]: E0912 23:01:01.192002 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.192347 kubelet[2763]: W0912 23:01:01.192008 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.192347 kubelet[2763]: E0912 23:01:01.192014 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.192347 kubelet[2763]: E0912 23:01:01.192264 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.192347 kubelet[2763]: W0912 23:01:01.192271 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.192347 kubelet[2763]: E0912 23:01:01.192278 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.192536 kubelet[2763]: E0912 23:01:01.192396 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.192536 kubelet[2763]: W0912 23:01:01.192401 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.192536 kubelet[2763]: E0912 23:01:01.192408 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.192536 kubelet[2763]: E0912 23:01:01.192530 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.192660 kubelet[2763]: W0912 23:01:01.192538 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.192660 kubelet[2763]: E0912 23:01:01.192547 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.192735 kubelet[2763]: E0912 23:01:01.192706 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.192735 kubelet[2763]: W0912 23:01:01.192712 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.192735 kubelet[2763]: E0912 23:01:01.192719 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.193340 kubelet[2763]: E0912 23:01:01.193305 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.193340 kubelet[2763]: W0912 23:01:01.193323 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.193592 kubelet[2763]: E0912 23:01:01.193469 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.193965 kubelet[2763]: E0912 23:01:01.193935 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.193965 kubelet[2763]: W0912 23:01:01.193949 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.194237 kubelet[2763]: E0912 23:01:01.194210 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.194451 kubelet[2763]: E0912 23:01:01.194435 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.194485 kubelet[2763]: W0912 23:01:01.194451 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.194634 kubelet[2763]: E0912 23:01:01.194614 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.194709 kubelet[2763]: E0912 23:01:01.194689 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.194709 kubelet[2763]: W0912 23:01:01.194705 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.194861 kubelet[2763]: E0912 23:01:01.194848 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.194916 kubelet[2763]: E0912 23:01:01.194907 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.194998 kubelet[2763]: W0912 23:01:01.194917 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.194998 kubelet[2763]: E0912 23:01:01.194944 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.195254 kubelet[2763]: E0912 23:01:01.195224 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.195254 kubelet[2763]: W0912 23:01:01.195239 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.195312 kubelet[2763]: E0912 23:01:01.195276 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.195590 kubelet[2763]: E0912 23:01:01.195570 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.195590 kubelet[2763]: W0912 23:01:01.195585 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.195650 kubelet[2763]: E0912 23:01:01.195603 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.195995 kubelet[2763]: E0912 23:01:01.195952 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.196143 kubelet[2763]: W0912 23:01:01.196104 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.196143 kubelet[2763]: E0912 23:01:01.196130 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.196362 kubelet[2763]: E0912 23:01:01.196348 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.196391 kubelet[2763]: W0912 23:01:01.196363 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.196499 kubelet[2763]: E0912 23:01:01.196408 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.196656 kubelet[2763]: E0912 23:01:01.196642 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.196681 kubelet[2763]: W0912 23:01:01.196657 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.196849 kubelet[2763]: E0912 23:01:01.196729 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.196909 kubelet[2763]: E0912 23:01:01.196895 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.196933 kubelet[2763]: W0912 23:01:01.196910 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.197035 kubelet[2763]: E0912 23:01:01.197019 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.197279 kubelet[2763]: E0912 23:01:01.197259 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.197325 kubelet[2763]: W0912 23:01:01.197311 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.197350 kubelet[2763]: E0912 23:01:01.197339 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.197559 kubelet[2763]: E0912 23:01:01.197543 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.197606 kubelet[2763]: W0912 23:01:01.197583 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.197669 kubelet[2763]: E0912 23:01:01.197652 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.197982 kubelet[2763]: E0912 23:01:01.197967 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.198007 kubelet[2763]: W0912 23:01:01.197985 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.198037 kubelet[2763]: E0912 23:01:01.198008 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.198387 kubelet[2763]: E0912 23:01:01.198368 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.198439 kubelet[2763]: W0912 23:01:01.198387 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.198439 kubelet[2763]: E0912 23:01:01.198414 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:01.199237 kubelet[2763]: E0912 23:01:01.199197 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.199237 kubelet[2763]: W0912 23:01:01.199207 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.199237 kubelet[2763]: E0912 23:01:01.199215 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:01.201682 kubelet[2763]: E0912 23:01:01.201660 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:01.201736 kubelet[2763]: W0912 23:01:01.201681 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:01.201736 kubelet[2763]: E0912 23:01:01.201695 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 12 23:01:03.143969 kubelet[2763]: E0912 23:01:03.143853 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb"
Sep 12 23:01:05.143731 kubelet[2763]: E0912 23:01:05.143637 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb"
Sep 12 23:01:07.143651 kubelet[2763]: E0912 23:01:07.143546 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb"
Sep 12 23:01:09.143902 kubelet[2763]: E0912 23:01:09.143264 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb"
Sep 12 23:01:11.144021 kubelet[2763]: E0912 23:01:11.143469 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb"
Sep 12 23:01:12.202510 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount86887289.mount: Deactivated successfully.
Sep 12 23:01:12.800423 containerd[1563]: time="2025-09-12T23:01:12.800366592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:01:12.801677 containerd[1563]: time="2025-09-12T23:01:12.801538656Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 12 23:01:12.802781 containerd[1563]: time="2025-09-12T23:01:12.802756333Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:01:12.805220 containerd[1563]: time="2025-09-12T23:01:12.805170891Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 23:01:12.805830 containerd[1563]: time="2025-09-12T23:01:12.805798786Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 11.92658834s"
Sep 12 23:01:12.805830 containerd[1563]: time="2025-09-12T23:01:12.805826999Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 12 23:01:12.808544 containerd[1563]: time="2025-09-12T23:01:12.807231728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 12 23:01:12.818757 containerd[1563]: time="2025-09-12T23:01:12.818668528Z" level=info msg="CreateContainer within sandbox \"c1ffe1429b3888490641b47799070b6c29d06665350f3d9d60aadc8fe98309ae\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 12 23:01:12.827134 containerd[1563]: time="2025-09-12T23:01:12.824936926Z" level=info msg="Container 7f65812b094fe3d74cafa22c850e6c06edf6c48d001419556431a68f2d84d9f3: CDI devices from CRI Config.CDIDevices: []"
Sep 12 23:01:12.865628 containerd[1563]: time="2025-09-12T23:01:12.865538137Z" level=info msg="CreateContainer within sandbox \"c1ffe1429b3888490641b47799070b6c29d06665350f3d9d60aadc8fe98309ae\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7f65812b094fe3d74cafa22c850e6c06edf6c48d001419556431a68f2d84d9f3\""
Sep 12 23:01:12.866522 containerd[1563]: time="2025-09-12T23:01:12.866481662Z" level=info msg="StartContainer for \"7f65812b094fe3d74cafa22c850e6c06edf6c48d001419556431a68f2d84d9f3\""
Sep 12 23:01:12.867595 containerd[1563]: time="2025-09-12T23:01:12.867568155Z" level=info msg="connecting to shim 7f65812b094fe3d74cafa22c850e6c06edf6c48d001419556431a68f2d84d9f3" address="unix:///run/containerd/s/966134c7bc07545836de2f3e908742b67018574ccdd93d05b102ba78102cd782" protocol=ttrpc version=3
Sep 12 23:01:12.886254 systemd[1]: Started cri-containerd-7f65812b094fe3d74cafa22c850e6c06edf6c48d001419556431a68f2d84d9f3.scope - libcontainer container 7f65812b094fe3d74cafa22c850e6c06edf6c48d001419556431a68f2d84d9f3.
Sep 12 23:01:12.955715 containerd[1563]: time="2025-09-12T23:01:12.955625653Z" level=info msg="StartContainer for \"7f65812b094fe3d74cafa22c850e6c06edf6c48d001419556431a68f2d84d9f3\" returns successfully"
Sep 12 23:01:13.144488 kubelet[2763]: E0912 23:01:13.144126 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb"
Sep 12 23:01:13.286521 kubelet[2763]: I0912 23:01:13.286441 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-b6b56df74-gs7nw" podStartSLOduration=1.357260999 podStartE2EDuration="13.286236413s" podCreationTimestamp="2025-09-12 23:01:00 +0000 UTC" firstStartedPulling="2025-09-12 23:01:00.877665427 +0000 UTC m=+20.850819867" lastFinishedPulling="2025-09-12 23:01:12.806640842 +0000 UTC m=+32.779795281" observedRunningTime="2025-09-12 23:01:13.283059118 +0000 UTC m=+33.256213557" watchObservedRunningTime="2025-09-12 23:01:13.286236413 +0000 UTC m=+33.259390852"
Sep 12 23:01:13.357577 kubelet[2763]: E0912 23:01:13.357541 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 12 23:01:13.357577 kubelet[2763]: W0912 23:01:13.357564 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 12 23:01:13.357722 kubelet[2763]: E0912 23:01:13.357586 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 12 23:01:13.357746 kubelet[2763]: E0912 23:01:13.357737 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.357746 kubelet[2763]: W0912 23:01:13.357743 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.357794 kubelet[2763]: E0912 23:01:13.357750 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.357889 kubelet[2763]: E0912 23:01:13.357869 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.357889 kubelet[2763]: W0912 23:01:13.357881 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358454 kubelet[2763]: E0912 23:01:13.357890 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.358454 kubelet[2763]: E0912 23:01:13.358055 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.358454 kubelet[2763]: W0912 23:01:13.358067 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358454 kubelet[2763]: E0912 23:01:13.358075 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.358454 kubelet[2763]: E0912 23:01:13.358238 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.358454 kubelet[2763]: W0912 23:01:13.358245 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358454 kubelet[2763]: E0912 23:01:13.358253 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.358454 kubelet[2763]: E0912 23:01:13.358395 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.358454 kubelet[2763]: W0912 23:01:13.358403 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358454 kubelet[2763]: E0912 23:01:13.358413 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.358799 kubelet[2763]: E0912 23:01:13.358507 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.358799 kubelet[2763]: W0912 23:01:13.358513 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358799 kubelet[2763]: E0912 23:01:13.358519 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.358799 kubelet[2763]: E0912 23:01:13.358715 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.358799 kubelet[2763]: W0912 23:01:13.358720 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358799 kubelet[2763]: E0912 23:01:13.358726 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.358910 kubelet[2763]: E0912 23:01:13.358807 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.358910 kubelet[2763]: W0912 23:01:13.358813 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358910 kubelet[2763]: E0912 23:01:13.358819 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.358910 kubelet[2763]: E0912 23:01:13.358896 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.358910 kubelet[2763]: W0912 23:01:13.358901 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.358910 kubelet[2763]: E0912 23:01:13.358907 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.359066 kubelet[2763]: E0912 23:01:13.358988 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.359066 kubelet[2763]: W0912 23:01:13.358993 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.359066 kubelet[2763]: E0912 23:01:13.358999 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.359150 kubelet[2763]: E0912 23:01:13.359097 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.359150 kubelet[2763]: W0912 23:01:13.359103 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.359150 kubelet[2763]: E0912 23:01:13.359109 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.359223 kubelet[2763]: E0912 23:01:13.359211 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.359223 kubelet[2763]: W0912 23:01:13.359218 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.359317 kubelet[2763]: E0912 23:01:13.359225 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.359347 kubelet[2763]: E0912 23:01:13.359337 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.359347 kubelet[2763]: W0912 23:01:13.359343 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.359387 kubelet[2763]: E0912 23:01:13.359349 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.359468 kubelet[2763]: E0912 23:01:13.359458 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.359468 kubelet[2763]: W0912 23:01:13.359466 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.359515 kubelet[2763]: E0912 23:01:13.359472 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.383151 kubelet[2763]: E0912 23:01:13.383102 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.383151 kubelet[2763]: W0912 23:01:13.383143 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.383365 kubelet[2763]: E0912 23:01:13.383167 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.383365 kubelet[2763]: E0912 23:01:13.383352 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.383365 kubelet[2763]: W0912 23:01:13.383362 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.383722 kubelet[2763]: E0912 23:01:13.383381 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.383722 kubelet[2763]: E0912 23:01:13.383571 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.383722 kubelet[2763]: W0912 23:01:13.383589 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.383888 kubelet[2763]: E0912 23:01:13.383857 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.384221 kubelet[2763]: E0912 23:01:13.384195 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.384221 kubelet[2763]: W0912 23:01:13.384214 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.384315 kubelet[2763]: E0912 23:01:13.384234 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.384485 kubelet[2763]: E0912 23:01:13.384461 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.384485 kubelet[2763]: W0912 23:01:13.384477 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.384609 kubelet[2763]: E0912 23:01:13.384579 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.384786 kubelet[2763]: E0912 23:01:13.384761 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.384836 kubelet[2763]: W0912 23:01:13.384794 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.384914 kubelet[2763]: E0912 23:01:13.384888 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.385165 kubelet[2763]: E0912 23:01:13.385113 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.385165 kubelet[2763]: W0912 23:01:13.385156 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.385335 kubelet[2763]: E0912 23:01:13.385290 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.385383 kubelet[2763]: E0912 23:01:13.385370 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.385383 kubelet[2763]: W0912 23:01:13.385380 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.385449 kubelet[2763]: E0912 23:01:13.385407 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.385740 kubelet[2763]: E0912 23:01:13.385714 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.385740 kubelet[2763]: W0912 23:01:13.385731 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.385827 kubelet[2763]: E0912 23:01:13.385749 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.386014 kubelet[2763]: E0912 23:01:13.385986 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.386014 kubelet[2763]: W0912 23:01:13.386005 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.386154 kubelet[2763]: E0912 23:01:13.386027 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.386319 kubelet[2763]: E0912 23:01:13.386290 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.386319 kubelet[2763]: W0912 23:01:13.386312 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.386407 kubelet[2763]: E0912 23:01:13.386341 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.386593 kubelet[2763]: E0912 23:01:13.386569 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.386593 kubelet[2763]: W0912 23:01:13.386584 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.386742 kubelet[2763]: E0912 23:01:13.386681 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.387183 kubelet[2763]: E0912 23:01:13.387155 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.387183 kubelet[2763]: W0912 23:01:13.387172 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.387282 kubelet[2763]: E0912 23:01:13.387262 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.387426 kubelet[2763]: E0912 23:01:13.387408 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.387426 kubelet[2763]: W0912 23:01:13.387425 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.387541 kubelet[2763]: E0912 23:01:13.387516 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.387696 kubelet[2763]: E0912 23:01:13.387679 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.387696 kubelet[2763]: W0912 23:01:13.387694 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.387794 kubelet[2763]: E0912 23:01:13.387722 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.387956 kubelet[2763]: E0912 23:01:13.387936 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.387956 kubelet[2763]: W0912 23:01:13.387951 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.388092 kubelet[2763]: E0912 23:01:13.387963 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:13.388313 kubelet[2763]: E0912 23:01:13.388285 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.388313 kubelet[2763]: W0912 23:01:13.388306 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.388420 kubelet[2763]: E0912 23:01:13.388322 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:13.388769 kubelet[2763]: E0912 23:01:13.388743 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:13.388769 kubelet[2763]: W0912 23:01:13.388760 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:13.388859 kubelet[2763]: E0912 23:01:13.388773 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.368026 kubelet[2763]: E0912 23:01:14.367977 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.368026 kubelet[2763]: W0912 23:01:14.368000 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.369401 kubelet[2763]: E0912 23:01:14.368066 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.369401 kubelet[2763]: E0912 23:01:14.368310 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.369401 kubelet[2763]: W0912 23:01:14.368320 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.369401 kubelet[2763]: E0912 23:01:14.368363 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.369401 kubelet[2763]: E0912 23:01:14.368665 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.369401 kubelet[2763]: W0912 23:01:14.368675 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.369401 kubelet[2763]: E0912 23:01:14.368684 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.369401 kubelet[2763]: E0912 23:01:14.368849 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.369401 kubelet[2763]: W0912 23:01:14.368858 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.369401 kubelet[2763]: E0912 23:01:14.368888 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.369778 kubelet[2763]: E0912 23:01:14.369071 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.369778 kubelet[2763]: W0912 23:01:14.369080 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.369778 kubelet[2763]: E0912 23:01:14.369089 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.369778 kubelet[2763]: E0912 23:01:14.369256 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.369778 kubelet[2763]: W0912 23:01:14.369264 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.369778 kubelet[2763]: E0912 23:01:14.369272 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.369778 kubelet[2763]: E0912 23:01:14.369434 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.369778 kubelet[2763]: W0912 23:01:14.369443 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.369778 kubelet[2763]: E0912 23:01:14.369460 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.369778 kubelet[2763]: E0912 23:01:14.369609 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.371516 kubelet[2763]: W0912 23:01:14.369616 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.371516 kubelet[2763]: E0912 23:01:14.369625 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.371516 kubelet[2763]: E0912 23:01:14.369797 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.371516 kubelet[2763]: W0912 23:01:14.369805 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.371516 kubelet[2763]: E0912 23:01:14.369816 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.371516 kubelet[2763]: E0912 23:01:14.369971 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.371516 kubelet[2763]: W0912 23:01:14.369980 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.371516 kubelet[2763]: E0912 23:01:14.369989 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.371516 kubelet[2763]: E0912 23:01:14.371312 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.371516 kubelet[2763]: W0912 23:01:14.371324 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.372036 kubelet[2763]: E0912 23:01:14.371339 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.372036 kubelet[2763]: E0912 23:01:14.371578 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.372036 kubelet[2763]: W0912 23:01:14.371587 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.372036 kubelet[2763]: E0912 23:01:14.371597 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.372036 kubelet[2763]: E0912 23:01:14.371784 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.372036 kubelet[2763]: W0912 23:01:14.371792 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.372036 kubelet[2763]: E0912 23:01:14.371801 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.372036 kubelet[2763]: E0912 23:01:14.371936 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.372036 kubelet[2763]: W0912 23:01:14.371944 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.372036 kubelet[2763]: E0912 23:01:14.371953 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.372493 kubelet[2763]: E0912 23:01:14.372174 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.372493 kubelet[2763]: W0912 23:01:14.372182 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.372493 kubelet[2763]: E0912 23:01:14.372192 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.393913 kubelet[2763]: E0912 23:01:14.393877 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.393913 kubelet[2763]: W0912 23:01:14.393903 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.393913 kubelet[2763]: E0912 23:01:14.393924 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.394513 kubelet[2763]: E0912 23:01:14.394485 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.394513 kubelet[2763]: W0912 23:01:14.394496 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.394513 kubelet[2763]: E0912 23:01:14.394513 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.394720 kubelet[2763]: E0912 23:01:14.394707 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.394869 kubelet[2763]: W0912 23:01:14.394779 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.395010 kubelet[2763]: E0912 23:01:14.394979 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.395205 kubelet[2763]: E0912 23:01:14.395086 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.395205 kubelet[2763]: W0912 23:01:14.395093 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.395205 kubelet[2763]: E0912 23:01:14.395101 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.395565 kubelet[2763]: E0912 23:01:14.395509 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.395565 kubelet[2763]: W0912 23:01:14.395520 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.395565 kubelet[2763]: E0912 23:01:14.395533 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.395809 kubelet[2763]: E0912 23:01:14.395799 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.395924 kubelet[2763]: W0912 23:01:14.395866 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.396174 kubelet[2763]: E0912 23:01:14.395913 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.396298 kubelet[2763]: E0912 23:01:14.396279 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.396298 kubelet[2763]: W0912 23:01:14.396289 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.396407 kubelet[2763]: E0912 23:01:14.396362 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.396735 kubelet[2763]: E0912 23:01:14.396712 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.396735 kubelet[2763]: W0912 23:01:14.396723 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.397003 kubelet[2763]: E0912 23:01:14.396954 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.397206 kubelet[2763]: E0912 23:01:14.397196 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.397311 kubelet[2763]: W0912 23:01:14.397300 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.397626 kubelet[2763]: E0912 23:01:14.397595 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.397758 kubelet[2763]: E0912 23:01:14.397750 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.397864 kubelet[2763]: W0912 23:01:14.397811 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.398452 kubelet[2763]: E0912 23:01:14.398153 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.398748 kubelet[2763]: E0912 23:01:14.398532 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.398748 kubelet[2763]: W0912 23:01:14.398637 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.398748 kubelet[2763]: E0912 23:01:14.398654 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.398962 kubelet[2763]: E0912 23:01:14.398952 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.399030 kubelet[2763]: W0912 23:01:14.399019 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.399503 kubelet[2763]: E0912 23:01:14.399487 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.399724 kubelet[2763]: E0912 23:01:14.399635 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.399790 kubelet[2763]: W0912 23:01:14.399780 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.400073 kubelet[2763]: E0912 23:01:14.399973 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.400868 kubelet[2763]: W0912 23:01:14.400592 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.400868 kubelet[2763]: E0912 23:01:14.400749 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.400979 kubelet[2763]: E0912 23:01:14.400962 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.401010 kubelet[2763]: W0912 23:01:14.400990 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.401010 kubelet[2763]: E0912 23:01:14.401000 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.401077 kubelet[2763]: E0912 23:01:14.400384 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.401222 kubelet[2763]: E0912 23:01:14.401154 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.401222 kubelet[2763]: W0912 23:01:14.401161 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.401222 kubelet[2763]: E0912 23:01:14.401168 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.401326 kubelet[2763]: E0912 23:01:14.401314 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.401326 kubelet[2763]: W0912 23:01:14.401325 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.401370 kubelet[2763]: E0912 23:01:14.401333 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 23:01:14.401901 kubelet[2763]: E0912 23:01:14.401877 2763 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 23:01:14.401947 kubelet[2763]: W0912 23:01:14.401902 2763 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 23:01:14.401947 kubelet[2763]: E0912 23:01:14.401916 2763 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 23:01:14.516914 containerd[1563]: time="2025-09-12T23:01:14.516833046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:14.518201 containerd[1563]: time="2025-09-12T23:01:14.517979862Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 12 23:01:14.519065 containerd[1563]: time="2025-09-12T23:01:14.519024267Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:14.521380 containerd[1563]: time="2025-09-12T23:01:14.521347264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:14.533515 containerd[1563]: time="2025-09-12T23:01:14.533468506Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.726206803s" Sep 12 23:01:14.533802 containerd[1563]: time="2025-09-12T23:01:14.533687515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 12 23:01:14.538679 containerd[1563]: time="2025-09-12T23:01:14.538553552Z" level=info msg="CreateContainer within sandbox \"48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 23:01:14.554271 containerd[1563]: time="2025-09-12T23:01:14.554214818Z" level=info msg="Container 84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:14.565275 containerd[1563]: time="2025-09-12T23:01:14.565217057Z" level=info msg="CreateContainer within sandbox \"48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd\"" Sep 12 23:01:14.566529 containerd[1563]: time="2025-09-12T23:01:14.566177874Z" level=info msg="StartContainer for \"84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd\"" Sep 12 23:01:14.569661 containerd[1563]: time="2025-09-12T23:01:14.569584359Z" level=info msg="connecting to shim 84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd" address="unix:///run/containerd/s/6be7c9720bca30298a62d56055e65f0e9273cb742612e377348c50f201cf379e" protocol=ttrpc version=3 Sep 12 23:01:14.603276 systemd[1]: Started cri-containerd-84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd.scope - libcontainer container 84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd. Sep 12 23:01:14.647889 containerd[1563]: time="2025-09-12T23:01:14.647759689Z" level=info msg="StartContainer for \"84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd\" returns successfully" Sep 12 23:01:14.662067 systemd[1]: cri-containerd-84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd.scope: Deactivated successfully. 
Sep 12 23:01:14.676029 containerd[1563]: time="2025-09-12T23:01:14.675893776Z" level=info msg="received exit event container_id:\"84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd\" id:\"84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd\" pid:3491 exited_at:{seconds:1757718074 nanos:664514783}" Sep 12 23:01:14.721388 containerd[1563]: time="2025-09-12T23:01:14.721229132Z" level=info msg="TaskExit event in podsandbox handler container_id:\"84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd\" id:\"84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd\" pid:3491 exited_at:{seconds:1757718074 nanos:664514783}" Sep 12 23:01:14.752482 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-84f7379499f11256eb8f2140a8d34a49d6d5a0324ec3d862eb221b3c598f06dd-rootfs.mount: Deactivated successfully. Sep 12 23:01:15.143540 kubelet[2763]: E0912 23:01:15.143453 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb" Sep 12 23:01:15.278387 containerd[1563]: time="2025-09-12T23:01:15.278289446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 23:01:17.146903 kubelet[2763]: E0912 23:01:17.146763 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb" Sep 12 23:01:19.143987 kubelet[2763]: E0912 23:01:19.143921 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb" Sep 12 23:01:19.170428 containerd[1563]: time="2025-09-12T23:01:19.170348183Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:19.172175 containerd[1563]: time="2025-09-12T23:01:19.171884458Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 12 23:01:19.173170 containerd[1563]: time="2025-09-12T23:01:19.173132544Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:19.176169 containerd[1563]: time="2025-09-12T23:01:19.176131878Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:19.177132 containerd[1563]: time="2025-09-12T23:01:19.177067840Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 3.898720015s" Sep 12 23:01:19.178098 containerd[1563]: time="2025-09-12T23:01:19.177237717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 12 23:01:19.181665 containerd[1563]: time="2025-09-12T23:01:19.181443730Z" level=info msg="CreateContainer within sandbox \"48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac\" for container 
&ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 23:01:19.194962 containerd[1563]: time="2025-09-12T23:01:19.194924180Z" level=info msg="Container c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:19.211976 containerd[1563]: time="2025-09-12T23:01:19.211909751Z" level=info msg="CreateContainer within sandbox \"48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538\"" Sep 12 23:01:19.214204 containerd[1563]: time="2025-09-12T23:01:19.212775681Z" level=info msg="StartContainer for \"c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538\"" Sep 12 23:01:19.215151 containerd[1563]: time="2025-09-12T23:01:19.215121020Z" level=info msg="connecting to shim c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538" address="unix:///run/containerd/s/6be7c9720bca30298a62d56055e65f0e9273cb742612e377348c50f201cf379e" protocol=ttrpc version=3 Sep 12 23:01:19.242271 systemd[1]: Started cri-containerd-c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538.scope - libcontainer container c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538. Sep 12 23:01:19.313030 containerd[1563]: time="2025-09-12T23:01:19.312991537Z" level=info msg="StartContainer for \"c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538\" returns successfully" Sep 12 23:01:19.818642 systemd[1]: cri-containerd-c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538.scope: Deactivated successfully. Sep 12 23:01:19.818967 systemd[1]: cri-containerd-c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538.scope: Consumed 472ms CPU time, 170.8M memory peak, 15.7M read from disk, 171.3M written to disk. 
Sep 12 23:01:19.820853 containerd[1563]: time="2025-09-12T23:01:19.820832440Z" level=info msg="received exit event container_id:\"c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538\" id:\"c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538\" pid:3549 exited_at:{seconds:1757718079 nanos:820287420}" Sep 12 23:01:19.822932 containerd[1563]: time="2025-09-12T23:01:19.822881054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538\" id:\"c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538\" pid:3549 exited_at:{seconds:1757718079 nanos:820287420}" Sep 12 23:01:19.855220 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c8bb114ccf33d9540a49b8f3c0b783e90b4558d58f9d05737e7359a02de12538-rootfs.mount: Deactivated successfully. Sep 12 23:01:19.879507 kubelet[2763]: I0912 23:01:19.879414 2763 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 23:01:19.920336 systemd[1]: Created slice kubepods-besteffort-podfe96040e_cb83_4d2d_acce_421f62e4c4db.slice - libcontainer container kubepods-besteffort-podfe96040e_cb83_4d2d_acce_421f62e4c4db.slice. Sep 12 23:01:19.933370 systemd[1]: Created slice kubepods-burstable-podf749bc25_4ea0_4be7_801c_efad8b6d5195.slice - libcontainer container kubepods-burstable-podf749bc25_4ea0_4be7_801c_efad8b6d5195.slice. 
Sep 12 23:01:19.937784 kubelet[2763]: I0912 23:01:19.937652 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fe96040e-cb83-4d2d-acce-421f62e4c4db-calico-apiserver-certs\") pod \"calico-apiserver-7fcf7b468-ml5pp\" (UID: \"fe96040e-cb83-4d2d-acce-421f62e4c4db\") " pod="calico-apiserver/calico-apiserver-7fcf7b468-ml5pp" Sep 12 23:01:19.937784 kubelet[2763]: I0912 23:01:19.937698 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8tkd\" (UniqueName: \"kubernetes.io/projected/fe96040e-cb83-4d2d-acce-421f62e4c4db-kube-api-access-d8tkd\") pod \"calico-apiserver-7fcf7b468-ml5pp\" (UID: \"fe96040e-cb83-4d2d-acce-421f62e4c4db\") " pod="calico-apiserver/calico-apiserver-7fcf7b468-ml5pp" Sep 12 23:01:19.937784 kubelet[2763]: I0912 23:01:19.937713 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f749bc25-4ea0-4be7-801c-efad8b6d5195-config-volume\") pod \"coredns-7c65d6cfc9-745ct\" (UID: \"f749bc25-4ea0-4be7-801c-efad8b6d5195\") " pod="kube-system/coredns-7c65d6cfc9-745ct" Sep 12 23:01:19.937784 kubelet[2763]: I0912 23:01:19.937730 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6nb\" (UniqueName: \"kubernetes.io/projected/f749bc25-4ea0-4be7-801c-efad8b6d5195-kube-api-access-4d6nb\") pod \"coredns-7c65d6cfc9-745ct\" (UID: \"f749bc25-4ea0-4be7-801c-efad8b6d5195\") " pod="kube-system/coredns-7c65d6cfc9-745ct" Sep 12 23:01:19.943210 systemd[1]: Created slice kubepods-besteffort-podd0b02990_63aa_436e_8d45_83a2fb9fb79e.slice - libcontainer container kubepods-besteffort-podd0b02990_63aa_436e_8d45_83a2fb9fb79e.slice. 
Sep 12 23:01:19.953286 systemd[1]: Created slice kubepods-burstable-poddc787179_cf7b_486b_bfa0_8a2ffd9994c2.slice - libcontainer container kubepods-burstable-poddc787179_cf7b_486b_bfa0_8a2ffd9994c2.slice. Sep 12 23:01:19.962330 systemd[1]: Created slice kubepods-besteffort-pod8820a8d0_00eb_40d2_b1fe_1ffca3a1eeda.slice - libcontainer container kubepods-besteffort-pod8820a8d0_00eb_40d2_b1fe_1ffca3a1eeda.slice. Sep 12 23:01:19.971949 systemd[1]: Created slice kubepods-besteffort-pod1a615f0e_57a8_4819_ae03_9bae73eb3e3d.slice - libcontainer container kubepods-besteffort-pod1a615f0e_57a8_4819_ae03_9bae73eb3e3d.slice. Sep 12 23:01:19.980472 systemd[1]: Created slice kubepods-besteffort-pod78a7b686_9379_4d2e_b60a_84ef6a14d85c.slice - libcontainer container kubepods-besteffort-pod78a7b686_9379_4d2e_b60a_84ef6a14d85c.slice. Sep 12 23:01:20.038526 kubelet[2763]: I0912 23:01:20.038432 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dc787179-cf7b-486b-bfa0-8a2ffd9994c2-config-volume\") pod \"coredns-7c65d6cfc9-vds8s\" (UID: \"dc787179-cf7b-486b-bfa0-8a2ffd9994c2\") " pod="kube-system/coredns-7c65d6cfc9-vds8s" Sep 12 23:01:20.038526 kubelet[2763]: I0912 23:01:20.038484 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a615f0e-57a8-4819-ae03-9bae73eb3e3d-config\") pod \"goldmane-7988f88666-zwkck\" (UID: \"1a615f0e-57a8-4819-ae03-9bae73eb3e3d\") " pod="calico-system/goldmane-7988f88666-zwkck" Sep 12 23:01:20.038526 kubelet[2763]: I0912 23:01:20.038513 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/1a615f0e-57a8-4819-ae03-9bae73eb3e3d-goldmane-key-pair\") pod \"goldmane-7988f88666-zwkck\" (UID: \"1a615f0e-57a8-4819-ae03-9bae73eb3e3d\") " 
pod="calico-system/goldmane-7988f88666-zwkck" Sep 12 23:01:20.038526 kubelet[2763]: I0912 23:01:20.038539 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sdj2\" (UniqueName: \"kubernetes.io/projected/dc787179-cf7b-486b-bfa0-8a2ffd9994c2-kube-api-access-8sdj2\") pod \"coredns-7c65d6cfc9-vds8s\" (UID: \"dc787179-cf7b-486b-bfa0-8a2ffd9994c2\") " pod="kube-system/coredns-7c65d6cfc9-vds8s" Sep 12 23:01:20.038883 kubelet[2763]: I0912 23:01:20.038560 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86qjx\" (UniqueName: \"kubernetes.io/projected/d0b02990-63aa-436e-8d45-83a2fb9fb79e-kube-api-access-86qjx\") pod \"calico-apiserver-7fcf7b468-4m2kd\" (UID: \"d0b02990-63aa-436e-8d45-83a2fb9fb79e\") " pod="calico-apiserver/calico-apiserver-7fcf7b468-4m2kd" Sep 12 23:01:20.038883 kubelet[2763]: I0912 23:01:20.038661 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5dl7\" (UniqueName: \"kubernetes.io/projected/78a7b686-9379-4d2e-b60a-84ef6a14d85c-kube-api-access-q5dl7\") pod \"whisker-76d7b4dbbc-jknr4\" (UID: \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\") " pod="calico-system/whisker-76d7b4dbbc-jknr4" Sep 12 23:01:20.038883 kubelet[2763]: I0912 23:01:20.038692 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a615f0e-57a8-4819-ae03-9bae73eb3e3d-goldmane-ca-bundle\") pod \"goldmane-7988f88666-zwkck\" (UID: \"1a615f0e-57a8-4819-ae03-9bae73eb3e3d\") " pod="calico-system/goldmane-7988f88666-zwkck" Sep 12 23:01:20.038883 kubelet[2763]: I0912 23:01:20.038728 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: 
\"kubernetes.io/secret/d0b02990-63aa-436e-8d45-83a2fb9fb79e-calico-apiserver-certs\") pod \"calico-apiserver-7fcf7b468-4m2kd\" (UID: \"d0b02990-63aa-436e-8d45-83a2fb9fb79e\") " pod="calico-apiserver/calico-apiserver-7fcf7b468-4m2kd" Sep 12 23:01:20.038883 kubelet[2763]: I0912 23:01:20.038746 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda-tigera-ca-bundle\") pod \"calico-kube-controllers-579768695b-96v9q\" (UID: \"8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda\") " pod="calico-system/calico-kube-controllers-579768695b-96v9q" Sep 12 23:01:20.039037 kubelet[2763]: I0912 23:01:20.038765 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76qkl\" (UniqueName: \"kubernetes.io/projected/8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda-kube-api-access-76qkl\") pod \"calico-kube-controllers-579768695b-96v9q\" (UID: \"8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda\") " pod="calico-system/calico-kube-controllers-579768695b-96v9q" Sep 12 23:01:20.039037 kubelet[2763]: I0912 23:01:20.038783 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rm9\" (UniqueName: \"kubernetes.io/projected/1a615f0e-57a8-4819-ae03-9bae73eb3e3d-kube-api-access-h4rm9\") pod \"goldmane-7988f88666-zwkck\" (UID: \"1a615f0e-57a8-4819-ae03-9bae73eb3e3d\") " pod="calico-system/goldmane-7988f88666-zwkck" Sep 12 23:01:20.039037 kubelet[2763]: I0912 23:01:20.038799 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-backend-key-pair\") pod \"whisker-76d7b4dbbc-jknr4\" (UID: \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\") " pod="calico-system/whisker-76d7b4dbbc-jknr4" Sep 12 23:01:20.039037 kubelet[2763]: I0912 
23:01:20.038841 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-ca-bundle\") pod \"whisker-76d7b4dbbc-jknr4\" (UID: \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\") " pod="calico-system/whisker-76d7b4dbbc-jknr4" Sep 12 23:01:20.236441 containerd[1563]: time="2025-09-12T23:01:20.236373667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-ml5pp,Uid:fe96040e-cb83-4d2d-acce-421f62e4c4db,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:20.245938 containerd[1563]: time="2025-09-12T23:01:20.245513894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-745ct,Uid:f749bc25-4ea0-4be7-801c-efad8b6d5195,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:20.252484 containerd[1563]: time="2025-09-12T23:01:20.252449464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-4m2kd,Uid:d0b02990-63aa-436e-8d45-83a2fb9fb79e,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:20.266310 containerd[1563]: time="2025-09-12T23:01:20.266269141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vds8s,Uid:dc787179-cf7b-486b-bfa0-8a2ffd9994c2,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:20.297878 containerd[1563]: time="2025-09-12T23:01:20.297807590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76d7b4dbbc-jknr4,Uid:78a7b686-9379-4d2e-b60a-84ef6a14d85c,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:20.305507 containerd[1563]: time="2025-09-12T23:01:20.305270779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579768695b-96v9q,Uid:8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:20.349757 containerd[1563]: time="2025-09-12T23:01:20.305317696Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-7988f88666-zwkck,Uid:1a615f0e-57a8-4819-ae03-9bae73eb3e3d,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:20.437620 containerd[1563]: time="2025-09-12T23:01:20.437590036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 23:01:20.523069 containerd[1563]: time="2025-09-12T23:01:20.522694965Z" level=error msg="Failed to destroy network for sandbox \"fef7e5314e1e46f3cabbf63058854f1a4fc96c05d64a1a9951c2b3a534cb15e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.530368 containerd[1563]: time="2025-09-12T23:01:20.530147013Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-745ct,Uid:f749bc25-4ea0-4be7-801c-efad8b6d5195,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fef7e5314e1e46f3cabbf63058854f1a4fc96c05d64a1a9951c2b3a534cb15e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.554435 kubelet[2763]: E0912 23:01:20.554317 2763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fef7e5314e1e46f3cabbf63058854f1a4fc96c05d64a1a9951c2b3a534cb15e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.554435 kubelet[2763]: E0912 23:01:20.554419 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fef7e5314e1e46f3cabbf63058854f1a4fc96c05d64a1a9951c2b3a534cb15e5\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-745ct" Sep 12 23:01:20.554435 kubelet[2763]: E0912 23:01:20.554442 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fef7e5314e1e46f3cabbf63058854f1a4fc96c05d64a1a9951c2b3a534cb15e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-745ct" Sep 12 23:01:20.556812 kubelet[2763]: E0912 23:01:20.554493 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-745ct_kube-system(f749bc25-4ea0-4be7-801c-efad8b6d5195)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-745ct_kube-system(f749bc25-4ea0-4be7-801c-efad8b6d5195)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fef7e5314e1e46f3cabbf63058854f1a4fc96c05d64a1a9951c2b3a534cb15e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-745ct" podUID="f749bc25-4ea0-4be7-801c-efad8b6d5195" Sep 12 23:01:20.592745 containerd[1563]: time="2025-09-12T23:01:20.592702777Z" level=error msg="Failed to destroy network for sandbox \"f6d8ae7538543918788c19ef61fd052e861ba3fd4a43c8e7c3fa4d8d1db270e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.596761 containerd[1563]: time="2025-09-12T23:01:20.596665733Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-ml5pp,Uid:fe96040e-cb83-4d2d-acce-421f62e4c4db,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6d8ae7538543918788c19ef61fd052e861ba3fd4a43c8e7c3fa4d8d1db270e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.597240 kubelet[2763]: E0912 23:01:20.597203 2763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6d8ae7538543918788c19ef61fd052e861ba3fd4a43c8e7c3fa4d8d1db270e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.597881 kubelet[2763]: E0912 23:01:20.597764 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6d8ae7538543918788c19ef61fd052e861ba3fd4a43c8e7c3fa4d8d1db270e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fcf7b468-ml5pp" Sep 12 23:01:20.597881 kubelet[2763]: E0912 23:01:20.597791 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6d8ae7538543918788c19ef61fd052e861ba3fd4a43c8e7c3fa4d8d1db270e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fcf7b468-ml5pp" Sep 12 23:01:20.597881 kubelet[2763]: E0912 23:01:20.597845 2763 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fcf7b468-ml5pp_calico-apiserver(fe96040e-cb83-4d2d-acce-421f62e4c4db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7fcf7b468-ml5pp_calico-apiserver(fe96040e-cb83-4d2d-acce-421f62e4c4db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f6d8ae7538543918788c19ef61fd052e861ba3fd4a43c8e7c3fa4d8d1db270e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7fcf7b468-ml5pp" podUID="fe96040e-cb83-4d2d-acce-421f62e4c4db" Sep 12 23:01:20.614976 containerd[1563]: time="2025-09-12T23:01:20.614915641Z" level=error msg="Failed to destroy network for sandbox \"ebed12eed62819d92d4c801871cdb6573fb1ace3537fe7f368f225d91edb3957\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.616987 containerd[1563]: time="2025-09-12T23:01:20.616952965Z" level=error msg="Failed to destroy network for sandbox \"109cb8840d7b3df5f51084292735b74eeba483aad1ad117f21d79100f4fd6fe1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.617497 containerd[1563]: time="2025-09-12T23:01:20.617171804Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-zwkck,Uid:1a615f0e-57a8-4819-ae03-9bae73eb3e3d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebed12eed62819d92d4c801871cdb6573fb1ace3537fe7f368f225d91edb3957\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.617904 kubelet[2763]: E0912 23:01:20.617866 2763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebed12eed62819d92d4c801871cdb6573fb1ace3537fe7f368f225d91edb3957\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.617968 kubelet[2763]: E0912 23:01:20.617930 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebed12eed62819d92d4c801871cdb6573fb1ace3537fe7f368f225d91edb3957\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-zwkck" Sep 12 23:01:20.617968 kubelet[2763]: E0912 23:01:20.617951 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebed12eed62819d92d4c801871cdb6573fb1ace3537fe7f368f225d91edb3957\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-zwkck" Sep 12 23:01:20.618080 kubelet[2763]: E0912 23:01:20.618022 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-zwkck_calico-system(1a615f0e-57a8-4819-ae03-9bae73eb3e3d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-zwkck_calico-system(1a615f0e-57a8-4819-ae03-9bae73eb3e3d)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"ebed12eed62819d92d4c801871cdb6573fb1ace3537fe7f368f225d91edb3957\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-zwkck" podUID="1a615f0e-57a8-4819-ae03-9bae73eb3e3d" Sep 12 23:01:20.618311 containerd[1563]: time="2025-09-12T23:01:20.618265391Z" level=error msg="Failed to destroy network for sandbox \"cbb7328d8d8ed74c4ed0f382101706bc60163d5d0467d2fc78ada180b6f9c096\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.620540 containerd[1563]: time="2025-09-12T23:01:20.620313595Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579768695b-96v9q,Uid:8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"109cb8840d7b3df5f51084292735b74eeba483aad1ad117f21d79100f4fd6fe1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.620817 kubelet[2763]: E0912 23:01:20.620791 2763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"109cb8840d7b3df5f51084292735b74eeba483aad1ad117f21d79100f4fd6fe1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.621125 kubelet[2763]: E0912 23:01:20.620995 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"109cb8840d7b3df5f51084292735b74eeba483aad1ad117f21d79100f4fd6fe1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-579768695b-96v9q" Sep 12 23:01:20.621383 kubelet[2763]: E0912 23:01:20.621300 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"109cb8840d7b3df5f51084292735b74eeba483aad1ad117f21d79100f4fd6fe1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-579768695b-96v9q" Sep 12 23:01:20.621763 kubelet[2763]: E0912 23:01:20.621598 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-579768695b-96v9q_calico-system(8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-579768695b-96v9q_calico-system(8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"109cb8840d7b3df5f51084292735b74eeba483aad1ad117f21d79100f4fd6fe1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-579768695b-96v9q" podUID="8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda" Sep 12 23:01:20.631092 containerd[1563]: time="2025-09-12T23:01:20.630153261Z" level=error msg="Failed to destroy network for sandbox \"236df6159de8d4c157dd98cd4058f2bf456e763a5796332cf02a7885531b1720\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.632406 containerd[1563]: time="2025-09-12T23:01:20.632369529Z" level=error msg="Failed to destroy network for sandbox \"eb6cc53eda865c823363e2eb752a49fed69b83ab71668e1c16cbb738bfcb9959\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.632785 containerd[1563]: time="2025-09-12T23:01:20.632740603Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-4m2kd,Uid:d0b02990-63aa-436e-8d45-83a2fb9fb79e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"236df6159de8d4c157dd98cd4058f2bf456e763a5796332cf02a7885531b1720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.633724 kubelet[2763]: E0912 23:01:20.633599 2763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"236df6159de8d4c157dd98cd4058f2bf456e763a5796332cf02a7885531b1720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.633724 kubelet[2763]: E0912 23:01:20.633698 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"236df6159de8d4c157dd98cd4058f2bf456e763a5796332cf02a7885531b1720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fcf7b468-4m2kd" 
Sep 12 23:01:20.633875 kubelet[2763]: E0912 23:01:20.633849 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"236df6159de8d4c157dd98cd4058f2bf456e763a5796332cf02a7885531b1720\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fcf7b468-4m2kd" Sep 12 23:01:20.634442 kubelet[2763]: E0912 23:01:20.634273 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fcf7b468-4m2kd_calico-apiserver(d0b02990-63aa-436e-8d45-83a2fb9fb79e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7fcf7b468-4m2kd_calico-apiserver(d0b02990-63aa-436e-8d45-83a2fb9fb79e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"236df6159de8d4c157dd98cd4058f2bf456e763a5796332cf02a7885531b1720\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7fcf7b468-4m2kd" podUID="d0b02990-63aa-436e-8d45-83a2fb9fb79e" Sep 12 23:01:20.636572 containerd[1563]: time="2025-09-12T23:01:20.636533773Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vds8s,Uid:dc787179-cf7b-486b-bfa0-8a2ffd9994c2,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb6cc53eda865c823363e2eb752a49fed69b83ab71668e1c16cbb738bfcb9959\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.636912 kubelet[2763]: E0912 23:01:20.636841 2763 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb6cc53eda865c823363e2eb752a49fed69b83ab71668e1c16cbb738bfcb9959\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.636912 kubelet[2763]: E0912 23:01:20.636873 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb6cc53eda865c823363e2eb752a49fed69b83ab71668e1c16cbb738bfcb9959\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-vds8s" Sep 12 23:01:20.636912 kubelet[2763]: E0912 23:01:20.636887 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eb6cc53eda865c823363e2eb752a49fed69b83ab71668e1c16cbb738bfcb9959\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-vds8s" Sep 12 23:01:20.637191 kubelet[2763]: E0912 23:01:20.637030 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-vds8s_kube-system(dc787179-cf7b-486b-bfa0-8a2ffd9994c2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-vds8s_kube-system(dc787179-cf7b-486b-bfa0-8a2ffd9994c2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eb6cc53eda865c823363e2eb752a49fed69b83ab71668e1c16cbb738bfcb9959\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-vds8s" podUID="dc787179-cf7b-486b-bfa0-8a2ffd9994c2" Sep 12 23:01:20.643740 containerd[1563]: time="2025-09-12T23:01:20.643692913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-76d7b4dbbc-jknr4,Uid:78a7b686-9379-4d2e-b60a-84ef6a14d85c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbb7328d8d8ed74c4ed0f382101706bc60163d5d0467d2fc78ada180b6f9c096\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.644095 kubelet[2763]: E0912 23:01:20.643898 2763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbb7328d8d8ed74c4ed0f382101706bc60163d5d0467d2fc78ada180b6f9c096\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:20.644095 kubelet[2763]: E0912 23:01:20.643933 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbb7328d8d8ed74c4ed0f382101706bc60163d5d0467d2fc78ada180b6f9c096\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76d7b4dbbc-jknr4" Sep 12 23:01:20.644095 kubelet[2763]: E0912 23:01:20.643948 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cbb7328d8d8ed74c4ed0f382101706bc60163d5d0467d2fc78ada180b6f9c096\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-76d7b4dbbc-jknr4" Sep 12 23:01:20.644219 kubelet[2763]: E0912 23:01:20.643981 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-76d7b4dbbc-jknr4_calico-system(78a7b686-9379-4d2e-b60a-84ef6a14d85c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-76d7b4dbbc-jknr4_calico-system(78a7b686-9379-4d2e-b60a-84ef6a14d85c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cbb7328d8d8ed74c4ed0f382101706bc60163d5d0467d2fc78ada180b6f9c096\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-76d7b4dbbc-jknr4" podUID="78a7b686-9379-4d2e-b60a-84ef6a14d85c" Sep 12 23:01:21.155704 systemd[1]: Created slice kubepods-besteffort-podbf244edb_3fa8_4ebb_a022_bf4d9953e4eb.slice - libcontainer container kubepods-besteffort-podbf244edb_3fa8_4ebb_a022_bf4d9953e4eb.slice. Sep 12 23:01:21.162718 containerd[1563]: time="2025-09-12T23:01:21.162667159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mbsv9,Uid:bf244edb-3fa8-4ebb-a022-bf4d9953e4eb,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:21.203878 systemd[1]: run-netns-cni\x2d839df6ee\x2d6db0\x2def6a\x2d5f63\x2dbf405607a0bd.mount: Deactivated successfully. Sep 12 23:01:21.204153 systemd[1]: run-netns-cni\x2d02230b3a\x2d3373\x2dd733\x2dd576\x2da784b5a7329a.mount: Deactivated successfully. Sep 12 23:01:21.204294 systemd[1]: run-netns-cni\x2dbabce43a\x2d95dc\x2ddab2\x2d6ec3\x2da59ba46a5f99.mount: Deactivated successfully. Sep 12 23:01:21.204412 systemd[1]: run-netns-cni\x2d28ac0dd8\x2d23ab\x2d1689\x2d9210\x2de21317dc2b69.mount: Deactivated successfully. 
Sep 12 23:01:21.204530 systemd[1]: run-netns-cni\x2d279bd7c4\x2d95cb\x2d0010\x2d5f3f\x2dac2fc781b4f9.mount: Deactivated successfully. Sep 12 23:01:21.256997 containerd[1563]: time="2025-09-12T23:01:21.256873807Z" level=error msg="Failed to destroy network for sandbox \"fe24678c8911b2e8c70ead7c14a4a131220395a86a9f27b0953bfb8728b8b71c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:21.259575 containerd[1563]: time="2025-09-12T23:01:21.259019683Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mbsv9,Uid:bf244edb-3fa8-4ebb-a022-bf4d9953e4eb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe24678c8911b2e8c70ead7c14a4a131220395a86a9f27b0953bfb8728b8b71c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:21.259680 kubelet[2763]: E0912 23:01:21.259218 2763 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe24678c8911b2e8c70ead7c14a4a131220395a86a9f27b0953bfb8728b8b71c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 23:01:21.259680 kubelet[2763]: E0912 23:01:21.259268 2763 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe24678c8911b2e8c70ead7c14a4a131220395a86a9f27b0953bfb8728b8b71c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-mbsv9" Sep 12 23:01:21.259680 kubelet[2763]: E0912 23:01:21.259287 2763 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe24678c8911b2e8c70ead7c14a4a131220395a86a9f27b0953bfb8728b8b71c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mbsv9" Sep 12 23:01:21.260241 kubelet[2763]: E0912 23:01:21.259349 2763 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mbsv9_calico-system(bf244edb-3fa8-4ebb-a022-bf4d9953e4eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mbsv9_calico-system(bf244edb-3fa8-4ebb-a022-bf4d9953e4eb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fe24678c8911b2e8c70ead7c14a4a131220395a86a9f27b0953bfb8728b8b71c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mbsv9" podUID="bf244edb-3fa8-4ebb-a022-bf4d9953e4eb" Sep 12 23:01:21.263806 systemd[1]: run-netns-cni\x2df2480ef4\x2d3cda\x2db90e\x2d5fb7\x2d573b1800ff2a.mount: Deactivated successfully. Sep 12 23:01:27.737244 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3587777069.mount: Deactivated successfully. 
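Every sandbox failure in the run above traces to one condition: the Calico CNI plugin stats `/var/lib/calico/nodename`, a file that calico/node writes only once it is running with `/var/lib/calico` mounted (the pull and start of the calico-node container that follows is what clears it). A minimal sketch of that pre-flight check, run against a scratch directory rather than a live node — the directory and the node name written below are stand-ins, not the plugin's actual code path:

```python
import os
import tempfile

# Stand-in for /var/lib/calico; a scratch dir so the sketch is self-contained.
calico_dir = tempfile.mkdtemp()

def check_nodename(d):
    """Mimic the stat the CNI plugin performs before any ADD/DEL."""
    p = os.path.join(d, "nodename")
    if not os.path.isfile(p):
        # The exact condition behind the repeated RunPodSandbox errors.
        return f"stat {p}: no such file or directory"
    with open(p) as f:
        return "nodename present: " + f.read().strip()

# Before calico/node has started: the failure mode seen in the log.
print(check_nodename(calico_dir))

# calico/node writes the file once up; subsequent CNI calls can proceed.
with open(os.path.join(calico_dir, "nodename"), "w") as f:
    f.write("ci-4459-0-0-5-bd33a49fb7\n")
print(check_nodename(calico_dir))
```

Until the file exists, every pod without host networking (goldmane, calico-kube-controllers, coredns, whisker, csi-node-driver above) fails sandbox creation the same way, which is why kubelet keeps retrying and cleaning up netns mounts.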
Sep 12 23:01:28.010123 containerd[1563]: time="2025-09-12T23:01:28.009889113Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 12 23:01:28.067837 containerd[1563]: time="2025-09-12T23:01:28.067721595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:28.116200 containerd[1563]: time="2025-09-12T23:01:28.116113635Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:28.117724 containerd[1563]: time="2025-09-12T23:01:28.117005505Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.671946128s" Sep 12 23:01:28.117724 containerd[1563]: time="2025-09-12T23:01:28.117064915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 12 23:01:28.117724 containerd[1563]: time="2025-09-12T23:01:28.117381518Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:28.168667 containerd[1563]: time="2025-09-12T23:01:28.168630996Z" level=info msg="CreateContainer within sandbox \"48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 23:01:28.279341 containerd[1563]: time="2025-09-12T23:01:28.279235818Z" level=info msg="Container 
45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:28.283451 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount632136693.mount: Deactivated successfully. Sep 12 23:01:28.407798 containerd[1563]: time="2025-09-12T23:01:28.407735608Z" level=info msg="CreateContainer within sandbox \"48002ebed7f0251d4e8dc7bfe8ade093e9858a97c322505271b0ab39f5042fac\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\"" Sep 12 23:01:28.415947 containerd[1563]: time="2025-09-12T23:01:28.415748938Z" level=info msg="StartContainer for \"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\"" Sep 12 23:01:28.433817 containerd[1563]: time="2025-09-12T23:01:28.433760034Z" level=info msg="connecting to shim 45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f" address="unix:///run/containerd/s/6be7c9720bca30298a62d56055e65f0e9273cb742612e377348c50f201cf379e" protocol=ttrpc version=3 Sep 12 23:01:28.549348 systemd[1]: Started cri-containerd-45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f.scope - libcontainer container 45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f. Sep 12 23:01:28.645266 containerd[1563]: time="2025-09-12T23:01:28.645167127Z" level=info msg="StartContainer for \"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" returns successfully" Sep 12 23:01:28.772544 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 23:01:28.773582 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld. All Rights Reserved.
Sep 12 23:01:29.022072 kubelet[2763]: I0912 23:01:29.021831 2763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5dl7\" (UniqueName: \"kubernetes.io/projected/78a7b686-9379-4d2e-b60a-84ef6a14d85c-kube-api-access-q5dl7\") pod \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\" (UID: \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\") " Sep 12 23:01:29.022072 kubelet[2763]: I0912 23:01:29.021879 2763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-ca-bundle\") pod \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\" (UID: \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\") " Sep 12 23:01:29.022072 kubelet[2763]: I0912 23:01:29.021908 2763 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-backend-key-pair\") pod \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\" (UID: \"78a7b686-9379-4d2e-b60a-84ef6a14d85c\") " Sep 12 23:01:29.028443 kubelet[2763]: I0912 23:01:29.028372 2763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "78a7b686-9379-4d2e-b60a-84ef6a14d85c" (UID: "78a7b686-9379-4d2e-b60a-84ef6a14d85c"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 23:01:29.028880 systemd[1]: var-lib-kubelet-pods-78a7b686\x2d9379\x2d4d2e\x2db60a\x2d84ef6a14d85c-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 12 23:01:29.032706 kubelet[2763]: I0912 23:01:29.032641 2763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "78a7b686-9379-4d2e-b60a-84ef6a14d85c" (UID: "78a7b686-9379-4d2e-b60a-84ef6a14d85c"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 23:01:29.032874 kubelet[2763]: I0912 23:01:29.032858 2763 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78a7b686-9379-4d2e-b60a-84ef6a14d85c-kube-api-access-q5dl7" (OuterVolumeSpecName: "kube-api-access-q5dl7") pod "78a7b686-9379-4d2e-b60a-84ef6a14d85c" (UID: "78a7b686-9379-4d2e-b60a-84ef6a14d85c"). InnerVolumeSpecName "kube-api-access-q5dl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 23:01:29.033948 systemd[1]: var-lib-kubelet-pods-78a7b686\x2d9379\x2d4d2e\x2db60a\x2d84ef6a14d85c-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dq5dl7.mount: Deactivated successfully. 
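The `\x2d` and `\x7e` sequences in the mount-unit names above (`run-netns-cni\x2d...`, `kubernetes.io\x7esecret`) are systemd's path escaping for unit names: `/` maps to `-`, and bytes outside the allowed set — including `-` itself and `~` — are hex-escaped. A rough reimplementation under a simplified rule set (the real `systemd-escape` handles a few more corner cases, e.g. empty paths and the root directory):

```python
def systemd_escape_path(path: str) -> str:
    """Approximate systemd path escaping: trim slashes, map '/' to '-',
    keep alphanumerics plus '_' and ':' (and '.' when not leading),
    hex-escape everything else as \\xNN."""
    path = path.strip("/")
    out = []
    for i, ch in enumerate(path):
        if ch == "/":
            out.append("-")
        elif ch.isalnum() or ch in "_:" or (ch == "." and i != 0):
            out.append(ch)
        else:
            out.append("\\x%02x" % ord(ch))
    return "".join(out)

# The netns mount unit from the log: each '-' in the CNI id becomes \x2d.
print(systemd_escape_path("/run/netns/cni-839df6ee-6db0-ef6a-5f63-bf405607a0bd"))
```

Reversing the escaping (reading `\x2d` back as `-`, `\x7e` as `~`) is how the mount units in the log map back to paths like `/var/lib/kubelet/pods/<uid>/volumes/kubernetes.io~secret/whisker-backend-key-pair`.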
Sep 12 23:01:29.122388 kubelet[2763]: I0912 23:01:29.122333 2763 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5dl7\" (UniqueName: \"kubernetes.io/projected/78a7b686-9379-4d2e-b60a-84ef6a14d85c-kube-api-access-q5dl7\") on node \"ci-4459-0-0-5-bd33a49fb7\" DevicePath \"\"" Sep 12 23:01:29.122388 kubelet[2763]: I0912 23:01:29.122383 2763 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-ca-bundle\") on node \"ci-4459-0-0-5-bd33a49fb7\" DevicePath \"\"" Sep 12 23:01:29.122388 kubelet[2763]: I0912 23:01:29.122398 2763 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/78a7b686-9379-4d2e-b60a-84ef6a14d85c-whisker-backend-key-pair\") on node \"ci-4459-0-0-5-bd33a49fb7\" DevicePath \"\"" Sep 12 23:01:29.495879 systemd[1]: Removed slice kubepods-besteffort-pod78a7b686_9379_4d2e_b60a_84ef6a14d85c.slice - libcontainer container kubepods-besteffort-pod78a7b686_9379_4d2e_b60a_84ef6a14d85c.slice. Sep 12 23:01:29.528028 kubelet[2763]: I0912 23:01:29.524745 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-cj6rj" podStartSLOduration=2.499771104 podStartE2EDuration="29.524195293s" podCreationTimestamp="2025-09-12 23:01:00 +0000 UTC" firstStartedPulling="2025-09-12 23:01:01.10625903 +0000 UTC m=+21.079413469" lastFinishedPulling="2025-09-12 23:01:28.130683218 +0000 UTC m=+48.103837658" observedRunningTime="2025-09-12 23:01:29.52297578 +0000 UTC m=+49.496130259" watchObservedRunningTime="2025-09-12 23:01:29.524195293 +0000 UTC m=+49.497349762" Sep 12 23:01:29.602031 systemd[1]: Created slice kubepods-besteffort-podead74c0d_e9e3_4456_9a26_53964cef12f8.slice - libcontainer container kubepods-besteffort-podead74c0d_e9e3_4456_9a26_53964cef12f8.slice. 
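The startup-latency record above is internally consistent: `podStartE2EDuration` is `observedRunningTime` minus `podCreationTimestamp`, and `podStartSLOduration` appears to be that figure minus the image-pull window (`firstStartedPulling` to `lastFinishedPulling`), matching the Kubernetes pod-startup SLI, which excludes pull time. A quick cross-check with the wall-clock timestamps from the record, truncated to microseconds (the `m=+...` monotonic offsets are dropped):

```python
from datetime import datetime

fmt = "%Y-%m-%d %H:%M:%S.%f %z"
created  = datetime.strptime("2025-09-12 23:01:00.000000 +0000", fmt)
pull_beg = datetime.strptime("2025-09-12 23:01:01.106259 +0000", fmt)
pull_end = datetime.strptime("2025-09-12 23:01:28.130683 +0000", fmt)
running  = datetime.strptime("2025-09-12 23:01:29.524195 +0000", fmt)

e2e  = (running - created).total_seconds()    # podStartE2EDuration, ~29.524s
pull = (pull_end - pull_beg).total_seconds()  # image pull window, ~27.024s
slo  = e2e - pull                             # podStartSLOduration, ~2.500s

print(f"e2e={e2e:.3f}s pull={pull:.3f}s slo={slo:.3f}s")
```

The 27-second pull dominates: calico-node was created at 23:01:00 but only started at 23:01:28, which is exactly the window during which the sandbox-creation failures above kept recurring.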
Sep 12 23:01:29.627570 kubelet[2763]: I0912 23:01:29.627414 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ead74c0d-e9e3-4456-9a26-53964cef12f8-whisker-ca-bundle\") pod \"whisker-75bf848f8-f6754\" (UID: \"ead74c0d-e9e3-4456-9a26-53964cef12f8\") " pod="calico-system/whisker-75bf848f8-f6754" Sep 12 23:01:29.627570 kubelet[2763]: I0912 23:01:29.627535 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ead74c0d-e9e3-4456-9a26-53964cef12f8-whisker-backend-key-pair\") pod \"whisker-75bf848f8-f6754\" (UID: \"ead74c0d-e9e3-4456-9a26-53964cef12f8\") " pod="calico-system/whisker-75bf848f8-f6754" Sep 12 23:01:29.628252 kubelet[2763]: I0912 23:01:29.627603 2763 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhw27\" (UniqueName: \"kubernetes.io/projected/ead74c0d-e9e3-4456-9a26-53964cef12f8-kube-api-access-zhw27\") pod \"whisker-75bf848f8-f6754\" (UID: \"ead74c0d-e9e3-4456-9a26-53964cef12f8\") " pod="calico-system/whisker-75bf848f8-f6754" Sep 12 23:01:29.911382 containerd[1563]: time="2025-09-12T23:01:29.911271887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-75bf848f8-f6754,Uid:ead74c0d-e9e3-4456-9a26-53964cef12f8,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:30.153085 kubelet[2763]: I0912 23:01:30.152925 2763 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78a7b686-9379-4d2e-b60a-84ef6a14d85c" path="/var/lib/kubelet/pods/78a7b686-9379-4d2e-b60a-84ef6a14d85c/volumes" Sep 12 23:01:30.368339 systemd-networkd[1470]: caliefd1f7c88c5: Link UP Sep 12 23:01:30.368470 systemd-networkd[1470]: caliefd1f7c88c5: Gained carrier Sep 12 23:01:30.391223 containerd[1563]: 2025-09-12 23:01:29.962 [INFO][3885] cni-plugin/utils.go 100: File /var/lib/calico/mtu 
does not exist Sep 12 23:01:30.391223 containerd[1563]: 2025-09-12 23:01:30.009 [INFO][3885] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0 whisker-75bf848f8- calico-system ead74c0d-e9e3-4456-9a26-53964cef12f8 892 0 2025-09-12 23:01:29 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:75bf848f8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 whisker-75bf848f8-f6754 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliefd1f7c88c5 [] [] }} ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-" Sep 12 23:01:30.391223 containerd[1563]: 2025-09-12 23:01:30.009 [INFO][3885] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" Sep 12 23:01:30.391223 containerd[1563]: 2025-09-12 23:01:30.294 [INFO][3893] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" HandleID="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.297 [INFO][3893] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" HandleID="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" 
Workload="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b8340), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-5-bd33a49fb7", "pod":"whisker-75bf848f8-f6754", "timestamp":"2025-09-12 23:01:30.294032801 +0000 UTC"}, Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.297 [INFO][3893] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.297 [INFO][3893] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.298 [INFO][3893] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.311 [INFO][3893] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.322 [INFO][3893] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.330 [INFO][3893] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.332 [INFO][3893] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.391625 containerd[1563]: 2025-09-12 23:01:30.335 [INFO][3893] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.394795 
containerd[1563]: 2025-09-12 23:01:30.335 [INFO][3893] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.394795 containerd[1563]: 2025-09-12 23:01:30.337 [INFO][3893] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a Sep 12 23:01:30.394795 containerd[1563]: 2025-09-12 23:01:30.342 [INFO][3893] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.394795 containerd[1563]: 2025-09-12 23:01:30.349 [INFO][3893] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.65/26] block=192.168.116.64/26 handle="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.394795 containerd[1563]: 2025-09-12 23:01:30.349 [INFO][3893] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.65/26] handle="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:30.394795 containerd[1563]: 2025-09-12 23:01:30.349 [INFO][3893] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:01:30.394795 containerd[1563]: 2025-09-12 23:01:30.349 [INFO][3893] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.65/26] IPv6=[] ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" HandleID="k8s-pod-network.f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" Sep 12 23:01:30.395181 containerd[1563]: 2025-09-12 23:01:30.352 [INFO][3885] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0", GenerateName:"whisker-75bf848f8-", Namespace:"calico-system", SelfLink:"", UID:"ead74c0d-e9e3-4456-9a26-53964cef12f8", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"75bf848f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"whisker-75bf848f8-f6754", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.116.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"caliefd1f7c88c5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:30.395181 containerd[1563]: 2025-09-12 23:01:30.353 [INFO][3885] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.65/32] ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" Sep 12 23:01:30.395331 containerd[1563]: 2025-09-12 23:01:30.353 [INFO][3885] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliefd1f7c88c5 ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" Sep 12 23:01:30.395331 containerd[1563]: 2025-09-12 23:01:30.362 [INFO][3885] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" Sep 12 23:01:30.395393 containerd[1563]: 2025-09-12 23:01:30.366 [INFO][3885] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0", GenerateName:"whisker-75bf848f8-", Namespace:"calico-system", SelfLink:"", 
UID:"ead74c0d-e9e3-4456-9a26-53964cef12f8", ResourceVersion:"892", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"75bf848f8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a", Pod:"whisker-75bf848f8-f6754", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.116.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliefd1f7c88c5", MAC:"72:b7:af:d2:7f:fb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:30.395513 containerd[1563]: 2025-09-12 23:01:30.383 [INFO][3885] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" Namespace="calico-system" Pod="whisker-75bf848f8-f6754" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-whisker--75bf848f8--f6754-eth0" Sep 12 23:01:30.550968 containerd[1563]: time="2025-09-12T23:01:30.550881474Z" level=info msg="connecting to shim f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a" address="unix:///run/containerd/s/7b575fe761fad43d645cb68dac69e2df1a5bf39cca2d582326ad9a50b8f79e54" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:30.609381 systemd[1]: Started 
cri-containerd-f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a.scope - libcontainer container f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a. Sep 12 23:01:30.733200 containerd[1563]: time="2025-09-12T23:01:30.733163755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-75bf848f8-f6754,Uid:ead74c0d-e9e3-4456-9a26-53964cef12f8,Namespace:calico-system,Attempt:0,} returns sandbox id \"f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a\"" Sep 12 23:01:30.734762 containerd[1563]: time="2025-09-12T23:01:30.734689951Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 23:01:30.790218 containerd[1563]: time="2025-09-12T23:01:30.790171956Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" id:\"f3d55594eaecf58fdcd8bfb60d23750d5da8a5d1a944323bff6bb39f88094d28\" pid:4050 exit_status:1 exited_at:{seconds:1757718090 nanos:777514721}" Sep 12 23:01:31.027197 systemd-networkd[1470]: vxlan.calico: Link UP Sep 12 23:01:31.027204 systemd-networkd[1470]: vxlan.calico: Gained carrier Sep 12 23:01:31.144403 containerd[1563]: time="2025-09-12T23:01:31.144364725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579768695b-96v9q,Uid:8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:31.145723 containerd[1563]: time="2025-09-12T23:01:31.145686850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-4m2kd,Uid:d0b02990-63aa-436e-8d45-83a2fb9fb79e,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:31.307378 systemd-networkd[1470]: calidc7c79831f4: Link UP Sep 12 23:01:31.312134 systemd-networkd[1470]: calidc7c79831f4: Gained carrier Sep 12 23:01:31.332784 containerd[1563]: 2025-09-12 23:01:31.218 [INFO][4137] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0 calico-apiserver-7fcf7b468- calico-apiserver d0b02990-63aa-436e-8d45-83a2fb9fb79e 826 0 2025-09-12 23:00:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fcf7b468 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 calico-apiserver-7fcf7b468-4m2kd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidc7c79831f4 [] [] }} ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-" Sep 12 23:01:31.332784 containerd[1563]: 2025-09-12 23:01:31.219 [INFO][4137] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" Sep 12 23:01:31.332784 containerd[1563]: 2025-09-12 23:01:31.260 [INFO][4157] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" HandleID="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.261 [INFO][4157] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" HandleID="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" 
Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000303960), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-5-bd33a49fb7", "pod":"calico-apiserver-7fcf7b468-4m2kd", "timestamp":"2025-09-12 23:01:31.26091203 +0000 UTC"}, Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.261 [INFO][4157] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.261 [INFO][4157] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.261 [INFO][4157] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.267 [INFO][4157] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.271 [INFO][4157] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.279 [INFO][4157] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.282 [INFO][4157] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.333216 containerd[1563]: 2025-09-12 23:01:31.284 [INFO][4157] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" 
Sep 12 23:01:31.334229 containerd[1563]: 2025-09-12 23:01:31.284 [INFO][4157] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.334229 containerd[1563]: 2025-09-12 23:01:31.285 [INFO][4157] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6 Sep 12 23:01:31.334229 containerd[1563]: 2025-09-12 23:01:31.289 [INFO][4157] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.334229 containerd[1563]: 2025-09-12 23:01:31.295 [INFO][4157] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.66/26] block=192.168.116.64/26 handle="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.334229 containerd[1563]: 2025-09-12 23:01:31.295 [INFO][4157] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.66/26] handle="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.334229 containerd[1563]: 2025-09-12 23:01:31.295 [INFO][4157] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:01:31.334229 containerd[1563]: 2025-09-12 23:01:31.296 [INFO][4157] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.66/26] IPv6=[] ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" HandleID="k8s-pod-network.943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" Sep 12 23:01:31.334430 containerd[1563]: 2025-09-12 23:01:31.304 [INFO][4137] cni-plugin/k8s.go 418: Populated endpoint ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0", GenerateName:"calico-apiserver-7fcf7b468-", Namespace:"calico-apiserver", SelfLink:"", UID:"d0b02990-63aa-436e-8d45-83a2fb9fb79e", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fcf7b468", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"calico-apiserver-7fcf7b468-4m2kd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.116.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidc7c79831f4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:31.334518 containerd[1563]: 2025-09-12 23:01:31.304 [INFO][4137] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.66/32] ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" Sep 12 23:01:31.334518 containerd[1563]: 2025-09-12 23:01:31.304 [INFO][4137] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc7c79831f4 ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" Sep 12 23:01:31.334518 containerd[1563]: 2025-09-12 23:01:31.312 [INFO][4137] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" Sep 12 23:01:31.334645 containerd[1563]: 2025-09-12 23:01:31.314 [INFO][4137] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0", GenerateName:"calico-apiserver-7fcf7b468-", Namespace:"calico-apiserver", SelfLink:"", UID:"d0b02990-63aa-436e-8d45-83a2fb9fb79e", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fcf7b468", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6", Pod:"calico-apiserver-7fcf7b468-4m2kd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidc7c79831f4", MAC:"42:76:d6:e9:64:4b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:31.334722 containerd[1563]: 2025-09-12 23:01:31.329 [INFO][4137] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-4m2kd" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--4m2kd-eth0" Sep 12 23:01:31.359396 containerd[1563]: time="2025-09-12T23:01:31.358183539Z" level=info 
msg="connecting to shim 943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6" address="unix:///run/containerd/s/bb334c066a8c530b4c0492fa4ac8742b665a14f41a68b5a6799f886d8e8b4dfb" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:31.411756 systemd[1]: Started cri-containerd-943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6.scope - libcontainer container 943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6. Sep 12 23:01:31.448392 systemd-networkd[1470]: calic912c5c90dd: Link UP Sep 12 23:01:31.449741 systemd-networkd[1470]: calic912c5c90dd: Gained carrier Sep 12 23:01:31.493201 containerd[1563]: 2025-09-12 23:01:31.223 [INFO][4131] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0 calico-kube-controllers-579768695b- calico-system 8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda 823 0 2025-09-12 23:01:01 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:579768695b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 calico-kube-controllers-579768695b-96v9q eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic912c5c90dd [] [] }} ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-" Sep 12 23:01:31.493201 containerd[1563]: 2025-09-12 23:01:31.224 [INFO][4131] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" 
WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" Sep 12 23:01:31.493201 containerd[1563]: 2025-09-12 23:01:31.280 [INFO][4162] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" HandleID="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.281 [INFO][4162] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" HandleID="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00023b910), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-5-bd33a49fb7", "pod":"calico-kube-controllers-579768695b-96v9q", "timestamp":"2025-09-12 23:01:31.280592083 +0000 UTC"}, Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.281 [INFO][4162] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.295 [INFO][4162] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.296 [INFO][4162] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.372 [INFO][4162] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.385 [INFO][4162] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.395 [INFO][4162] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.399 [INFO][4162] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.493749 containerd[1563]: 2025-09-12 23:01:31.403 [INFO][4162] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.494268 containerd[1563]: 2025-09-12 23:01:31.403 [INFO][4162] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.494268 containerd[1563]: 2025-09-12 23:01:31.408 [INFO][4162] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d Sep 12 23:01:31.494268 containerd[1563]: 2025-09-12 23:01:31.415 [INFO][4162] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.494268 containerd[1563]: 2025-09-12 23:01:31.436 [INFO][4162] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.116.67/26] block=192.168.116.64/26 handle="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.494268 containerd[1563]: 2025-09-12 23:01:31.437 [INFO][4162] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.67/26] handle="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:31.494268 containerd[1563]: 2025-09-12 23:01:31.437 [INFO][4162] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:31.494268 containerd[1563]: 2025-09-12 23:01:31.437 [INFO][4162] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.67/26] IPv6=[] ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" HandleID="k8s-pod-network.9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" Sep 12 23:01:31.494578 containerd[1563]: 2025-09-12 23:01:31.443 [INFO][4131] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0", GenerateName:"calico-kube-controllers-579768695b-", Namespace:"calico-system", SelfLink:"", UID:"8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"579768695b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"calico-kube-controllers-579768695b-96v9q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic912c5c90dd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:31.494642 containerd[1563]: 2025-09-12 23:01:31.443 [INFO][4131] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.67/32] ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" Sep 12 23:01:31.494642 containerd[1563]: 2025-09-12 23:01:31.443 [INFO][4131] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic912c5c90dd ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" Sep 12 23:01:31.494642 containerd[1563]: 2025-09-12 23:01:31.450 [INFO][4131] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" Sep 12 23:01:31.494703 containerd[1563]: 2025-09-12 23:01:31.458 [INFO][4131] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0", GenerateName:"calico-kube-controllers-579768695b-", Namespace:"calico-system", SelfLink:"", UID:"8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"579768695b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d", Pod:"calico-kube-controllers-579768695b-96v9q", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.116.67/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic912c5c90dd", MAC:"52:e1:88:48:1b:ae", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:31.494750 containerd[1563]: 2025-09-12 23:01:31.485 [INFO][4131] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" Namespace="calico-system" Pod="calico-kube-controllers-579768695b-96v9q" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--kube--controllers--579768695b--96v9q-eth0" Sep 12 23:01:31.537451 containerd[1563]: time="2025-09-12T23:01:31.537203327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-4m2kd,Uid:d0b02990-63aa-436e-8d45-83a2fb9fb79e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6\"" Sep 12 23:01:31.542919 containerd[1563]: time="2025-09-12T23:01:31.542624345Z" level=info msg="connecting to shim 9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d" address="unix:///run/containerd/s/65176e43e9e5f9509be7c0ce341afe0cb63dc614aba7f2e567b9627e9130a241" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:31.577215 systemd[1]: Started cri-containerd-9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d.scope - libcontainer container 9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d. 
Sep 12 23:01:31.634440 containerd[1563]: time="2025-09-12T23:01:31.634400957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-579768695b-96v9q,Uid:8820a8d0-00eb-40d2-b1fe-1ffca3a1eeda,Namespace:calico-system,Attempt:0,} returns sandbox id \"9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d\"" Sep 12 23:01:31.721566 containerd[1563]: time="2025-09-12T23:01:31.721521243Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" id:\"21f3a97eb3647cd8dc6e2172531fa1c3e75f9e94d73de5e3d5247d2b95df7925\" pid:4263 exit_status:1 exited_at:{seconds:1757718091 nanos:721181617}" Sep 12 23:01:32.043758 systemd-networkd[1470]: caliefd1f7c88c5: Gained IPv6LL Sep 12 23:01:32.811014 systemd-networkd[1470]: vxlan.calico: Gained IPv6LL Sep 12 23:01:32.874337 systemd-networkd[1470]: calic912c5c90dd: Gained IPv6LL Sep 12 23:01:33.145994 containerd[1563]: time="2025-09-12T23:01:33.145436578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-ml5pp,Uid:fe96040e-cb83-4d2d-acce-421f62e4c4db,Namespace:calico-apiserver,Attempt:0,}" Sep 12 23:01:33.145994 containerd[1563]: time="2025-09-12T23:01:33.145463358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vds8s,Uid:dc787179-cf7b-486b-bfa0-8a2ffd9994c2,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:33.264920 systemd-networkd[1470]: calidc7c79831f4: Gained IPv6LL Sep 12 23:01:33.351675 systemd-networkd[1470]: cali1c7b0315a5a: Link UP Sep 12 23:01:33.353957 systemd-networkd[1470]: cali1c7b0315a5a: Gained carrier Sep 12 23:01:33.372548 containerd[1563]: 2025-09-12 23:01:33.247 [INFO][4340] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0 coredns-7c65d6cfc9- kube-system dc787179-cf7b-486b-bfa0-8a2ffd9994c2 822 0 2025-09-12 23:00:46 
+0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 coredns-7c65d6cfc9-vds8s eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1c7b0315a5a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-" Sep 12 23:01:33.372548 containerd[1563]: 2025-09-12 23:01:33.248 [INFO][4340] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" Sep 12 23:01:33.372548 containerd[1563]: 2025-09-12 23:01:33.285 [INFO][4359] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" HandleID="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.286 [INFO][4359] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" HandleID="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-5-bd33a49fb7", "pod":"coredns-7c65d6cfc9-vds8s", "timestamp":"2025-09-12 23:01:33.285543153 +0000 UTC"}, 
Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.286 [INFO][4359] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.286 [INFO][4359] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.286 [INFO][4359] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.295 [INFO][4359] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.302 [INFO][4359] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.310 [INFO][4359] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.312 [INFO][4359] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372731 containerd[1563]: 2025-09-12 23:01:33.319 [INFO][4359] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372918 containerd[1563]: 2025-09-12 23:01:33.319 [INFO][4359] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372918 containerd[1563]: 2025-09-12 23:01:33.322 
[INFO][4359] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf Sep 12 23:01:33.372918 containerd[1563]: 2025-09-12 23:01:33.330 [INFO][4359] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372918 containerd[1563]: 2025-09-12 23:01:33.339 [INFO][4359] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.68/26] block=192.168.116.64/26 handle="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372918 containerd[1563]: 2025-09-12 23:01:33.339 [INFO][4359] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.68/26] handle="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.372918 containerd[1563]: 2025-09-12 23:01:33.340 [INFO][4359] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:01:33.372918 containerd[1563]: 2025-09-12 23:01:33.340 [INFO][4359] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.68/26] IPv6=[] ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" HandleID="k8s-pod-network.f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" Sep 12 23:01:33.373493 containerd[1563]: 2025-09-12 23:01:33.349 [INFO][4340] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"dc787179-cf7b-486b-bfa0-8a2ffd9994c2", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"coredns-7c65d6cfc9-vds8s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali1c7b0315a5a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:33.373493 containerd[1563]: 2025-09-12 23:01:33.349 [INFO][4340] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.68/32] ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" Sep 12 23:01:33.373493 containerd[1563]: 2025-09-12 23:01:33.349 [INFO][4340] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1c7b0315a5a ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" Sep 12 23:01:33.373493 containerd[1563]: 2025-09-12 23:01:33.354 [INFO][4340] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" Sep 12 23:01:33.373493 containerd[1563]: 2025-09-12 23:01:33.355 [INFO][4340] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" 
WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"dc787179-cf7b-486b-bfa0-8a2ffd9994c2", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf", Pod:"coredns-7c65d6cfc9-vds8s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1c7b0315a5a", MAC:"96:fe:d8:0a:83:9f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:33.373493 
containerd[1563]: 2025-09-12 23:01:33.368 [INFO][4340] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" Namespace="kube-system" Pod="coredns-7c65d6cfc9-vds8s" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--vds8s-eth0" Sep 12 23:01:33.420225 containerd[1563]: time="2025-09-12T23:01:33.419276979Z" level=info msg="connecting to shim f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf" address="unix:///run/containerd/s/9883888d7fca462e53b6bf2285c47ac801817b05c7a728f7627762439ac11bbb" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:33.461576 systemd[1]: Started cri-containerd-f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf.scope - libcontainer container f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf. Sep 12 23:01:33.462670 systemd-networkd[1470]: cali79bb082ef1d: Link UP Sep 12 23:01:33.462789 systemd-networkd[1470]: cali79bb082ef1d: Gained carrier Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.299 [INFO][4335] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0 calico-apiserver-7fcf7b468- calico-apiserver fe96040e-cb83-4d2d-acce-421f62e4c4db 816 0 2025-09-12 23:00:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fcf7b468 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 calico-apiserver-7fcf7b468-ml5pp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali79bb082ef1d [] [] }} ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" 
WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.299 [INFO][4335] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.339 [INFO][4367] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" HandleID="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.340 [INFO][4367] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" HandleID="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd850), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-5-bd33a49fb7", "pod":"calico-apiserver-7fcf7b468-ml5pp", "timestamp":"2025-09-12 23:01:33.339722724 +0000 UTC"}, Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.340 [INFO][4367] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.340 [INFO][4367] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.340 [INFO][4367] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.396 [INFO][4367] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.403 [INFO][4367] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.415 [INFO][4367] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.417 [INFO][4367] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.421 [INFO][4367] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.421 [INFO][4367] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.423 [INFO][4367] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.428 [INFO][4367] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" 
host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.443 [INFO][4367] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.69/26] block=192.168.116.64/26 handle="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.443 [INFO][4367] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.69/26] handle="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.443 [INFO][4367] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:33.492643 containerd[1563]: 2025-09-12 23:01:33.443 [INFO][4367] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.69/26] IPv6=[] ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" HandleID="k8s-pod-network.a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" Sep 12 23:01:33.493273 containerd[1563]: 2025-09-12 23:01:33.455 [INFO][4335] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0", GenerateName:"calico-apiserver-7fcf7b468-", Namespace:"calico-apiserver", SelfLink:"", UID:"fe96040e-cb83-4d2d-acce-421f62e4c4db", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 58, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fcf7b468", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"calico-apiserver-7fcf7b468-ml5pp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali79bb082ef1d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:33.493273 containerd[1563]: 2025-09-12 23:01:33.456 [INFO][4335] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.69/32] ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" Sep 12 23:01:33.493273 containerd[1563]: 2025-09-12 23:01:33.456 [INFO][4335] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79bb082ef1d ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" Sep 12 23:01:33.493273 containerd[1563]: 2025-09-12 23:01:33.462 [INFO][4335] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" Sep 12 23:01:33.493273 containerd[1563]: 2025-09-12 23:01:33.463 [INFO][4335] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0", GenerateName:"calico-apiserver-7fcf7b468-", Namespace:"calico-apiserver", SelfLink:"", UID:"fe96040e-cb83-4d2d-acce-421f62e4c4db", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fcf7b468", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e", Pod:"calico-apiserver-7fcf7b468-ml5pp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.116.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali79bb082ef1d", MAC:"5a:e4:c9:59:6b:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:33.493273 containerd[1563]: 2025-09-12 23:01:33.487 [INFO][4335] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" Namespace="calico-apiserver" Pod="calico-apiserver-7fcf7b468-ml5pp" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-calico--apiserver--7fcf7b468--ml5pp-eth0" Sep 12 23:01:33.526877 containerd[1563]: time="2025-09-12T23:01:33.525843023Z" level=info msg="connecting to shim a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e" address="unix:///run/containerd/s/f7aa377d323b5e3add271e7104210a5baddc14b0df6f41cb31d2982fc6ae09a0" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:33.554335 systemd[1]: Started cri-containerd-a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e.scope - libcontainer container a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e. 
Sep 12 23:01:33.570264 containerd[1563]: time="2025-09-12T23:01:33.570219372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-vds8s,Uid:dc787179-cf7b-486b-bfa0-8a2ffd9994c2,Namespace:kube-system,Attempt:0,} returns sandbox id \"f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf\"" Sep 12 23:01:33.574438 containerd[1563]: time="2025-09-12T23:01:33.574399959Z" level=info msg="CreateContainer within sandbox \"f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:01:33.593673 containerd[1563]: time="2025-09-12T23:01:33.593135924Z" level=info msg="Container 7e225741f39e4f86f680719ddca17ca49dd12e500c3671529991697b7e0c00c0: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:33.600805 containerd[1563]: time="2025-09-12T23:01:33.600227731Z" level=info msg="CreateContainer within sandbox \"f305f86c60b253b55ec4ef1d5b4e2d9298fea7e76423488766750e05b9ece5cf\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7e225741f39e4f86f680719ddca17ca49dd12e500c3671529991697b7e0c00c0\"" Sep 12 23:01:33.601232 containerd[1563]: time="2025-09-12T23:01:33.601209399Z" level=info msg="StartContainer for \"7e225741f39e4f86f680719ddca17ca49dd12e500c3671529991697b7e0c00c0\"" Sep 12 23:01:33.602270 containerd[1563]: time="2025-09-12T23:01:33.602240559Z" level=info msg="connecting to shim 7e225741f39e4f86f680719ddca17ca49dd12e500c3671529991697b7e0c00c0" address="unix:///run/containerd/s/9883888d7fca462e53b6bf2285c47ac801817b05c7a728f7627762439ac11bbb" protocol=ttrpc version=3 Sep 12 23:01:33.638611 systemd[1]: Started cri-containerd-7e225741f39e4f86f680719ddca17ca49dd12e500c3671529991697b7e0c00c0.scope - libcontainer container 7e225741f39e4f86f680719ddca17ca49dd12e500c3671529991697b7e0c00c0. 
Sep 12 23:01:33.647457 containerd[1563]: time="2025-09-12T23:01:33.647417305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fcf7b468-ml5pp,Uid:fe96040e-cb83-4d2d-acce-421f62e4c4db,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e\"" Sep 12 23:01:33.676790 containerd[1563]: time="2025-09-12T23:01:33.676683295Z" level=info msg="StartContainer for \"7e225741f39e4f86f680719ddca17ca49dd12e500c3671529991697b7e0c00c0\" returns successfully" Sep 12 23:01:34.475247 systemd-networkd[1470]: cali1c7b0315a5a: Gained IPv6LL Sep 12 23:01:34.558657 kubelet[2763]: I0912 23:01:34.558576 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-vds8s" podStartSLOduration=48.558553723 podStartE2EDuration="48.558553723s" podCreationTimestamp="2025-09-12 23:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:34.531503762 +0000 UTC m=+54.504658200" watchObservedRunningTime="2025-09-12 23:01:34.558553723 +0000 UTC m=+54.531708162" Sep 12 23:01:34.568157 containerd[1563]: time="2025-09-12T23:01:34.567014883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:34.571203 containerd[1563]: time="2025-09-12T23:01:34.571164541Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 12 23:01:34.573396 containerd[1563]: time="2025-09-12T23:01:34.573248374Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:34.578465 containerd[1563]: time="2025-09-12T23:01:34.578407913Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:34.579344 containerd[1563]: time="2025-09-12T23:01:34.579133721Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 3.844419766s" Sep 12 23:01:34.579344 containerd[1563]: time="2025-09-12T23:01:34.579170370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 12 23:01:34.581004 containerd[1563]: time="2025-09-12T23:01:34.580671240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:01:34.583122 containerd[1563]: time="2025-09-12T23:01:34.582849477Z" level=info msg="CreateContainer within sandbox \"f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 23:01:34.596062 containerd[1563]: time="2025-09-12T23:01:34.594229553Z" level=info msg="Container c3d3678189db39249758fd9286a7a802fcd319be378b26777d35ac632f700d83: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:34.607887 containerd[1563]: time="2025-09-12T23:01:34.607677499Z" level=info msg="CreateContainer within sandbox \"f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"c3d3678189db39249758fd9286a7a802fcd319be378b26777d35ac632f700d83\"" Sep 12 23:01:34.610134 containerd[1563]: time="2025-09-12T23:01:34.609510971Z" level=info msg="StartContainer for 
\"c3d3678189db39249758fd9286a7a802fcd319be378b26777d35ac632f700d83\"" Sep 12 23:01:34.610849 containerd[1563]: time="2025-09-12T23:01:34.610821004Z" level=info msg="connecting to shim c3d3678189db39249758fd9286a7a802fcd319be378b26777d35ac632f700d83" address="unix:///run/containerd/s/7b575fe761fad43d645cb68dac69e2df1a5bf39cca2d582326ad9a50b8f79e54" protocol=ttrpc version=3 Sep 12 23:01:34.635450 systemd[1]: Started cri-containerd-c3d3678189db39249758fd9286a7a802fcd319be378b26777d35ac632f700d83.scope - libcontainer container c3d3678189db39249758fd9286a7a802fcd319be378b26777d35ac632f700d83. Sep 12 23:01:34.666791 systemd-networkd[1470]: cali79bb082ef1d: Gained IPv6LL Sep 12 23:01:34.688433 containerd[1563]: time="2025-09-12T23:01:34.688382220Z" level=info msg="StartContainer for \"c3d3678189db39249758fd9286a7a802fcd319be378b26777d35ac632f700d83\" returns successfully" Sep 12 23:01:36.145784 containerd[1563]: time="2025-09-12T23:01:36.145368710Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mbsv9,Uid:bf244edb-3fa8-4ebb-a022-bf4d9953e4eb,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:36.148116 containerd[1563]: time="2025-09-12T23:01:36.147311454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-745ct,Uid:f749bc25-4ea0-4be7-801c-efad8b6d5195,Namespace:kube-system,Attempt:0,}" Sep 12 23:01:36.148116 containerd[1563]: time="2025-09-12T23:01:36.147564452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-zwkck,Uid:1a615f0e-57a8-4819-ae03-9bae73eb3e3d,Namespace:calico-system,Attempt:0,}" Sep 12 23:01:36.325497 systemd-networkd[1470]: calia155f7f7720: Link UP Sep 12 23:01:36.328647 systemd-networkd[1470]: calia155f7f7720: Gained carrier Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.223 [INFO][4554] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0 
coredns-7c65d6cfc9- kube-system f749bc25-4ea0-4be7-801c-efad8b6d5195 820 0 2025-09-12 23:00:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 coredns-7c65d6cfc9-745ct eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia155f7f7720 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.224 [INFO][4554] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.264 [INFO][4589] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" HandleID="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.265 [INFO][4589] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" HandleID="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003323f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-5-bd33a49fb7", 
"pod":"coredns-7c65d6cfc9-745ct", "timestamp":"2025-09-12 23:01:36.264626282 +0000 UTC"}, Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.265 [INFO][4589] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.265 [INFO][4589] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.265 [INFO][4589] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.273 [INFO][4589] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.280 [INFO][4589] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.285 [INFO][4589] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.288 [INFO][4589] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.293 [INFO][4589] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.293 [INFO][4589] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" 
host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.295 [INFO][4589] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.301 [INFO][4589] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.314 [INFO][4589] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.70/26] block=192.168.116.64/26 handle="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.315 [INFO][4589] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.70/26] handle="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.315 [INFO][4589] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:01:36.355385 containerd[1563]: 2025-09-12 23:01:36.315 [INFO][4589] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.70/26] IPv6=[] ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" HandleID="k8s-pod-network.f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" Sep 12 23:01:36.358146 containerd[1563]: 2025-09-12 23:01:36.319 [INFO][4554] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f749bc25-4ea0-4be7-801c-efad8b6d5195", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"coredns-7c65d6cfc9-745ct", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calia155f7f7720", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:36.358146 containerd[1563]: 2025-09-12 23:01:36.320 [INFO][4554] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.70/32] ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" Sep 12 23:01:36.358146 containerd[1563]: 2025-09-12 23:01:36.321 [INFO][4554] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia155f7f7720 ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" Sep 12 23:01:36.358146 containerd[1563]: 2025-09-12 23:01:36.331 [INFO][4554] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" Sep 12 23:01:36.358146 containerd[1563]: 2025-09-12 23:01:36.334 [INFO][4554] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" 
WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f749bc25-4ea0-4be7-801c-efad8b6d5195", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 0, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc", Pod:"coredns-7c65d6cfc9-745ct", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.116.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia155f7f7720", MAC:"ae:6c:14:44:3e:91", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:36.358146 
containerd[1563]: 2025-09-12 23:01:36.349 [INFO][4554] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" Namespace="kube-system" Pod="coredns-7c65d6cfc9-745ct" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-coredns--7c65d6cfc9--745ct-eth0" Sep 12 23:01:36.405181 containerd[1563]: time="2025-09-12T23:01:36.404908775Z" level=info msg="connecting to shim f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc" address="unix:///run/containerd/s/3d7c5efe4ebb57fbfe4505bf74e88fa15e7efb07041d9bc53aa0d18988c01e48" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:36.431108 systemd-networkd[1470]: calid95f34ac1fb: Link UP Sep 12 23:01:36.431956 systemd-networkd[1470]: calid95f34ac1fb: Gained carrier Sep 12 23:01:36.449594 systemd[1]: Started cri-containerd-f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc.scope - libcontainer container f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc. 
Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.274 [INFO][4563] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0 csi-node-driver- calico-system bf244edb-3fa8-4ebb-a022-bf4d9953e4eb 689 0 2025-09-12 23:01:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 csi-node-driver-mbsv9 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid95f34ac1fb [] [] }} ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.275 [INFO][4563] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.313 [INFO][4600] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" HandleID="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.313 [INFO][4600] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" 
HandleID="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-5-bd33a49fb7", "pod":"csi-node-driver-mbsv9", "timestamp":"2025-09-12 23:01:36.312871742 +0000 UTC"}, Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.313 [INFO][4600] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.315 [INFO][4600] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.315 [INFO][4600] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.375 [INFO][4600] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.379 [INFO][4600] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.385 [INFO][4600] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.392 [INFO][4600] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.396 [INFO][4600] ipam/ipam.go 235: Affinity is confirmed and block has 
been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.396 [INFO][4600] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.399 [INFO][4600] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524 Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.405 [INFO][4600] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.418 [INFO][4600] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.116.71/26] block=192.168.116.64/26 handle="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.419 [INFO][4600] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.71/26] handle="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.419 [INFO][4600] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 23:01:36.463243 containerd[1563]: 2025-09-12 23:01:36.419 [INFO][4600] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.71/26] IPv6=[] ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" HandleID="k8s-pod-network.a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" Sep 12 23:01:36.463797 containerd[1563]: 2025-09-12 23:01:36.425 [INFO][4563] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bf244edb-3fa8-4ebb-a022-bf4d9953e4eb", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"csi-node-driver-mbsv9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.116.71/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid95f34ac1fb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:36.463797 containerd[1563]: 2025-09-12 23:01:36.425 [INFO][4563] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.71/32] ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" Sep 12 23:01:36.463797 containerd[1563]: 2025-09-12 23:01:36.425 [INFO][4563] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid95f34ac1fb ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" Sep 12 23:01:36.463797 containerd[1563]: 2025-09-12 23:01:36.432 [INFO][4563] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" Sep 12 23:01:36.463797 containerd[1563]: 2025-09-12 23:01:36.433 [INFO][4563] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bf244edb-3fa8-4ebb-a022-bf4d9953e4eb", ResourceVersion:"689", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524", Pod:"csi-node-driver-mbsv9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.116.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid95f34ac1fb", MAC:"d6:31:be:cc:24:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:36.463797 containerd[1563]: 2025-09-12 23:01:36.453 [INFO][4563] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" Namespace="calico-system" Pod="csi-node-driver-mbsv9" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-csi--node--driver--mbsv9-eth0" Sep 12 23:01:36.503098 containerd[1563]: time="2025-09-12T23:01:36.503000979Z" level=info msg="connecting to shim a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524" 
address="unix:///run/containerd/s/e53e3484b1debcd83d4b428ccaec82ac25548c87f1fcf18966dc4bcb7a58c55e" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:36.530493 containerd[1563]: time="2025-09-12T23:01:36.530443514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-745ct,Uid:f749bc25-4ea0-4be7-801c-efad8b6d5195,Namespace:kube-system,Attempt:0,} returns sandbox id \"f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc\"" Sep 12 23:01:36.534439 containerd[1563]: time="2025-09-12T23:01:36.534124018Z" level=info msg="CreateContainer within sandbox \"f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 23:01:36.550621 systemd-networkd[1470]: califdb2641539c: Link UP Sep 12 23:01:36.551313 systemd[1]: Started cri-containerd-a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524.scope - libcontainer container a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524. 
Sep 12 23:01:36.552652 systemd-networkd[1470]: califdb2641539c: Gained carrier Sep 12 23:01:36.564498 containerd[1563]: time="2025-09-12T23:01:36.564450536Z" level=info msg="Container 747676be640def425e752c9756c97aa5303c2bf6d1761dcb74e0974d2f4f6a7a: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:36.582106 containerd[1563]: time="2025-09-12T23:01:36.582069828Z" level=info msg="CreateContainer within sandbox \"f806b997ad6db245ab8d889334ab7cab22c312b65aac3be1d51dc5e95d25facc\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"747676be640def425e752c9756c97aa5303c2bf6d1761dcb74e0974d2f4f6a7a\"" Sep 12 23:01:36.586716 containerd[1563]: time="2025-09-12T23:01:36.586661101Z" level=info msg="StartContainer for \"747676be640def425e752c9756c97aa5303c2bf6d1761dcb74e0974d2f4f6a7a\"" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.274 [INFO][4567] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0 goldmane-7988f88666- calico-system 1a615f0e-57a8-4819-ae03-9bae73eb3e3d 825 0 2025-09-12 23:01:00 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-0-0-5-bd33a49fb7 goldmane-7988f88666-zwkck eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] califdb2641539c [] [] }} ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.274 [INFO][4567] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" 
Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.335 [INFO][4598] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" HandleID="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.336 [INFO][4598] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" HandleID="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e3f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-5-bd33a49fb7", "pod":"goldmane-7988f88666-zwkck", "timestamp":"2025-09-12 23:01:36.335224908 +0000 UTC"}, Hostname:"ci-4459-0-0-5-bd33a49fb7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.336 [INFO][4598] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.419 [INFO][4598] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.419 [INFO][4598] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-5-bd33a49fb7' Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.476 [INFO][4598] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.482 [INFO][4598] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.488 [INFO][4598] ipam/ipam.go 511: Trying affinity for 192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.492 [INFO][4598] ipam/ipam.go 158: Attempting to load block cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.498 [INFO][4598] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.116.64/26 host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.498 [INFO][4598] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.116.64/26 handle="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.501 [INFO][4598] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8 Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.513 [INFO][4598] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.116.64/26 handle="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.530 [INFO][4598] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.116.72/26] block=192.168.116.64/26 handle="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.531 [INFO][4598] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.116.72/26] handle="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" host="ci-4459-0-0-5-bd33a49fb7" Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.531 [INFO][4598] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 23:01:36.592898 containerd[1563]: 2025-09-12 23:01:36.531 [INFO][4598] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.116.72/26] IPv6=[] ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" HandleID="k8s-pod-network.967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Workload="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" Sep 12 23:01:36.593820 containerd[1563]: 2025-09-12 23:01:36.537 [INFO][4567] cni-plugin/k8s.go 418: Populated endpoint ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"1a615f0e-57a8-4819-ae03-9bae73eb3e3d", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"", Pod:"goldmane-7988f88666-zwkck", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.116.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califdb2641539c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:36.593820 containerd[1563]: 2025-09-12 23:01:36.539 [INFO][4567] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.116.72/32] ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" Sep 12 23:01:36.593820 containerd[1563]: 2025-09-12 23:01:36.539 [INFO][4567] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califdb2641539c ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" Sep 12 23:01:36.593820 containerd[1563]: 2025-09-12 23:01:36.561 [INFO][4567] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" Sep 12 23:01:36.593820 containerd[1563]: 2025-09-12 23:01:36.562 [INFO][4567] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"1a615f0e-57a8-4819-ae03-9bae73eb3e3d", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 23, 1, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-5-bd33a49fb7", ContainerID:"967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8", Pod:"goldmane-7988f88666-zwkck", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.116.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"califdb2641539c", MAC:"ae:d9:3f:7e:26:1e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 23:01:36.593820 containerd[1563]: 2025-09-12 23:01:36.576 [INFO][4567] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" Namespace="calico-system" Pod="goldmane-7988f88666-zwkck" WorkloadEndpoint="ci--4459--0--0--5--bd33a49fb7-k8s-goldmane--7988f88666--zwkck-eth0" Sep 12 23:01:36.608076 containerd[1563]: time="2025-09-12T23:01:36.607581922Z" level=info msg="connecting to shim 747676be640def425e752c9756c97aa5303c2bf6d1761dcb74e0974d2f4f6a7a" address="unix:///run/containerd/s/3d7c5efe4ebb57fbfe4505bf74e88fa15e7efb07041d9bc53aa0d18988c01e48" protocol=ttrpc version=3 Sep 12 23:01:36.628875 containerd[1563]: time="2025-09-12T23:01:36.628839459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mbsv9,Uid:bf244edb-3fa8-4ebb-a022-bf4d9953e4eb,Namespace:calico-system,Attempt:0,} returns sandbox id \"a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524\"" Sep 12 23:01:36.636456 systemd[1]: Started cri-containerd-747676be640def425e752c9756c97aa5303c2bf6d1761dcb74e0974d2f4f6a7a.scope - libcontainer container 747676be640def425e752c9756c97aa5303c2bf6d1761dcb74e0974d2f4f6a7a. Sep 12 23:01:36.652016 containerd[1563]: time="2025-09-12T23:01:36.651963671Z" level=info msg="connecting to shim 967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8" address="unix:///run/containerd/s/6be6cd454c560ebe28f2d92d6cdc9890301981c3bbaf90cfd59cb2776e93c995" namespace=k8s.io protocol=ttrpc version=3 Sep 12 23:01:36.694341 systemd[1]: Started cri-containerd-967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8.scope - libcontainer container 967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8. 
Sep 12 23:01:36.711767 containerd[1563]: time="2025-09-12T23:01:36.711715070Z" level=info msg="StartContainer for \"747676be640def425e752c9756c97aa5303c2bf6d1761dcb74e0974d2f4f6a7a\" returns successfully" Sep 12 23:01:36.797117 containerd[1563]: time="2025-09-12T23:01:36.797072127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-zwkck,Uid:1a615f0e-57a8-4819-ae03-9bae73eb3e3d,Namespace:calico-system,Attempt:0,} returns sandbox id \"967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8\"" Sep 12 23:01:37.225585 containerd[1563]: time="2025-09-12T23:01:37.225525575Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" id:\"851ab569489310d284351ee34fc2646ef74c8ab9322ca0fc1aee43aa3f33a7a9\" pid:4831 exited_at:{seconds:1757718097 nanos:224137512}" Sep 12 23:01:37.571273 kubelet[2763]: I0912 23:01:37.570717 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-745ct" podStartSLOduration=51.570696506 podStartE2EDuration="51.570696506s" podCreationTimestamp="2025-09-12 23:00:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 23:01:37.550930618 +0000 UTC m=+57.524085058" watchObservedRunningTime="2025-09-12 23:01:37.570696506 +0000 UTC m=+57.543850935" Sep 12 23:01:37.866399 systemd-networkd[1470]: calia155f7f7720: Gained IPv6LL Sep 12 23:01:38.064228 containerd[1563]: time="2025-09-12T23:01:38.064182826Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:38.065406 containerd[1563]: time="2025-09-12T23:01:38.065331358Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 12 23:01:38.066963 containerd[1563]: 
time="2025-09-12T23:01:38.066926193Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:38.069217 containerd[1563]: time="2025-09-12T23:01:38.069156656Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:38.069962 containerd[1563]: time="2025-09-12T23:01:38.069931922Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.489232792s" Sep 12 23:01:38.070235 containerd[1563]: time="2025-09-12T23:01:38.070202475Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 23:01:38.071617 containerd[1563]: time="2025-09-12T23:01:38.071455003Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 23:01:38.073809 containerd[1563]: time="2025-09-12T23:01:38.073731454Z" level=info msg="CreateContainer within sandbox \"943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:01:38.084066 containerd[1563]: time="2025-09-12T23:01:38.083063979Z" level=info msg="Container 7250ced510a96dd75a54653589204f4bb29c9e067ac5c703b1d44819394e94bf: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:38.099653 containerd[1563]: time="2025-09-12T23:01:38.099609810Z" level=info msg="CreateContainer within sandbox 
\"943a12f6eea4feaf8de36ff7fd9a6b54633384d16f7e7c32c74d37a69eef50e6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7250ced510a96dd75a54653589204f4bb29c9e067ac5c703b1d44819394e94bf\"" Sep 12 23:01:38.102640 containerd[1563]: time="2025-09-12T23:01:38.101118284Z" level=info msg="StartContainer for \"7250ced510a96dd75a54653589204f4bb29c9e067ac5c703b1d44819394e94bf\"" Sep 12 23:01:38.102957 containerd[1563]: time="2025-09-12T23:01:38.102916591Z" level=info msg="connecting to shim 7250ced510a96dd75a54653589204f4bb29c9e067ac5c703b1d44819394e94bf" address="unix:///run/containerd/s/bb334c066a8c530b4c0492fa4ac8742b665a14f41a68b5a6799f886d8e8b4dfb" protocol=ttrpc version=3 Sep 12 23:01:38.122278 systemd-networkd[1470]: califdb2641539c: Gained IPv6LL Sep 12 23:01:38.186287 systemd-networkd[1470]: calid95f34ac1fb: Gained IPv6LL Sep 12 23:01:38.213638 systemd[1]: Started cri-containerd-7250ced510a96dd75a54653589204f4bb29c9e067ac5c703b1d44819394e94bf.scope - libcontainer container 7250ced510a96dd75a54653589204f4bb29c9e067ac5c703b1d44819394e94bf. 
Sep 12 23:01:38.273300 containerd[1563]: time="2025-09-12T23:01:38.273214368Z" level=info msg="StartContainer for \"7250ced510a96dd75a54653589204f4bb29c9e067ac5c703b1d44819394e94bf\" returns successfully" Sep 12 23:01:38.547879 kubelet[2763]: I0912 23:01:38.547737 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7fcf7b468-4m2kd" podStartSLOduration=34.016731774 podStartE2EDuration="40.547715787s" podCreationTimestamp="2025-09-12 23:00:58 +0000 UTC" firstStartedPulling="2025-09-12 23:01:31.540027383 +0000 UTC m=+51.513181823" lastFinishedPulling="2025-09-12 23:01:38.071011396 +0000 UTC m=+58.044165836" observedRunningTime="2025-09-12 23:01:38.54725637 +0000 UTC m=+58.520410810" watchObservedRunningTime="2025-09-12 23:01:38.547715787 +0000 UTC m=+58.520870226" Sep 12 23:01:42.792692 containerd[1563]: time="2025-09-12T23:01:42.792620451Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:42.794106 containerd[1563]: time="2025-09-12T23:01:42.794075203Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 12 23:01:42.795919 containerd[1563]: time="2025-09-12T23:01:42.795406165Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:42.821345 containerd[1563]: time="2025-09-12T23:01:42.820790296Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:42.821516 containerd[1563]: time="2025-09-12T23:01:42.821474327Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id 
\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.749985149s" Sep 12 23:01:42.821568 containerd[1563]: time="2025-09-12T23:01:42.821521947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 12 23:01:42.833929 containerd[1563]: time="2025-09-12T23:01:42.833807756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 23:01:42.856871 containerd[1563]: time="2025-09-12T23:01:42.856798238Z" level=info msg="CreateContainer within sandbox \"9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 23:01:42.876238 containerd[1563]: time="2025-09-12T23:01:42.876171848Z" level=info msg="Container 2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:42.907686 containerd[1563]: time="2025-09-12T23:01:42.907615662Z" level=info msg="CreateContainer within sandbox \"9578779b42cbbffb390aaa491a3d1a3c5e7ec0d090037d38ed0dba329263887d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\"" Sep 12 23:01:42.908643 containerd[1563]: time="2025-09-12T23:01:42.908615951Z" level=info msg="StartContainer for \"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\"" Sep 12 23:01:42.910671 containerd[1563]: time="2025-09-12T23:01:42.910620952Z" level=info msg="connecting to shim 2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be" 
address="unix:///run/containerd/s/65176e43e9e5f9509be7c0ce341afe0cb63dc614aba7f2e567b9627e9130a241" protocol=ttrpc version=3 Sep 12 23:01:42.988329 systemd[1]: Started cri-containerd-2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be.scope - libcontainer container 2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be. Sep 12 23:01:43.068827 containerd[1563]: time="2025-09-12T23:01:43.068507058Z" level=info msg="StartContainer for \"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" returns successfully" Sep 12 23:01:43.425778 containerd[1563]: time="2025-09-12T23:01:43.425712470Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:43.429432 containerd[1563]: time="2025-09-12T23:01:43.429393259Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 23:01:43.430918 containerd[1563]: time="2025-09-12T23:01:43.430880857Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 597.017674ms" Sep 12 23:01:43.430918 containerd[1563]: time="2025-09-12T23:01:43.430916255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 12 23:01:43.434332 containerd[1563]: time="2025-09-12T23:01:43.434031266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 23:01:43.439229 containerd[1563]: time="2025-09-12T23:01:43.438845558Z" level=info msg="CreateContainer within sandbox 
\"a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 23:01:43.453831 containerd[1563]: time="2025-09-12T23:01:43.453785804Z" level=info msg="Container bbbd43688f92eb05971c7d4c3a4c667ae3242ec627293ceb961789422bcc50ce: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:43.470559 containerd[1563]: time="2025-09-12T23:01:43.470393001Z" level=info msg="CreateContainer within sandbox \"a01d8d39348cc7e4aa561a6e57f7df42a8d9e66f29cc423df7982b5169e9191e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bbbd43688f92eb05971c7d4c3a4c667ae3242ec627293ceb961789422bcc50ce\"" Sep 12 23:01:43.472266 containerd[1563]: time="2025-09-12T23:01:43.471350328Z" level=info msg="StartContainer for \"bbbd43688f92eb05971c7d4c3a4c667ae3242ec627293ceb961789422bcc50ce\"" Sep 12 23:01:43.473882 containerd[1563]: time="2025-09-12T23:01:43.473854479Z" level=info msg="connecting to shim bbbd43688f92eb05971c7d4c3a4c667ae3242ec627293ceb961789422bcc50ce" address="unix:///run/containerd/s/f7aa377d323b5e3add271e7104210a5baddc14b0df6f41cb31d2982fc6ae09a0" protocol=ttrpc version=3 Sep 12 23:01:43.507590 systemd[1]: Started cri-containerd-bbbd43688f92eb05971c7d4c3a4c667ae3242ec627293ceb961789422bcc50ce.scope - libcontainer container bbbd43688f92eb05971c7d4c3a4c667ae3242ec627293ceb961789422bcc50ce. 
Sep 12 23:01:43.736229 containerd[1563]: time="2025-09-12T23:01:43.736128606Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"54a444f3eb979024a3ce31dd4ecae0e79f47df43e5d6d39bfa41f41ca64c0c98\" pid:4979 exited_at:{seconds:1757718103 nanos:723339267}" Sep 12 23:01:43.747637 containerd[1563]: time="2025-09-12T23:01:43.746819026Z" level=info msg="StartContainer for \"bbbd43688f92eb05971c7d4c3a4c667ae3242ec627293ceb961789422bcc50ce\" returns successfully" Sep 12 23:01:43.782165 kubelet[2763]: I0912 23:01:43.782097 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-579768695b-96v9q" podStartSLOduration=31.595668459 podStartE2EDuration="42.782078696s" podCreationTimestamp="2025-09-12 23:01:01 +0000 UTC" firstStartedPulling="2025-09-12 23:01:31.63627737 +0000 UTC m=+51.609431810" lastFinishedPulling="2025-09-12 23:01:42.822687608 +0000 UTC m=+62.795842047" observedRunningTime="2025-09-12 23:01:43.589265955 +0000 UTC m=+63.562420394" watchObservedRunningTime="2025-09-12 23:01:43.782078696 +0000 UTC m=+63.755233155" Sep 12 23:01:43.857856 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1657869091.mount: Deactivated successfully. 
Sep 12 23:01:46.158190 kubelet[2763]: I0912 23:01:46.158135 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7fcf7b468-ml5pp" podStartSLOduration=38.374129101 podStartE2EDuration="48.158113177s" podCreationTimestamp="2025-09-12 23:00:58 +0000 UTC" firstStartedPulling="2025-09-12 23:01:33.649627392 +0000 UTC m=+53.622781831" lastFinishedPulling="2025-09-12 23:01:43.433611467 +0000 UTC m=+63.406765907" observedRunningTime="2025-09-12 23:01:44.593768623 +0000 UTC m=+64.566923063" watchObservedRunningTime="2025-09-12 23:01:46.158113177 +0000 UTC m=+66.131267616" Sep 12 23:01:48.070503 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount627906440.mount: Deactivated successfully. Sep 12 23:01:48.090954 containerd[1563]: time="2025-09-12T23:01:48.090907010Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:48.092417 containerd[1563]: time="2025-09-12T23:01:48.092393662Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 12 23:01:48.094314 containerd[1563]: time="2025-09-12T23:01:48.094287670Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:48.097538 containerd[1563]: time="2025-09-12T23:01:48.097502253Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:48.098318 containerd[1563]: time="2025-09-12T23:01:48.098295393Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 4.663968446s" Sep 12 23:01:48.098377 containerd[1563]: time="2025-09-12T23:01:48.098322428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 12 23:01:48.115310 containerd[1563]: time="2025-09-12T23:01:48.115273295Z" level=info msg="CreateContainer within sandbox \"f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 23:01:48.118947 containerd[1563]: time="2025-09-12T23:01:48.118845114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 23:01:48.163484 containerd[1563]: time="2025-09-12T23:01:48.163337043Z" level=info msg="Container 1113d4f9a7458dc79e7211f95dce443cce03b7af667f4132f4b0e86d315ff8d5: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:48.207113 containerd[1563]: time="2025-09-12T23:01:48.207068308Z" level=info msg="CreateContainer within sandbox \"f11ad6d1653fb48cb836353fb1b7420ffdcb985b31c0f349a03f9efe7a02be7a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1113d4f9a7458dc79e7211f95dce443cce03b7af667f4132f4b0e86d315ff8d5\"" Sep 12 23:01:48.207626 containerd[1563]: time="2025-09-12T23:01:48.207610849Z" level=info msg="StartContainer for \"1113d4f9a7458dc79e7211f95dce443cce03b7af667f4132f4b0e86d315ff8d5\"" Sep 12 23:01:48.209660 containerd[1563]: time="2025-09-12T23:01:48.209624895Z" level=info msg="connecting to shim 1113d4f9a7458dc79e7211f95dce443cce03b7af667f4132f4b0e86d315ff8d5" address="unix:///run/containerd/s/7b575fe761fad43d645cb68dac69e2df1a5bf39cca2d582326ad9a50b8f79e54" protocol=ttrpc version=3 Sep 12 23:01:48.253226 systemd[1]: Started 
cri-containerd-1113d4f9a7458dc79e7211f95dce443cce03b7af667f4132f4b0e86d315ff8d5.scope - libcontainer container 1113d4f9a7458dc79e7211f95dce443cce03b7af667f4132f4b0e86d315ff8d5. Sep 12 23:01:48.339749 containerd[1563]: time="2025-09-12T23:01:48.339703330Z" level=info msg="StartContainer for \"1113d4f9a7458dc79e7211f95dce443cce03b7af667f4132f4b0e86d315ff8d5\" returns successfully" Sep 12 23:01:49.756457 containerd[1563]: time="2025-09-12T23:01:49.756397251Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:49.758092 containerd[1563]: time="2025-09-12T23:01:49.758066076Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 12 23:01:49.759675 containerd[1563]: time="2025-09-12T23:01:49.759637701Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:49.763703 containerd[1563]: time="2025-09-12T23:01:49.763654810Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:49.765942 containerd[1563]: time="2025-09-12T23:01:49.765730616Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.646848404s" Sep 12 23:01:49.765942 containerd[1563]: time="2025-09-12T23:01:49.765771263Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference 
\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 12 23:01:49.767084 containerd[1563]: time="2025-09-12T23:01:49.767059071Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 23:01:49.773018 containerd[1563]: time="2025-09-12T23:01:49.772978763Z" level=info msg="CreateContainer within sandbox \"a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 23:01:49.809554 containerd[1563]: time="2025-09-12T23:01:49.809503457Z" level=info msg="Container 28b5531b08cbe8466e6876f47a8bb7e82f266e75e5bd7c187edbe5ae449cf25d: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:49.873906 containerd[1563]: time="2025-09-12T23:01:49.873851363Z" level=info msg="CreateContainer within sandbox \"a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"28b5531b08cbe8466e6876f47a8bb7e82f266e75e5bd7c187edbe5ae449cf25d\"" Sep 12 23:01:49.875530 containerd[1563]: time="2025-09-12T23:01:49.875506396Z" level=info msg="StartContainer for \"28b5531b08cbe8466e6876f47a8bb7e82f266e75e5bd7c187edbe5ae449cf25d\"" Sep 12 23:01:49.877149 containerd[1563]: time="2025-09-12T23:01:49.877093868Z" level=info msg="connecting to shim 28b5531b08cbe8466e6876f47a8bb7e82f266e75e5bd7c187edbe5ae449cf25d" address="unix:///run/containerd/s/e53e3484b1debcd83d4b428ccaec82ac25548c87f1fcf18966dc4bcb7a58c55e" protocol=ttrpc version=3 Sep 12 23:01:49.909596 systemd[1]: Started cri-containerd-28b5531b08cbe8466e6876f47a8bb7e82f266e75e5bd7c187edbe5ae449cf25d.scope - libcontainer container 28b5531b08cbe8466e6876f47a8bb7e82f266e75e5bd7c187edbe5ae449cf25d. 
Sep 12 23:01:50.103087 containerd[1563]: time="2025-09-12T23:01:50.102841767Z" level=info msg="StartContainer for \"28b5531b08cbe8466e6876f47a8bb7e82f266e75e5bd7c187edbe5ae449cf25d\" returns successfully" Sep 12 23:01:50.375815 containerd[1563]: time="2025-09-12T23:01:50.375425775Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"1871f7a08a91d50e0d5c62a36106a2e28722950919049aabcf99c71cfcd14af5\" pid:5109 exited_at:{seconds:1757718110 nanos:373336406}" Sep 12 23:01:52.172314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1423711224.mount: Deactivated successfully. Sep 12 23:01:52.884561 containerd[1563]: time="2025-09-12T23:01:52.884491806Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:52.886406 containerd[1563]: time="2025-09-12T23:01:52.886369797Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 12 23:01:52.891202 containerd[1563]: time="2025-09-12T23:01:52.891157998Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:52.894146 containerd[1563]: time="2025-09-12T23:01:52.894102538Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:52.895451 containerd[1563]: time="2025-09-12T23:01:52.895422692Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.128336235s" Sep 12 23:01:52.896427 containerd[1563]: time="2025-09-12T23:01:52.895454627Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 12 23:01:52.907118 containerd[1563]: time="2025-09-12T23:01:52.907070065Z" level=info msg="CreateContainer within sandbox \"967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 23:01:52.909008 containerd[1563]: time="2025-09-12T23:01:52.908975531Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 23:01:52.935236 containerd[1563]: time="2025-09-12T23:01:52.935173301Z" level=info msg="Container bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:52.972485 containerd[1563]: time="2025-09-12T23:01:52.972430419Z" level=info msg="CreateContainer within sandbox \"967216eac2545a88576c6d78e62e196ec28ca455d267e7fd053bcf4923e6fcd8\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\"" Sep 12 23:01:52.974388 containerd[1563]: time="2025-09-12T23:01:52.973238292Z" level=info msg="StartContainer for \"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\"" Sep 12 23:01:52.974972 containerd[1563]: time="2025-09-12T23:01:52.974532906Z" level=info msg="connecting to shim bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619" address="unix:///run/containerd/s/6be6cd454c560ebe28f2d92d6cdc9890301981c3bbaf90cfd59cb2776e93c995" protocol=ttrpc version=3 Sep 12 23:01:53.059202 systemd[1]: Started cri-containerd-bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619.scope - libcontainer 
container bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619. Sep 12 23:01:53.217434 containerd[1563]: time="2025-09-12T23:01:53.217304214Z" level=info msg="StartContainer for \"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" returns successfully" Sep 12 23:01:53.779532 kubelet[2763]: I0912 23:01:53.772732 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-75bf848f8-f6754" podStartSLOduration=7.389855532 podStartE2EDuration="24.760998488s" podCreationTimestamp="2025-09-12 23:01:29 +0000 UTC" firstStartedPulling="2025-09-12 23:01:30.734503422 +0000 UTC m=+50.707657861" lastFinishedPulling="2025-09-12 23:01:48.105646377 +0000 UTC m=+68.078800817" observedRunningTime="2025-09-12 23:01:48.669768041 +0000 UTC m=+68.642922511" watchObservedRunningTime="2025-09-12 23:01:53.760998488 +0000 UTC m=+73.734152928" Sep 12 23:01:53.781382 kubelet[2763]: I0912 23:01:53.779974 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-zwkck" podStartSLOduration=37.681702962 podStartE2EDuration="53.779957035s" podCreationTimestamp="2025-09-12 23:01:00 +0000 UTC" firstStartedPulling="2025-09-12 23:01:36.798522651 +0000 UTC m=+56.771677090" lastFinishedPulling="2025-09-12 23:01:52.896776724 +0000 UTC m=+72.869931163" observedRunningTime="2025-09-12 23:01:53.718476752 +0000 UTC m=+73.691631211" watchObservedRunningTime="2025-09-12 23:01:53.779957035 +0000 UTC m=+73.753111474" Sep 12 23:01:53.900440 containerd[1563]: time="2025-09-12T23:01:53.900403318Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"34b7321ad644838cdcc4db65fa664c9c12ba6a47e438a6be4eb0c50b4bf98ff1\" pid:5184 exit_status:1 exited_at:{seconds:1757718113 nanos:899285421}" Sep 12 23:01:54.815800 containerd[1563]: time="2025-09-12T23:01:54.815713577Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"e99e1c8e796a64d5c6bc3f52dea9a753638cd6d524d0ab92e9955431393aea45\" pid:5206 exit_status:1 exited_at:{seconds:1757718114 nanos:814324987}" Sep 12 23:01:55.914232 containerd[1563]: time="2025-09-12T23:01:55.914168967Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"2b2f9fb80ff55ef7a9c5978b21d9d9cc51c64bb2a4242eaad2931eddb143b192\" pid:5230 exit_status:1 exited_at:{seconds:1757718115 nanos:911855087}" Sep 12 23:01:56.299429 containerd[1563]: time="2025-09-12T23:01:56.297686896Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:56.299429 containerd[1563]: time="2025-09-12T23:01:56.299147461Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 12 23:01:56.301699 containerd[1563]: time="2025-09-12T23:01:56.300858702Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:56.303550 containerd[1563]: time="2025-09-12T23:01:56.303302675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 23:01:56.306137 containerd[1563]: time="2025-09-12T23:01:56.303617629Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 3.394603218s" Sep 12 23:01:56.306137 containerd[1563]: time="2025-09-12T23:01:56.305126193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 12 23:01:56.342012 containerd[1563]: time="2025-09-12T23:01:56.341978410Z" level=info msg="CreateContainer within sandbox \"a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 23:01:56.363865 containerd[1563]: time="2025-09-12T23:01:56.360936825Z" level=info msg="Container 766589f9d6255ed11bf4a5f390f64cb82ed2380711ac205608a87b26f4ef650c: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:01:56.368136 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3251917654.mount: Deactivated successfully. 
Sep 12 23:01:56.379360 containerd[1563]: time="2025-09-12T23:01:56.379307940Z" level=info msg="CreateContainer within sandbox \"a1fe6e6f964c495546670d6d9e07c7b5843e37eea39ddd7221c5cdcc0a638524\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"766589f9d6255ed11bf4a5f390f64cb82ed2380711ac205608a87b26f4ef650c\"" Sep 12 23:01:56.380256 containerd[1563]: time="2025-09-12T23:01:56.379997916Z" level=info msg="StartContainer for \"766589f9d6255ed11bf4a5f390f64cb82ed2380711ac205608a87b26f4ef650c\"" Sep 12 23:01:56.387530 containerd[1563]: time="2025-09-12T23:01:56.387477305Z" level=info msg="connecting to shim 766589f9d6255ed11bf4a5f390f64cb82ed2380711ac205608a87b26f4ef650c" address="unix:///run/containerd/s/e53e3484b1debcd83d4b428ccaec82ac25548c87f1fcf18966dc4bcb7a58c55e" protocol=ttrpc version=3 Sep 12 23:01:56.424905 systemd[1]: Started cri-containerd-766589f9d6255ed11bf4a5f390f64cb82ed2380711ac205608a87b26f4ef650c.scope - libcontainer container 766589f9d6255ed11bf4a5f390f64cb82ed2380711ac205608a87b26f4ef650c. 
Sep 12 23:01:56.636439 containerd[1563]: time="2025-09-12T23:01:56.636306728Z" level=info msg="StartContainer for \"766589f9d6255ed11bf4a5f390f64cb82ed2380711ac205608a87b26f4ef650c\" returns successfully" Sep 12 23:01:56.739238 kubelet[2763]: I0912 23:01:56.736187 2763 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-mbsv9" podStartSLOduration=37.052869917 podStartE2EDuration="56.732423327s" podCreationTimestamp="2025-09-12 23:01:00 +0000 UTC" firstStartedPulling="2025-09-12 23:01:36.632712754 +0000 UTC m=+56.605867193" lastFinishedPulling="2025-09-12 23:01:56.312266164 +0000 UTC m=+76.285420603" observedRunningTime="2025-09-12 23:01:56.731631118 +0000 UTC m=+76.704785567" watchObservedRunningTime="2025-09-12 23:01:56.732423327 +0000 UTC m=+76.705577776" Sep 12 23:01:57.573378 kubelet[2763]: I0912 23:01:57.565711 2763 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 23:01:57.575777 kubelet[2763]: I0912 23:01:57.575758 2763 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 23:02:07.530031 containerd[1563]: time="2025-09-12T23:02:07.529978987Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" id:\"c585d17860401583e0cd017227d5a4c2a648c6cc4db30659eece29ae758aca43\" pid:5296 exited_at:{seconds:1757718127 nanos:529206612}" Sep 12 23:02:20.454824 containerd[1563]: time="2025-09-12T23:02:20.454759744Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"d1ae49781835563060db3a85f151216fa37302f941f3a1c24867a904b8d06c61\" pid:5342 exited_at:{seconds:1757718140 nanos:436484214}" Sep 12 23:02:20.498430 containerd[1563]: 
time="2025-09-12T23:02:20.498388126Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"00cfd67569a4faf3e91af23e40e6738a8ac254de5fa81cdb10bf0761f3130e43\" pid:5344 exited_at:{seconds:1757718140 nanos:497873208}" Sep 12 23:02:22.526665 systemd[1]: Started sshd@7-46.62.198.104:22-139.178.89.65:45906.service - OpenSSH per-connection server daemon (139.178.89.65:45906). Sep 12 23:02:23.604786 sshd[5381]: Accepted publickey for core from 139.178.89.65 port 45906 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:23.610461 sshd-session[5381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:23.620961 systemd-logind[1533]: New session 8 of user core. Sep 12 23:02:23.629288 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 23:02:24.957918 containerd[1563]: time="2025-09-12T23:02:24.957796855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"4d27d342b10b1b6683e1e50845e2834ad117b136693ad40d70f4868bd96ec01f\" pid:5412 exited_at:{seconds:1757718144 nanos:957516989}" Sep 12 23:02:25.107198 sshd[5384]: Connection closed by 139.178.89.65 port 45906 Sep 12 23:02:25.109254 sshd-session[5381]: pam_unix(sshd:session): session closed for user core Sep 12 23:02:25.117627 systemd[1]: sshd@7-46.62.198.104:22-139.178.89.65:45906.service: Deactivated successfully. Sep 12 23:02:25.120101 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 23:02:25.121673 systemd-logind[1533]: Session 8 logged out. Waiting for processes to exit. Sep 12 23:02:25.123635 systemd-logind[1533]: Removed session 8. Sep 12 23:02:30.315421 systemd[1]: Started sshd@8-46.62.198.104:22-139.178.89.65:40388.service - OpenSSH per-connection server daemon (139.178.89.65:40388). 
Sep 12 23:02:31.451126 sshd[5426]: Accepted publickey for core from 139.178.89.65 port 40388 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:31.454220 sshd-session[5426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:31.468298 systemd-logind[1533]: New session 9 of user core. Sep 12 23:02:31.476340 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 23:02:32.529537 sshd[5429]: Connection closed by 139.178.89.65 port 40388 Sep 12 23:02:32.531421 sshd-session[5426]: pam_unix(sshd:session): session closed for user core Sep 12 23:02:32.536488 systemd-logind[1533]: Session 9 logged out. Waiting for processes to exit. Sep 12 23:02:32.537381 systemd[1]: sshd@8-46.62.198.104:22-139.178.89.65:40388.service: Deactivated successfully. Sep 12 23:02:32.540829 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 23:02:32.543229 systemd-logind[1533]: Removed session 9. Sep 12 23:02:32.697821 systemd[1]: Started sshd@9-46.62.198.104:22-139.178.89.65:40404.service - OpenSSH per-connection server daemon (139.178.89.65:40404). Sep 12 23:02:33.703203 sshd[5442]: Accepted publickey for core from 139.178.89.65 port 40404 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:33.706404 sshd-session[5442]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:33.717166 systemd-logind[1533]: New session 10 of user core. Sep 12 23:02:33.722338 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 23:02:34.568685 sshd[5445]: Connection closed by 139.178.89.65 port 40404 Sep 12 23:02:34.570176 sshd-session[5442]: pam_unix(sshd:session): session closed for user core Sep 12 23:02:34.577427 systemd-logind[1533]: Session 10 logged out. Waiting for processes to exit. Sep 12 23:02:34.577570 systemd[1]: sshd@9-46.62.198.104:22-139.178.89.65:40404.service: Deactivated successfully. 
Sep 12 23:02:34.581840 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 23:02:34.585033 systemd-logind[1533]: Removed session 10. Sep 12 23:02:34.741010 systemd[1]: Started sshd@10-46.62.198.104:22-139.178.89.65:40412.service - OpenSSH per-connection server daemon (139.178.89.65:40412). Sep 12 23:02:35.756909 sshd[5455]: Accepted publickey for core from 139.178.89.65 port 40412 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:35.758864 sshd-session[5455]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:35.765986 systemd-logind[1533]: New session 11 of user core. Sep 12 23:02:35.776301 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 12 23:02:36.514580 sshd[5458]: Connection closed by 139.178.89.65 port 40412 Sep 12 23:02:36.516309 sshd-session[5455]: pam_unix(sshd:session): session closed for user core Sep 12 23:02:36.522211 systemd[1]: sshd@10-46.62.198.104:22-139.178.89.65:40412.service: Deactivated successfully. Sep 12 23:02:36.526274 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 23:02:36.527814 systemd-logind[1533]: Session 11 logged out. Waiting for processes to exit. Sep 12 23:02:36.530942 systemd-logind[1533]: Removed session 11. 
Sep 12 23:02:37.594106 containerd[1563]: time="2025-09-12T23:02:37.593332587Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" id:\"bd1a2218cf3ebe040f0b2fd8d14b69a27fdb805da420a4ff0ef0a5403d529b59\" pid:5487 exit_status:1 exited_at:{seconds:1757718157 nanos:570678826}" Sep 12 23:02:40.461885 containerd[1563]: time="2025-09-12T23:02:40.461617146Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"0ac6040ad0075300e3232644a293ed9226a27add59f4326e5ddcabffa0930d27\" pid:5516 exited_at:{seconds:1757718160 nanos:461130744}" Sep 12 23:02:41.725406 systemd[1]: Started sshd@11-46.62.198.104:22-139.178.89.65:37830.service - OpenSSH per-connection server daemon (139.178.89.65:37830). Sep 12 23:02:42.929909 sshd[5527]: Accepted publickey for core from 139.178.89.65 port 37830 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:42.932786 sshd-session[5527]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:42.938352 systemd-logind[1533]: New session 12 of user core. Sep 12 23:02:42.946286 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 23:02:43.859845 sshd[5530]: Connection closed by 139.178.89.65 port 37830 Sep 12 23:02:43.862767 sshd-session[5527]: pam_unix(sshd:session): session closed for user core Sep 12 23:02:43.870014 systemd[1]: sshd@11-46.62.198.104:22-139.178.89.65:37830.service: Deactivated successfully. Sep 12 23:02:43.873923 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 23:02:43.878492 systemd-logind[1533]: Session 12 logged out. Waiting for processes to exit. Sep 12 23:02:43.881159 systemd-logind[1533]: Removed session 12. Sep 12 23:02:49.018162 systemd[1]: Started sshd@12-46.62.198.104:22-139.178.89.65:37842.service - OpenSSH per-connection server daemon (139.178.89.65:37842). 
Sep 12 23:02:50.043796 sshd[5544]: Accepted publickey for core from 139.178.89.65 port 37842 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:50.045069 sshd-session[5544]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:50.049724 systemd-logind[1533]: New session 13 of user core. Sep 12 23:02:50.056217 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 23:02:50.460278 containerd[1563]: time="2025-09-12T23:02:50.460231268Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"83faf8d0dda735db0b1f96377e490bf05b0f6da7de31935e5557a450ed0a85f9\" pid:5562 exited_at:{seconds:1757718170 nanos:459712151}" Sep 12 23:02:50.540868 containerd[1563]: time="2025-09-12T23:02:50.540822006Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"b67b6841ed6cb1e2dbb247221938aca90364e504f18f7452d12cf03fcf9c48dc\" pid:5579 exited_at:{seconds:1757718170 nanos:540327539}" Sep 12 23:02:50.939025 sshd[5547]: Connection closed by 139.178.89.65 port 37842 Sep 12 23:02:50.939511 sshd-session[5544]: pam_unix(sshd:session): session closed for user core Sep 12 23:02:50.946531 systemd[1]: sshd@12-46.62.198.104:22-139.178.89.65:37842.service: Deactivated successfully. Sep 12 23:02:50.953808 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 23:02:50.958308 systemd-logind[1533]: Session 13 logged out. Waiting for processes to exit. Sep 12 23:02:50.961561 systemd-logind[1533]: Removed session 13. Sep 12 23:02:56.153022 systemd[1]: Started sshd@13-46.62.198.104:22-139.178.89.65:46200.service - OpenSSH per-connection server daemon (139.178.89.65:46200). 
Sep 12 23:02:57.277462 sshd[5608]: Accepted publickey for core from 139.178.89.65 port 46200 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:57.278272 sshd-session[5608]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:57.284225 systemd-logind[1533]: New session 14 of user core. Sep 12 23:02:57.290226 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 23:02:58.108923 sshd[5613]: Connection closed by 139.178.89.65 port 46200 Sep 12 23:02:58.109533 sshd-session[5608]: pam_unix(sshd:session): session closed for user core Sep 12 23:02:58.113827 systemd-logind[1533]: Session 14 logged out. Waiting for processes to exit. Sep 12 23:02:58.115283 systemd[1]: sshd@13-46.62.198.104:22-139.178.89.65:46200.service: Deactivated successfully. Sep 12 23:02:58.117677 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 23:02:58.120141 systemd-logind[1533]: Removed session 14. Sep 12 23:02:58.258496 systemd[1]: Started sshd@14-46.62.198.104:22-139.178.89.65:46204.service - OpenSSH per-connection server daemon (139.178.89.65:46204). Sep 12 23:02:59.247036 sshd[5625]: Accepted publickey for core from 139.178.89.65 port 46204 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:02:59.251031 sshd-session[5625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:02:59.256978 systemd-logind[1533]: New session 15 of user core. Sep 12 23:02:59.263200 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 23:03:00.284884 sshd[5628]: Connection closed by 139.178.89.65 port 46204 Sep 12 23:03:00.292591 sshd-session[5625]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:00.309002 systemd-logind[1533]: Session 15 logged out. Waiting for processes to exit. Sep 12 23:03:00.310247 systemd[1]: sshd@14-46.62.198.104:22-139.178.89.65:46204.service: Deactivated successfully. 
Sep 12 23:03:00.315172 systemd[1]: session-15.scope: Deactivated successfully. Sep 12 23:03:00.319007 systemd-logind[1533]: Removed session 15. Sep 12 23:03:00.465627 systemd[1]: Started sshd@15-46.62.198.104:22-139.178.89.65:56648.service - OpenSSH per-connection server daemon (139.178.89.65:56648). Sep 12 23:03:01.510726 sshd[5638]: Accepted publickey for core from 139.178.89.65 port 56648 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:03:01.512792 sshd-session[5638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:01.521121 systemd-logind[1533]: New session 16 of user core. Sep 12 23:03:01.535343 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 23:03:04.355234 sshd[5648]: Connection closed by 139.178.89.65 port 56648 Sep 12 23:03:04.380128 sshd-session[5638]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:04.393984 systemd-logind[1533]: Session 16 logged out. Waiting for processes to exit. Sep 12 23:03:04.399398 systemd[1]: sshd@15-46.62.198.104:22-139.178.89.65:56648.service: Deactivated successfully. Sep 12 23:03:04.402525 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 23:03:04.402718 systemd[1]: session-16.scope: Consumed 716ms CPU time, 77.2M memory peak. Sep 12 23:03:04.409726 systemd-logind[1533]: Removed session 16. Sep 12 23:03:04.571468 systemd[1]: Started sshd@16-46.62.198.104:22-139.178.89.65:56662.service - OpenSSH per-connection server daemon (139.178.89.65:56662). Sep 12 23:03:05.719139 sshd[5665]: Accepted publickey for core from 139.178.89.65 port 56662 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:03:05.722717 sshd-session[5665]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:05.732168 systemd-logind[1533]: New session 17 of user core. Sep 12 23:03:05.744383 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 12 23:03:07.509063 sshd[5668]: Connection closed by 139.178.89.65 port 56662 Sep 12 23:03:07.510281 sshd-session[5665]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:07.513841 systemd-logind[1533]: Session 17 logged out. Waiting for processes to exit. Sep 12 23:03:07.513959 systemd[1]: sshd@16-46.62.198.104:22-139.178.89.65:56662.service: Deactivated successfully. Sep 12 23:03:07.516549 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 23:03:07.520803 systemd-logind[1533]: Removed session 17. Sep 12 23:03:07.666160 systemd[1]: Started sshd@17-46.62.198.104:22-139.178.89.65:56664.service - OpenSSH per-connection server daemon (139.178.89.65:56664). Sep 12 23:03:07.927138 containerd[1563]: time="2025-09-12T23:03:07.927055706Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" id:\"0c176ae99e8c797bb8f6bcc0a7fccbf6f6ff8296d8fa47a4185847519e2692a8\" pid:5687 exited_at:{seconds:1757718187 nanos:871546207}" Sep 12 23:03:08.731900 sshd[5700]: Accepted publickey for core from 139.178.89.65 port 56664 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:03:08.734996 sshd-session[5700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:08.744173 systemd-logind[1533]: New session 18 of user core. Sep 12 23:03:08.763469 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 23:03:09.794810 sshd[5718]: Connection closed by 139.178.89.65 port 56664 Sep 12 23:03:09.796025 sshd-session[5700]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:09.804440 systemd[1]: sshd@17-46.62.198.104:22-139.178.89.65:56664.service: Deactivated successfully. Sep 12 23:03:09.810565 systemd[1]: session-18.scope: Deactivated successfully. Sep 12 23:03:09.813291 systemd-logind[1533]: Session 18 logged out. Waiting for processes to exit. 
Sep 12 23:03:09.818274 systemd-logind[1533]: Removed session 18. Sep 12 23:03:15.003443 systemd[1]: Started sshd@18-46.62.198.104:22-139.178.89.65:49168.service - OpenSSH per-connection server daemon (139.178.89.65:49168). Sep 12 23:03:16.178347 sshd[5734]: Accepted publickey for core from 139.178.89.65 port 49168 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:03:16.186383 sshd-session[5734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:16.200152 systemd-logind[1533]: New session 19 of user core. Sep 12 23:03:16.207901 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 23:03:17.212810 sshd[5737]: Connection closed by 139.178.89.65 port 49168 Sep 12 23:03:17.213231 sshd-session[5734]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:17.218798 systemd[1]: sshd@18-46.62.198.104:22-139.178.89.65:49168.service: Deactivated successfully. Sep 12 23:03:17.221319 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 23:03:17.230137 systemd-logind[1533]: Session 19 logged out. Waiting for processes to exit. Sep 12 23:03:17.232514 systemd-logind[1533]: Removed session 19. 
Sep 12 23:03:20.407323 containerd[1563]: time="2025-09-12T23:03:20.407279943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"6b4b6a6122ec51eb00992468139e3c2e0bcd63de91c1f713885dfc85cff47183\" pid:5778 exited_at:{seconds:1757718200 nanos:406767329}" Sep 12 23:03:20.648644 containerd[1563]: time="2025-09-12T23:03:20.633025563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"886826897b3f35d73c9363752f524a9c1ba912646e0d72dd6b0ce2ca40b5e0db\" pid:5776 exited_at:{seconds:1757718200 nanos:632106475}" Sep 12 23:03:22.370803 systemd[1]: Started sshd@19-46.62.198.104:22-139.178.89.65:52296.service - OpenSSH per-connection server daemon (139.178.89.65:52296). Sep 12 23:03:23.478695 sshd[5799]: Accepted publickey for core from 139.178.89.65 port 52296 ssh2: RSA SHA256:G4xrRP1TAozVGhkAXgRrR6IrgcF+LJ5gHqFZC6r9aR0 Sep 12 23:03:23.482550 sshd-session[5799]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 23:03:23.494152 systemd-logind[1533]: New session 20 of user core. Sep 12 23:03:23.501329 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 23:03:24.948203 sshd[5802]: Connection closed by 139.178.89.65 port 52296 Sep 12 23:03:24.955342 sshd-session[5799]: pam_unix(sshd:session): session closed for user core Sep 12 23:03:24.966010 systemd[1]: sshd@19-46.62.198.104:22-139.178.89.65:52296.service: Deactivated successfully. Sep 12 23:03:24.969235 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 23:03:24.972878 systemd-logind[1533]: Session 20 logged out. Waiting for processes to exit. Sep 12 23:03:24.974258 systemd-logind[1533]: Removed session 20. 
Sep 12 23:03:25.012642 containerd[1563]: time="2025-09-12T23:03:25.012464863Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"e60300cda5ed7fb8da4e80d391ceea16b30c931a926b9a5bfaaa7b08ad6f7c16\" pid:5825 exited_at:{seconds:1757718205 nanos:11496171}" Sep 12 23:03:37.521490 containerd[1563]: time="2025-09-12T23:03:37.521356941Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45680f49249879c806842b3de0d54011ef913b5383b6f8e464742de0ba8bb95f\" id:\"1d607da0b8ea65453883284caebbab82b4a0f698d58ef62eeb838ff158260e8a\" pid:5845 exited_at:{seconds:1757718217 nanos:520792519}" Sep 12 23:03:40.443801 containerd[1563]: time="2025-09-12T23:03:40.443754953Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"80aaebe45597617493ae76f82ab1d9fd472f9d16c888b07daa1219e64cf04b38\" pid:5870 exited_at:{seconds:1757718220 nanos:443001852}" Sep 12 23:03:50.392750 containerd[1563]: time="2025-09-12T23:03:50.392437548Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b6dccac3fc072f9bcba6ae128f1309040d17d7785daee5e352c1c1bf15603be\" id:\"836452e2bf1ec400aa3432dc55a6da2dfb2ab36010fe321d236a6be543f58b5f\" pid:5898 exited_at:{seconds:1757718230 nanos:384316469}" Sep 12 23:03:50.447583 containerd[1563]: time="2025-09-12T23:03:50.447501883Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bcca66f978dfbeaf9bcb299887220fd0c932a22ed10918ff9ee31b91b2ea2619\" id:\"df92754fe200c4a0ce42c3bc9659861ce382cd60bc09e3e72736ec972aaf973c\" pid:5916 exited_at:{seconds:1757718230 nanos:446353879}" Sep 12 23:03:57.861153 systemd[1]: cri-containerd-a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b.scope: Deactivated successfully. 
Sep 12 23:03:57.861607 systemd[1]: cri-containerd-a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b.scope: Consumed 4.203s CPU time, 82.2M memory peak, 113.7M read from disk. Sep 12 23:03:58.072120 containerd[1563]: time="2025-09-12T23:03:58.069752211Z" level=info msg="received exit event container_id:\"a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b\" id:\"a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b\" pid:2596 exit_status:1 exited_at:{seconds:1757718237 nanos:988970265}" Sep 12 23:03:58.072120 containerd[1563]: time="2025-09-12T23:03:58.070036589Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b\" id:\"a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b\" pid:2596 exit_status:1 exited_at:{seconds:1757718237 nanos:988970265}" Sep 12 23:03:58.191489 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b-rootfs.mount: Deactivated successfully. Sep 12 23:03:58.302588 systemd[1]: cri-containerd-760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5.scope: Deactivated successfully. Sep 12 23:03:58.303076 systemd[1]: cri-containerd-760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5.scope: Consumed 1.321s CPU time, 37.7M memory peak, 65M read from disk. 
Sep 12 23:03:58.307984 containerd[1563]: time="2025-09-12T23:03:58.307791990Z" level=info msg="received exit event container_id:\"760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5\" id:\"760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5\" pid:2613 exit_status:1 exited_at:{seconds:1757718238 nanos:306830445}" Sep 12 23:03:58.308704 containerd[1563]: time="2025-09-12T23:03:58.308670613Z" level=info msg="TaskExit event in podsandbox handler container_id:\"760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5\" id:\"760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5\" pid:2613 exit_status:1 exited_at:{seconds:1757718238 nanos:306830445}" Sep 12 23:03:58.331259 kubelet[2763]: E0912 23:03:58.331082 2763 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38948->10.0.0.2:2379: read: connection timed out" Sep 12 23:03:58.335534 systemd[1]: cri-containerd-45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772.scope: Deactivated successfully. Sep 12 23:03:58.335840 systemd[1]: cri-containerd-45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772.scope: Consumed 14.677s CPU time, 120.1M memory peak, 77.2M read from disk. 
Sep 12 23:03:58.338036 containerd[1563]: time="2025-09-12T23:03:58.337952303Z" level=info msg="received exit event container_id:\"45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772\" id:\"45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772\" pid:3104 exit_status:1 exited_at:{seconds:1757718238 nanos:337422184}" Sep 12 23:03:58.338229 containerd[1563]: time="2025-09-12T23:03:58.338203707Z" level=info msg="TaskExit event in podsandbox handler container_id:\"45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772\" id:\"45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772\" pid:3104 exit_status:1 exited_at:{seconds:1757718238 nanos:337422184}" Sep 12 23:03:58.344025 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5-rootfs.mount: Deactivated successfully. Sep 12 23:03:58.368879 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772-rootfs.mount: Deactivated successfully. 
Sep 12 23:03:58.895635 kubelet[2763]: I0912 23:03:58.895579 2763 scope.go:117] "RemoveContainer" containerID="760cba2786db639865ff7c5705c139bd9fe5a104bfc7d20afc56c741316395c5" Sep 12 23:03:58.917979 kubelet[2763]: I0912 23:03:58.917843 2763 scope.go:117] "RemoveContainer" containerID="45c71b0de7a26d0901f613f10c164c1a8008ba4617213d0d484ede4e31d16772" Sep 12 23:03:58.938109 kubelet[2763]: I0912 23:03:58.938035 2763 scope.go:117] "RemoveContainer" containerID="a82b27b9ba7ff13222b831b01ce6a75dd06ebdae73e411e5b3cac35234a81c5b" Sep 12 23:03:58.979168 containerd[1563]: time="2025-09-12T23:03:58.979114488Z" level=info msg="CreateContainer within sandbox \"c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 12 23:03:58.979333 containerd[1563]: time="2025-09-12T23:03:58.979257889Z" level=info msg="CreateContainer within sandbox \"bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 12 23:03:58.999721 containerd[1563]: time="2025-09-12T23:03:58.979132243Z" level=info msg="CreateContainer within sandbox \"4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 12 23:03:59.121156 containerd[1563]: time="2025-09-12T23:03:59.121074723Z" level=info msg="Container 2a1af72453f51238e58feddb3b60dcaecdf0c323046da852095e416e5307cafe: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:03:59.122763 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3718520796.mount: Deactivated successfully. 
Sep 12 23:03:59.140064 containerd[1563]: time="2025-09-12T23:03:59.139813509Z" level=info msg="Container 42c9482744a95a468e6d400832dd1e555f1feaf80740723a6c9d739e40036d64: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:03:59.140064 containerd[1563]: time="2025-09-12T23:03:59.139842225Z" level=info msg="Container bfc5a2062a5cffbf7627b3a3076a46e690d9d878979cdb4dd8bdc32e13d3832b: CDI devices from CRI Config.CDIDevices: []" Sep 12 23:03:59.152521 containerd[1563]: time="2025-09-12T23:03:59.152256373Z" level=info msg="CreateContainer within sandbox \"bd429d61506f4dfb8ff1527a3ff3b0277cb69582a81e35773e2c0e9f30f3aa88\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"2a1af72453f51238e58feddb3b60dcaecdf0c323046da852095e416e5307cafe\"" Sep 12 23:03:59.156360 containerd[1563]: time="2025-09-12T23:03:59.156035421Z" level=info msg="CreateContainer within sandbox \"4b01d472e63e9a66e8201593e6e351e58a698279ae7e96cb46c438c22e7a1729\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"bfc5a2062a5cffbf7627b3a3076a46e690d9d878979cdb4dd8bdc32e13d3832b\"" Sep 12 23:03:59.158255 containerd[1563]: time="2025-09-12T23:03:59.158154565Z" level=info msg="CreateContainer within sandbox \"c82e6c4d3436eeacd6310ce0d6bd8c0265883acb43ada004d369590eff989e49\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"42c9482744a95a468e6d400832dd1e555f1feaf80740723a6c9d739e40036d64\"" Sep 12 23:03:59.158255 containerd[1563]: time="2025-09-12T23:03:59.158185045Z" level=info msg="StartContainer for \"bfc5a2062a5cffbf7627b3a3076a46e690d9d878979cdb4dd8bdc32e13d3832b\"" Sep 12 23:03:59.159519 containerd[1563]: time="2025-09-12T23:03:59.159037447Z" level=info msg="StartContainer for \"2a1af72453f51238e58feddb3b60dcaecdf0c323046da852095e416e5307cafe\"" Sep 12 23:03:59.166254 containerd[1563]: time="2025-09-12T23:03:59.166205438Z" level=info msg="connecting to shim 
bfc5a2062a5cffbf7627b3a3076a46e690d9d878979cdb4dd8bdc32e13d3832b" address="unix:///run/containerd/s/aca000276a9f8873857c49c201e16eb47224c1c103e0fcf7d84fb3f9c43a0442" protocol=ttrpc version=3 Sep 12 23:03:59.168084 containerd[1563]: time="2025-09-12T23:03:59.167333680Z" level=info msg="connecting to shim 2a1af72453f51238e58feddb3b60dcaecdf0c323046da852095e416e5307cafe" address="unix:///run/containerd/s/602b9e110756cc263263b6d42721e3ca3ddabe4e278c3ed97aed497669c883df" protocol=ttrpc version=3 Sep 12 23:03:59.184820 containerd[1563]: time="2025-09-12T23:03:59.182811443Z" level=info msg="StartContainer for \"42c9482744a95a468e6d400832dd1e555f1feaf80740723a6c9d739e40036d64\"" Sep 12 23:03:59.188974 containerd[1563]: time="2025-09-12T23:03:59.188939536Z" level=info msg="connecting to shim 42c9482744a95a468e6d400832dd1e555f1feaf80740723a6c9d739e40036d64" address="unix:///run/containerd/s/f957353839717f1a503b73ae4e621e8945afbade56f3648175cef0f197e38304" protocol=ttrpc version=3 Sep 12 23:03:59.261171 systemd[1]: Started cri-containerd-2a1af72453f51238e58feddb3b60dcaecdf0c323046da852095e416e5307cafe.scope - libcontainer container 2a1af72453f51238e58feddb3b60dcaecdf0c323046da852095e416e5307cafe. Sep 12 23:03:59.262032 systemd[1]: Started cri-containerd-42c9482744a95a468e6d400832dd1e555f1feaf80740723a6c9d739e40036d64.scope - libcontainer container 42c9482744a95a468e6d400832dd1e555f1feaf80740723a6c9d739e40036d64. Sep 12 23:03:59.262876 systemd[1]: Started cri-containerd-bfc5a2062a5cffbf7627b3a3076a46e690d9d878979cdb4dd8bdc32e13d3832b.scope - libcontainer container bfc5a2062a5cffbf7627b3a3076a46e690d9d878979cdb4dd8bdc32e13d3832b. 
Sep 12 23:03:59.355492 containerd[1563]: time="2025-09-12T23:03:59.355442942Z" level=info msg="StartContainer for \"2a1af72453f51238e58feddb3b60dcaecdf0c323046da852095e416e5307cafe\" returns successfully" Sep 12 23:03:59.403818 containerd[1563]: time="2025-09-12T23:03:59.403673963Z" level=info msg="StartContainer for \"42c9482744a95a468e6d400832dd1e555f1feaf80740723a6c9d739e40036d64\" returns successfully" Sep 12 23:03:59.413263 containerd[1563]: time="2025-09-12T23:03:59.413206552Z" level=info msg="StartContainer for \"bfc5a2062a5cffbf7627b3a3076a46e690d9d878979cdb4dd8bdc32e13d3832b\" returns successfully" Sep 12 23:04:02.281657 kubelet[2763]: E0912 23:04:02.264414 2763 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:38758->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-0-0-5-bd33a49fb7.1864ab6b4294889e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-0-0-5-bd33a49fb7,UID:4911ce518e34ace21e2d4f5064fa6e92,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-5-bd33a49fb7,},FirstTimestamp:2025-09-12 23:03:51.74821699 +0000 UTC m=+191.721371519,LastTimestamp:2025-09-12 23:03:51.74821699 +0000 UTC m=+191.721371519,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-5-bd33a49fb7,}"