Sep 9 05:31:47.782119 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 9 03:39:34 -00 2025
Sep 9 05:31:47.782145 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 05:31:47.782154 kernel: BIOS-provided physical RAM map:
Sep 9 05:31:47.782161 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 9 05:31:47.782166 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 9 05:31:47.782172 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 9 05:31:47.782181 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Sep 9 05:31:47.782187 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Sep 9 05:31:47.782193 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 9 05:31:47.782198 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 9 05:31:47.782205 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 9 05:31:47.782210 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 9 05:31:47.782216 kernel: NX (Execute Disable) protection: active
Sep 9 05:31:47.782223 kernel: APIC: Static calls initialized
Sep 9 05:31:47.782244 kernel: SMBIOS 2.8 present.
Sep 9 05:31:47.782251 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Sep 9 05:31:47.782257 kernel: DMI: Memory slots populated: 1/1
Sep 9 05:31:47.782263 kernel: Hypervisor detected: KVM
Sep 9 05:31:47.782270 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 9 05:31:47.782276 kernel: kvm-clock: using sched offset of 4201305810 cycles
Sep 9 05:31:47.782283 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 9 05:31:47.782290 kernel: tsc: Detected 2445.406 MHz processor
Sep 9 05:31:47.782299 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 9 05:31:47.782306 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 9 05:31:47.782312 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Sep 9 05:31:47.782319 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 9 05:31:47.782326 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 9 05:31:47.782332 kernel: Using GB pages for direct mapping
Sep 9 05:31:47.782339 kernel: ACPI: Early table checksum verification disabled
Sep 9 05:31:47.782345 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Sep 9 05:31:47.782352 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 05:31:47.782360 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 05:31:47.782366 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 05:31:47.782373 kernel: ACPI: FACS 0x000000007CFE0000 000040
Sep 9 05:31:47.782380 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 05:31:47.782386 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 05:31:47.782392 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 05:31:47.782399 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 05:31:47.782406 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Sep 9 05:31:47.782414 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Sep 9 05:31:47.782423 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Sep 9 05:31:47.782430 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Sep 9 05:31:47.782436 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Sep 9 05:31:47.782443 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Sep 9 05:31:47.782450 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Sep 9 05:31:47.782458 kernel: No NUMA configuration found
Sep 9 05:31:47.782465 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Sep 9 05:31:47.782472 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Sep 9 05:31:47.782479 kernel: Zone ranges:
Sep 9 05:31:47.782486 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 9 05:31:47.782492 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Sep 9 05:31:47.782499 kernel: Normal empty
Sep 9 05:31:47.782506 kernel: Device empty
Sep 9 05:31:47.782513 kernel: Movable zone start for each node
Sep 9 05:31:47.782521 kernel: Early memory node ranges
Sep 9 05:31:47.782527 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 9 05:31:47.782534 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Sep 9 05:31:47.782541 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Sep 9 05:31:47.782548 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 9 05:31:47.782555 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 9 05:31:47.782561 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 9 05:31:47.782569 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 9 05:31:47.782575 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 9 05:31:47.782582 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 9 05:31:47.782590 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 9 05:31:47.782597 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 9 05:31:47.782604 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 9 05:31:47.782611 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 9 05:31:47.782618 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 9 05:31:47.782624 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 9 05:31:47.782631 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 9 05:31:47.782638 kernel: CPU topo: Max. logical packages: 1
Sep 9 05:31:47.782645 kernel: CPU topo: Max. logical dies: 1
Sep 9 05:31:47.782653 kernel: CPU topo: Max. dies per package: 1
Sep 9 05:31:47.782659 kernel: CPU topo: Max. threads per core: 1
Sep 9 05:31:47.782666 kernel: CPU topo: Num. cores per package: 2
Sep 9 05:31:47.782673 kernel: CPU topo: Num. threads per package: 2
Sep 9 05:31:47.782680 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 9 05:31:47.782686 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 9 05:31:47.782693 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 9 05:31:47.782700 kernel: Booting paravirtualized kernel on KVM
Sep 9 05:31:47.782707 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 9 05:31:47.782714 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 9 05:31:47.782722 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 9 05:31:47.782729 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 9 05:31:47.782736 kernel: pcpu-alloc: [0] 0 1
Sep 9 05:31:47.782743 kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 9 05:31:47.782751 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104
Sep 9 05:31:47.782758 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 05:31:47.782765 kernel: random: crng init done
Sep 9 05:31:47.782772 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 05:31:47.782780 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 9 05:31:47.782787 kernel: Fallback order for Node 0: 0
Sep 9 05:31:47.782794 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Sep 9 05:31:47.782801 kernel: Policy zone: DMA32
Sep 9 05:31:47.782808 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 05:31:47.782815 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 9 05:31:47.782822 kernel: ftrace: allocating 40102 entries in 157 pages
Sep 9 05:31:47.782828 kernel: ftrace: allocated 157 pages with 5 groups
Sep 9 05:31:47.782835 kernel: Dynamic Preempt: voluntary
Sep 9 05:31:47.782843 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 05:31:47.782851 kernel: rcu: RCU event tracing is enabled.
Sep 9 05:31:47.782858 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 9 05:31:47.782865 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 05:31:47.782872 kernel: Rude variant of Tasks RCU enabled.
Sep 9 05:31:47.782879 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 05:31:47.782886 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 05:31:47.782893 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 9 05:31:47.782900 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:31:47.782908 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:31:47.782915 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 05:31:47.782922 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 9 05:31:47.782929 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 05:31:47.782935 kernel: Console: colour VGA+ 80x25
Sep 9 05:31:47.782942 kernel: printk: legacy console [tty0] enabled
Sep 9 05:31:47.782949 kernel: printk: legacy console [ttyS0] enabled
Sep 9 05:31:47.782956 kernel: ACPI: Core revision 20240827
Sep 9 05:31:47.782963 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 9 05:31:47.782975 kernel: APIC: Switch to symmetric I/O mode setup
Sep 9 05:31:47.782983 kernel: x2apic enabled
Sep 9 05:31:47.782990 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 9 05:31:47.782998 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 9 05:31:47.783006 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Sep 9 05:31:47.783013 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Sep 9 05:31:47.786152 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 9 05:31:47.786165 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 9 05:31:47.786174 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 9 05:31:47.786185 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 9 05:31:47.786192 kernel: Spectre V2 : Mitigation: Retpolines
Sep 9 05:31:47.786200 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 9 05:31:47.786207 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 9 05:31:47.786214 kernel: active return thunk: retbleed_return_thunk
Sep 9 05:31:47.786222 kernel: RETBleed: Mitigation: untrained return thunk
Sep 9 05:31:47.786246 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 9 05:31:47.786255 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 9 05:31:47.786262 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 9 05:31:47.786270 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 9 05:31:47.786277 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 9 05:31:47.786285 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 9 05:31:47.786292 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 9 05:31:47.786299 kernel: Freeing SMP alternatives memory: 32K
Sep 9 05:31:47.786307 kernel: pid_max: default: 32768 minimum: 301
Sep 9 05:31:47.786314 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 05:31:47.786323 kernel: landlock: Up and running.
Sep 9 05:31:47.786330 kernel: SELinux: Initializing.
Sep 9 05:31:47.786337 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 05:31:47.786345 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 9 05:31:47.786352 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 9 05:31:47.786359 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 9 05:31:47.786366 kernel: ... version: 0
Sep 9 05:31:47.786374 kernel: ... bit width: 48
Sep 9 05:31:47.786381 kernel: ... generic registers: 6
Sep 9 05:31:47.786390 kernel: ... value mask: 0000ffffffffffff
Sep 9 05:31:47.786402 kernel: ... max period: 00007fffffffffff
Sep 9 05:31:47.786414 kernel: ... fixed-purpose events: 0
Sep 9 05:31:47.786426 kernel: ... event mask: 000000000000003f
Sep 9 05:31:47.786437 kernel: signal: max sigframe size: 1776
Sep 9 05:31:47.786449 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 05:31:47.786461 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 05:31:47.786472 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 05:31:47.786484 kernel: smp: Bringing up secondary CPUs ...
Sep 9 05:31:47.786500 kernel: smpboot: x86: Booting SMP configuration:
Sep 9 05:31:47.786518 kernel: .... node #0, CPUs: #1
Sep 9 05:31:47.786538 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 05:31:47.786559 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Sep 9 05:31:47.786578 kernel: Memory: 1917788K/2047464K available (14336K kernel code, 2428K rwdata, 9988K rodata, 54076K init, 2892K bss, 125140K reserved, 0K cma-reserved)
Sep 9 05:31:47.786597 kernel: devtmpfs: initialized
Sep 9 05:31:47.786646 kernel: x86/mm: Memory block size: 128MB
Sep 9 05:31:47.786667 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 05:31:47.786685 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 9 05:31:47.786705 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 05:31:47.786722 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 05:31:47.786739 kernel: audit: initializing netlink subsys (disabled)
Sep 9 05:31:47.786756 kernel: audit: type=2000 audit(1757395904.653:1): state=initialized audit_enabled=0 res=1
Sep 9 05:31:47.786772 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 05:31:47.786789 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 9 05:31:47.786805 kernel: cpuidle: using governor menu
Sep 9 05:31:47.786821 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 05:31:47.786833 kernel: dca service started, version 1.12.1
Sep 9 05:31:47.786847 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 9 05:31:47.786859 kernel: PCI: Using configuration type 1 for base access
Sep 9 05:31:47.786871 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 9 05:31:47.786884 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 05:31:47.786895 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 05:31:47.786907 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 05:31:47.786920 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 05:31:47.786929 kernel: ACPI: Added _OSI(Module Device)
Sep 9 05:31:47.786937 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 05:31:47.786946 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 05:31:47.786953 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 05:31:47.786961 kernel: ACPI: Interpreter enabled
Sep 9 05:31:47.786968 kernel: ACPI: PM: (supports S0 S5)
Sep 9 05:31:47.786975 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 9 05:31:47.786982 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 9 05:31:47.786989 kernel: PCI: Using E820 reservations for host bridge windows
Sep 9 05:31:47.786997 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 9 05:31:47.787004 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 05:31:47.787150 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 05:31:47.787248 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 9 05:31:47.787323 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 9 05:31:47.787334 kernel: PCI host bridge to bus 0000:00
Sep 9 05:31:47.787410 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 9 05:31:47.787475 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 9 05:31:47.787543 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 9 05:31:47.787605 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Sep 9 05:31:47.788809 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 9 05:31:47.788897 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 9 05:31:47.788965 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 05:31:47.789093 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 9 05:31:47.789188 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 9 05:31:47.789290 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Sep 9 05:31:47.789365 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Sep 9 05:31:47.789437 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Sep 9 05:31:47.789507 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Sep 9 05:31:47.789579 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 9 05:31:47.789658 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.789731 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Sep 9 05:31:47.789807 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 9 05:31:47.791119 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 9 05:31:47.791207 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 9 05:31:47.791309 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.791386 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Sep 9 05:31:47.791459 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 9 05:31:47.791536 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 9 05:31:47.791606 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 9 05:31:47.791687 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.791761 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Sep 9 05:31:47.791832 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 9 05:31:47.791903 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 9 05:31:47.791974 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 9 05:31:47.796086 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.796166 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Sep 9 05:31:47.796241 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 9 05:31:47.796307 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 9 05:31:47.796365 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 9 05:31:47.796430 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.796489 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Sep 9 05:31:47.796545 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 9 05:31:47.796606 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 9 05:31:47.796662 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 9 05:31:47.796726 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.796784 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Sep 9 05:31:47.796840 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 9 05:31:47.796897 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 9 05:31:47.796953 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 9 05:31:47.797036 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.797102 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Sep 9 05:31:47.797161 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 9 05:31:47.797218 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 9 05:31:47.797289 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 9 05:31:47.797353 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.797415 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Sep 9 05:31:47.797472 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 9 05:31:47.797527 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 9 05:31:47.797583 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 9 05:31:47.797647 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 05:31:47.797705 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Sep 9 05:31:47.797761 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 9 05:31:47.797822 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 9 05:31:47.797879 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 9 05:31:47.797941 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 9 05:31:47.797999 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 9 05:31:47.799107 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 9 05:31:47.799180 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Sep 9 05:31:47.799253 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Sep 9 05:31:47.799326 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 9 05:31:47.799385 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 9 05:31:47.799452 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 9 05:31:47.799515 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Sep 9 05:31:47.799574 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 9 05:31:47.799634 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Sep 9 05:31:47.799691 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 9 05:31:47.799761 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 9 05:31:47.799821 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Sep 9 05:31:47.799879 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 9 05:31:47.799947 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 9 05:31:47.800008 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Sep 9 05:31:47.800649 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Sep 9 05:31:47.800721 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 9 05:31:47.800793 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 9 05:31:47.800855 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 9 05:31:47.800912 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 9 05:31:47.800977 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 9 05:31:47.801577 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Sep 9 05:31:47.801647 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 9 05:31:47.801721 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 9 05:31:47.801783 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Sep 9 05:31:47.801842 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Sep 9 05:31:47.801900 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 9 05:31:47.801908 kernel: acpiphp: Slot [0] registered
Sep 9 05:31:47.801974 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 9 05:31:47.803088 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Sep 9 05:31:47.803158 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Sep 9 05:31:47.803219 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Sep 9 05:31:47.803296 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 9 05:31:47.803306 kernel: acpiphp: Slot [0-2] registered
Sep 9 05:31:47.803362 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 9 05:31:47.803371 kernel: acpiphp: Slot [0-3] registered
Sep 9 05:31:47.803427 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 9 05:31:47.803439 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 9 05:31:47.803445 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 9 05:31:47.803451 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 9 05:31:47.803457 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 9 05:31:47.803462 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 9 05:31:47.803468 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 9 05:31:47.803474 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 9 05:31:47.803480 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 9 05:31:47.803486 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 9 05:31:47.803493 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 9 05:31:47.803499 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 9 05:31:47.803505 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 9 05:31:47.803511 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 9 05:31:47.803517 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 9 05:31:47.803523 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 9 05:31:47.803528 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 9 05:31:47.803534 kernel: iommu: Default domain type: Translated
Sep 9 05:31:47.803540 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 9 05:31:47.803547 kernel: PCI: Using ACPI for IRQ routing
Sep 9 05:31:47.803553 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 9 05:31:47.803559 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 9 05:31:47.803565 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Sep 9 05:31:47.803623 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 9 05:31:47.803679 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 9 05:31:47.803736 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 9 05:31:47.803745 kernel: vgaarb: loaded
Sep 9 05:31:47.803751 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 9 05:31:47.803759 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 9 05:31:47.803765 kernel: clocksource: Switched to clocksource kvm-clock
Sep 9 05:31:47.803771 kernel: VFS: Disk quotas dquot_6.6.0
Sep 9 05:31:47.803777 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 9 05:31:47.803783 kernel: pnp: PnP ACPI init
Sep 9 05:31:47.803849 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 9 05:31:47.803859 kernel: pnp: PnP ACPI: found 5 devices
Sep 9 05:31:47.803865 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 9 05:31:47.803873 kernel: NET: Registered PF_INET protocol family
Sep 9 05:31:47.803880 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 9 05:31:47.803886 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 9 05:31:47.803892 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 9 05:31:47.803898 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 9 05:31:47.803904 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 9 05:31:47.803910 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 9 05:31:47.803916 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 9 05:31:47.803922 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 9 05:31:47.803929 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 9 05:31:47.803935 kernel: NET: Registered PF_XDP protocol family
Sep 9 05:31:47.803993 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 9 05:31:47.804079 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 9 05:31:47.804142 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 9 05:31:47.804200 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Sep 9 05:31:47.804274 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Sep 9 05:31:47.804333 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Sep 9 05:31:47.804395 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 9 05:31:47.804463 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 9 05:31:47.804523 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 9 05:31:47.804580 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 9 05:31:47.804636 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 9 05:31:47.804692 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 9 05:31:47.804748 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 9 05:31:47.804805 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 9 05:31:47.804862 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 9 05:31:47.804923 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 9 05:31:47.804981 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 9 05:31:47.805063 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 9 05:31:47.805123 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 9 05:31:47.805179 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 9 05:31:47.805253 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 9 05:31:47.805315 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 9 05:31:47.805376 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 9 05:31:47.805433 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 9 05:31:47.805489 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 9 05:31:47.805546 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Sep 9 05:31:47.805603 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 9 05:31:47.805662 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 9 05:31:47.805718 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 9 05:31:47.805802 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Sep 9 05:31:47.805864 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 9 05:31:47.805921 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 9 05:31:47.805978 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 9 05:31:47.807059 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Sep 9 05:31:47.807124 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 9 05:31:47.807182 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 9 05:31:47.807252 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 9 05:31:47.807310 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 9 05:31:47.807366 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 9 05:31:47.807418 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Sep 9 05:31:47.807468 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 9 05:31:47.807518 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 9 05:31:47.807577 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 9 05:31:47.807631 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 9 05:31:47.807693 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 9 05:31:47.807746 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 9 05:31:47.807805 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 9 05:31:47.807858 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 9 05:31:47.807916 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 9 05:31:47.807969 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 9 05:31:47.809063 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 9 05:31:47.809145 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 9 05:31:47.809208 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 9 05:31:47.809274 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 9 05:31:47.809333 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Sep 9 05:31:47.809385 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 9 05:31:47.809441 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 9 05:31:47.809502 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Sep 9 05:31:47.809555 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Sep 9 05:31:47.809606 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 9 05:31:47.809664 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Sep 9 05:31:47.809717 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 9 05:31:47.809768 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 9 05:31:47.809780 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 9 05:31:47.809786 kernel: PCI: CLS 0 bytes, default 64
Sep 9 05:31:47.809793 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Sep 9 05:31:47.809800 kernel: Initialise system trusted keyrings
Sep 9 05:31:47.809806 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Sep 9 05:31:47.809812 kernel: Key type asymmetric registered
Sep 9 05:31:47.809818 kernel: Asymmetric key parser 'x509' registered
Sep 9 05:31:47.809824 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Sep 9 05:31:47.809831 kernel: io scheduler mq-deadline registered
Sep 9 05:31:47.809838 kernel: io scheduler kyber registered
Sep 9 05:31:47.809844 kernel: io scheduler bfq registered
Sep 9 05:31:47.809903 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Sep 9 05:31:47.809963 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Sep 9 05:31:47.810210 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Sep 9 05:31:47.810303 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Sep 9 05:31:47.810365 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Sep 9 05:31:47.810424 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Sep 9 05:31:47.810486 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Sep 9 05:31:47.810573 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Sep 9 05:31:47.810648 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Sep 9 05:31:47.810707 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Sep 9 05:31:47.810798 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Sep 9 05:31:47.810860 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Sep 9 05:31:47.811067 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Sep 9 05:31:47.811133 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Sep 9 05:31:47.811197 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Sep 9 05:31:47.811272 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Sep 9 05:31:47.811283 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Sep 9 05:31:47.811340 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Sep 9 05:31:47.811396 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Sep 9 05:31:47.811406 kernel: ioatdma: Intel(R)
QuickData Technology Driver 5.00 Sep 9 05:31:47.811415 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Sep 9 05:31:47.811422 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 05:31:47.811429 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 9 05:31:47.811436 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 9 05:31:47.811442 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 9 05:31:47.811448 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 9 05:31:47.811510 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 9 05:31:47.811520 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 9 05:31:47.813092 kernel: rtc_cmos 00:03: registered as rtc0 Sep 9 05:31:47.813159 kernel: rtc_cmos 00:03: setting system clock to 2025-09-09T05:31:47 UTC (1757395907) Sep 9 05:31:47.813214 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Sep 9 05:31:47.813224 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 9 05:31:47.813243 kernel: NET: Registered PF_INET6 protocol family Sep 9 05:31:47.813250 kernel: Segment Routing with IPv6 Sep 9 05:31:47.813256 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 05:31:47.813262 kernel: NET: Registered PF_PACKET protocol family Sep 9 05:31:47.813268 kernel: Key type dns_resolver registered Sep 9 05:31:47.813278 kernel: IPI shorthand broadcast: enabled Sep 9 05:31:47.813284 kernel: sched_clock: Marking stable (2872010037, 152946221)->(3029391121, -4434863) Sep 9 05:31:47.813291 kernel: registered taskstats version 1 Sep 9 05:31:47.813297 kernel: Loading compiled-in X.509 certificates Sep 9 05:31:47.813303 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 884b9ad6a330f59ae6e6488b20a5491e41ff24a3' Sep 9 05:31:47.813310 kernel: Demotion targets for Node 0: null Sep 9 05:31:47.813316 kernel: Key type .fscrypt registered Sep 9 05:31:47.813322 
kernel: Key type fscrypt-provisioning registered Sep 9 05:31:47.813328 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 9 05:31:47.813336 kernel: ima: Allocated hash algorithm: sha1 Sep 9 05:31:47.813342 kernel: ima: No architecture policies found Sep 9 05:31:47.813348 kernel: clk: Disabling unused clocks Sep 9 05:31:47.813354 kernel: Warning: unable to open an initial console. Sep 9 05:31:47.813361 kernel: Freeing unused kernel image (initmem) memory: 54076K Sep 9 05:31:47.813367 kernel: Write protecting the kernel read-only data: 24576k Sep 9 05:31:47.813373 kernel: Freeing unused kernel image (rodata/data gap) memory: 252K Sep 9 05:31:47.813380 kernel: Run /init as init process Sep 9 05:31:47.813387 kernel: with arguments: Sep 9 05:31:47.813394 kernel: /init Sep 9 05:31:47.813400 kernel: with environment: Sep 9 05:31:47.813406 kernel: HOME=/ Sep 9 05:31:47.813412 kernel: TERM=linux Sep 9 05:31:47.813418 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 05:31:47.813425 systemd[1]: Successfully made /usr/ read-only. Sep 9 05:31:47.813435 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 05:31:47.813443 systemd[1]: Detected virtualization kvm. Sep 9 05:31:47.813450 systemd[1]: Detected architecture x86-64. Sep 9 05:31:47.813456 systemd[1]: Running in initrd. Sep 9 05:31:47.813463 systemd[1]: No hostname configured, using default hostname. Sep 9 05:31:47.813469 systemd[1]: Hostname set to . Sep 9 05:31:47.813476 systemd[1]: Initializing machine ID from VM UUID. Sep 9 05:31:47.813483 systemd[1]: Queued start job for default target initrd.target. 
Sep 9 05:31:47.813489 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 9 05:31:47.813497 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 9 05:31:47.813505 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 9 05:31:47.813512 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 9 05:31:47.813519 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 9 05:31:47.813526 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 9 05:31:47.813534 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 9 05:31:47.813541 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 9 05:31:47.813549 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 9 05:31:47.813556 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 9 05:31:47.813562 systemd[1]: Reached target paths.target - Path Units. Sep 9 05:31:47.813569 systemd[1]: Reached target slices.target - Slice Units. Sep 9 05:31:47.813576 systemd[1]: Reached target swap.target - Swaps. Sep 9 05:31:47.813582 systemd[1]: Reached target timers.target - Timer Units. Sep 9 05:31:47.813589 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Sep 9 05:31:47.813596 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 9 05:31:47.813602 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 9 05:31:47.813610 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Sep 9 05:31:47.813617 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 9 05:31:47.813624 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 9 05:31:47.813631 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 9 05:31:47.813638 systemd[1]: Reached target sockets.target - Socket Units. Sep 9 05:31:47.813645 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 9 05:31:47.813651 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 9 05:31:47.813658 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 9 05:31:47.813666 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 9 05:31:47.813673 systemd[1]: Starting systemd-fsck-usr.service... Sep 9 05:31:47.813680 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 9 05:31:47.813687 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 9 05:31:47.813693 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:31:47.813700 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 9 05:31:47.813724 systemd-journald[216]: Collecting audit messages is disabled. Sep 9 05:31:47.813743 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 9 05:31:47.813751 systemd[1]: Finished systemd-fsck-usr.service. Sep 9 05:31:47.813758 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 9 05:31:47.813765 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Sep 9 05:31:47.813771 kernel: Bridge firewalling registered Sep 9 05:31:47.813778 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 9 05:31:47.813785 systemd-journald[216]: Journal started Sep 9 05:31:47.813802 systemd-journald[216]: Runtime Journal (/run/log/journal/4db4780dc9494d929b70172a8c6ef8bf) is 4.8M, max 38.6M, 33.7M free. Sep 9 05:31:47.787874 systemd-modules-load[217]: Inserted module 'overlay' Sep 9 05:31:47.850664 systemd[1]: Started systemd-journald.service - Journal Service. Sep 9 05:31:47.810740 systemd-modules-load[217]: Inserted module 'br_netfilter' Sep 9 05:31:47.851335 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:31:47.852399 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 9 05:31:47.854618 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 9 05:31:47.858139 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 9 05:31:47.861561 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 9 05:31:47.867104 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 9 05:31:47.871224 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 9 05:31:47.878167 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 9 05:31:47.879683 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 9 05:31:47.882255 systemd-tmpfiles[234]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Sep 9 05:31:47.884189 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 9 05:31:47.886204 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Sep 9 05:31:47.889114 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 9 05:31:47.903651 dracut-cmdline[253]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=107bc9be805328e5e30844239fa87d36579f371e3de2c34fec43f6ff6d17b104 Sep 9 05:31:47.932919 systemd-resolved[255]: Positive Trust Anchors: Sep 9 05:31:47.933567 systemd-resolved[255]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 9 05:31:47.933593 systemd-resolved[255]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 9 05:31:47.939765 systemd-resolved[255]: Defaulting to hostname 'linux'. Sep 9 05:31:47.940827 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 9 05:31:47.941748 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 9 05:31:47.965162 kernel: SCSI subsystem initialized Sep 9 05:31:47.974043 kernel: Loading iSCSI transport class v2.0-870. 
Sep 9 05:31:47.984041 kernel: iscsi: registered transport (tcp) Sep 9 05:31:48.001087 kernel: iscsi: registered transport (qla4xxx) Sep 9 05:31:48.001128 kernel: QLogic iSCSI HBA Driver Sep 9 05:31:48.015600 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 9 05:31:48.030192 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 9 05:31:48.032364 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 9 05:31:48.062161 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 9 05:31:48.063612 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 9 05:31:48.114057 kernel: raid6: avx2x4 gen() 30911 MB/s Sep 9 05:31:48.131044 kernel: raid6: avx2x2 gen() 36068 MB/s Sep 9 05:31:48.148145 kernel: raid6: avx2x1 gen() 25271 MB/s Sep 9 05:31:48.148184 kernel: raid6: using algorithm avx2x2 gen() 36068 MB/s Sep 9 05:31:48.166242 kernel: raid6: .... xor() 32447 MB/s, rmw enabled Sep 9 05:31:48.166281 kernel: raid6: using avx2x2 recovery algorithm Sep 9 05:31:48.183059 kernel: xor: automatically using best checksumming function avx Sep 9 05:31:48.293064 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 9 05:31:48.297288 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 9 05:31:48.299169 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 9 05:31:48.319983 systemd-udevd[465]: Using default interface naming scheme 'v255'. Sep 9 05:31:48.323970 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 9 05:31:48.326830 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Sep 9 05:31:48.345780 dracut-pre-trigger[475]: rd.md=0: removing MD RAID activation Sep 9 05:31:48.360734 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Sep 9 05:31:48.362843 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 9 05:31:48.412872 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 9 05:31:48.416191 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 9 05:31:48.494622 kernel: cryptd: max_cpu_qlen set to 1000 Sep 9 05:31:48.496058 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Sep 9 05:31:48.499066 kernel: scsi host0: Virtio SCSI HBA Sep 9 05:31:48.504044 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Sep 9 05:31:48.508065 kernel: ACPI: bus type USB registered Sep 9 05:31:48.508109 kernel: usbcore: registered new interface driver usbfs Sep 9 05:31:48.511626 kernel: usbcore: registered new interface driver hub Sep 9 05:31:48.511689 kernel: usbcore: registered new device driver usb Sep 9 05:31:48.550170 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:31:48.550956 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:31:48.553168 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:31:48.573100 kernel: AES CTR mode by8 optimization enabled Sep 9 05:31:48.573121 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 9 05:31:48.557311 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:31:48.576329 kernel: libata version 3.00 loaded. 
Sep 9 05:31:48.588917 kernel: sd 0:0:0:0: Power-on or device reset occurred Sep 9 05:31:48.592933 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Sep 9 05:31:48.593101 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 9 05:31:48.593193 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Sep 9 05:31:48.597056 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 9 05:31:48.605092 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 9 05:31:48.605148 kernel: GPT:17805311 != 80003071 Sep 9 05:31:48.605165 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 9 05:31:48.605176 kernel: GPT:17805311 != 80003071 Sep 9 05:31:48.605187 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 9 05:31:48.605201 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 05:31:48.607063 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 9 05:31:48.614041 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 9 05:31:48.614184 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Sep 9 05:31:48.614284 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 9 05:31:48.616062 kernel: ahci 0000:00:1f.2: version 3.0 Sep 9 05:31:48.616174 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 9 05:31:48.617044 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 9 05:31:48.617156 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 9 05:31:48.617259 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 9 05:31:48.620536 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 9 05:31:48.620647 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Sep 9 05:31:48.620727 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Sep 9 05:31:48.620825 kernel: hub 1-0:1.0: USB hub found Sep 9 05:31:48.620927 kernel: hub 1-0:1.0: 4 ports detected 
Sep 9 05:31:48.621003 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 9 05:31:48.621140 kernel: hub 2-0:1.0: USB hub found Sep 9 05:31:48.621237 kernel: hub 2-0:1.0: 4 ports detected Sep 9 05:31:48.621329 kernel: scsi host1: ahci Sep 9 05:31:48.621405 kernel: scsi host2: ahci Sep 9 05:31:48.626044 kernel: scsi host3: ahci Sep 9 05:31:48.626152 kernel: scsi host4: ahci Sep 9 05:31:48.627048 kernel: scsi host5: ahci Sep 9 05:31:48.627214 kernel: scsi host6: ahci Sep 9 05:31:48.627402 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46 lpm-pol 1 Sep 9 05:31:48.627421 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46 lpm-pol 1 Sep 9 05:31:48.627435 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 46 lpm-pol 1 Sep 9 05:31:48.627447 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46 lpm-pol 1 Sep 9 05:31:48.627460 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46 lpm-pol 1 Sep 9 05:31:48.627474 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46 lpm-pol 1 Sep 9 05:31:48.685325 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Sep 9 05:31:48.721529 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:31:48.730694 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Sep 9 05:31:48.737607 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Sep 9 05:31:48.738208 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Sep 9 05:31:48.747785 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 9 05:31:48.749560 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Sep 9 05:31:48.776154 disk-uuid[629]: Primary Header is updated. Sep 9 05:31:48.776154 disk-uuid[629]: Secondary Entries is updated. Sep 9 05:31:48.776154 disk-uuid[629]: Secondary Header is updated. Sep 9 05:31:48.787063 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 05:31:48.858059 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 9 05:31:48.939383 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 9 05:31:48.939452 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 9 05:31:48.939464 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 9 05:31:48.939474 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 9 05:31:48.939483 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 9 05:31:48.941044 kernel: ata1.00: LPM support broken, forcing max_power Sep 9 05:31:48.943431 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 9 05:31:48.943460 kernel: ata1.00: applying bridge limits Sep 9 05:31:48.945819 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 9 05:31:48.946043 kernel: ata1.00: LPM support broken, forcing max_power Sep 9 05:31:48.948271 kernel: ata1.00: configured for UDMA/100 Sep 9 05:31:48.949050 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 9 05:31:48.992080 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 9 05:31:48.992267 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 9 05:31:48.992281 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 05:31:48.998116 kernel: usbcore: registered new interface driver usbhid Sep 9 05:31:48.998152 kernel: usbhid: USB HID core driver Sep 9 05:31:49.001040 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Sep 9 05:31:49.001177 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Sep 9 05:31:49.004563 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on 
usb-0000:02:00.0-1/input0 Sep 9 05:31:49.322281 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 9 05:31:49.323381 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 9 05:31:49.324540 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 9 05:31:49.325826 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 9 05:31:49.328079 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 9 05:31:49.345991 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 9 05:31:49.807055 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 9 05:31:49.808289 disk-uuid[630]: The operation has completed successfully. Sep 9 05:31:49.850187 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 9 05:31:49.850289 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 9 05:31:49.882274 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 9 05:31:49.899613 sh[663]: Success Sep 9 05:31:49.916087 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 9 05:31:49.916154 kernel: device-mapper: uevent: version 1.0.3 Sep 9 05:31:49.917923 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 9 05:31:49.928050 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 9 05:31:49.971412 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 9 05:31:49.975233 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 9 05:31:49.986244 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Sep 9 05:31:49.997140 kernel: BTRFS: device fsid 9ca60a92-6b53-4529-adc0-1f4392d2ad56 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (675) Sep 9 05:31:50.001507 kernel: BTRFS info (device dm-0): first mount of filesystem 9ca60a92-6b53-4529-adc0-1f4392d2ad56 Sep 9 05:31:50.001540 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:31:50.011814 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 9 05:31:50.011848 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 9 05:31:50.011859 kernel: BTRFS info (device dm-0): enabling free space tree Sep 9 05:31:50.014809 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 9 05:31:50.015700 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 9 05:31:50.016546 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 9 05:31:50.017238 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 9 05:31:50.020120 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Sep 9 05:31:50.044150 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (708) Sep 9 05:31:50.044192 kernel: BTRFS info (device sda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:31:50.046374 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 9 05:31:50.052151 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 9 05:31:50.052180 kernel: BTRFS info (device sda6): turning on async discard Sep 9 05:31:50.053268 kernel: BTRFS info (device sda6): enabling free space tree Sep 9 05:31:50.059084 kernel: BTRFS info (device sda6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b Sep 9 05:31:50.059516 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Sep 9 05:31:50.060676 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 9 05:31:50.137332 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 9 05:31:50.141141 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 9 05:31:50.154794 ignition[772]: Ignition 2.22.0 Sep 9 05:31:50.154822 ignition[772]: Stage: fetch-offline Sep 9 05:31:50.154854 ignition[772]: no configs at "/usr/lib/ignition/base.d" Sep 9 05:31:50.156510 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Sep 9 05:31:50.154861 ignition[772]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 9 05:31:50.154922 ignition[772]: parsed url from cmdline: "" Sep 9 05:31:50.154925 ignition[772]: no config URL provided Sep 9 05:31:50.154928 ignition[772]: reading system config file "/usr/lib/ignition/user.ign" Sep 9 05:31:50.154933 ignition[772]: no config at "/usr/lib/ignition/user.ign" Sep 9 05:31:50.154937 ignition[772]: failed to fetch config: resource requires networking Sep 9 05:31:50.155058 ignition[772]: Ignition finished successfully Sep 9 05:31:50.180848 systemd-networkd[849]: lo: Link UP Sep 9 05:31:50.180859 systemd-networkd[849]: lo: Gained carrier Sep 9 05:31:50.182916 systemd-networkd[849]: Enumeration completed Sep 9 05:31:50.183096 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 9 05:31:50.183434 systemd-networkd[849]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 9 05:31:50.183439 systemd-networkd[849]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 9 05:31:50.184600 systemd-networkd[849]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 9 05:31:50.184604 systemd-networkd[849]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 05:31:50.185378 systemd[1]: Reached target network.target - Network.
Sep 9 05:31:50.185562 systemd-networkd[849]: eth0: Link UP
Sep 9 05:31:50.185748 systemd-networkd[849]: eth1: Link UP
Sep 9 05:31:50.185899 systemd-networkd[849]: eth0: Gained carrier
Sep 9 05:31:50.185907 systemd-networkd[849]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:31:50.189349 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 9 05:31:50.191119 systemd-networkd[849]: eth1: Gained carrier
Sep 9 05:31:50.191128 systemd-networkd[849]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:31:50.211870 ignition[853]: Ignition 2.22.0
Sep 9 05:31:50.211884 ignition[853]: Stage: fetch
Sep 9 05:31:50.212082 systemd-networkd[849]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 9 05:31:50.212320 ignition[853]: no configs at "/usr/lib/ignition/base.d"
Sep 9 05:31:50.212333 ignition[853]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 05:31:50.212413 ignition[853]: parsed url from cmdline: ""
Sep 9 05:31:50.212416 ignition[853]: no config URL provided
Sep 9 05:31:50.212420 ignition[853]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 05:31:50.212427 ignition[853]: no config at "/usr/lib/ignition/user.ign"
Sep 9 05:31:50.212450 ignition[853]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Sep 9 05:31:50.212569 ignition[853]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Sep 9 05:31:50.254094 systemd-networkd[849]: eth0: DHCPv4 address 65.109.237.121/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 9 05:31:50.412837 ignition[853]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Sep 9 05:31:50.420987 ignition[853]: GET result: OK
Sep 9 05:31:50.421129 ignition[853]: parsing config with SHA512: c98d5df744acc1ab32eeb9f6ae797307ec7c1399c1b2214e58814df460efc3c3712341f80013ae8250013b81dfdf6c96031fa425a2a42b721bba8f0d5b47746f
Sep 9 05:31:50.430502 unknown[853]: fetched base config from "system"
Sep 9 05:31:50.430987 ignition[853]: fetch: fetch complete
Sep 9 05:31:50.430518 unknown[853]: fetched base config from "system"
Sep 9 05:31:50.430994 ignition[853]: fetch: fetch passed
Sep 9 05:31:50.430526 unknown[853]: fetched user config from "hetzner"
Sep 9 05:31:50.433109 ignition[853]: Ignition finished successfully
Sep 9 05:31:50.436398 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 9 05:31:50.439280 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 05:31:50.484967 ignition[861]: Ignition 2.22.0
Sep 9 05:31:50.484981 ignition[861]: Stage: kargs
Sep 9 05:31:50.485134 ignition[861]: no configs at "/usr/lib/ignition/base.d"
Sep 9 05:31:50.485142 ignition[861]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 05:31:50.485887 ignition[861]: kargs: kargs passed
Sep 9 05:31:50.485931 ignition[861]: Ignition finished successfully
Sep 9 05:31:50.488287 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 05:31:50.490460 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 05:31:50.528168 ignition[868]: Ignition 2.22.0
Sep 9 05:31:50.528188 ignition[868]: Stage: disks
Sep 9 05:31:50.528405 ignition[868]: no configs at "/usr/lib/ignition/base.d"
Sep 9 05:31:50.532078 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 05:31:50.528419 ignition[868]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 05:31:50.533451 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
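The fetch stage above shows Ignition's usual pattern on a cloud platform: the first GET to the metadata service fails with "network is unreachable" because systemd-networkd has not yet configured the links, a retried attempt succeeds once DHCP completes, and the fetched config is identified by its SHA-512 digest before parsing. A minimal sketch of that retry-then-digest flow; the endpoint URL comes from the log, while the injectable `fetch` callable and the sample payload are illustrative stand-ins, not Ignition's actual implementation:

```python
import hashlib
import time

# Endpoint as it appears in the log; everything below is an illustrative sketch.
USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"

def fetch_userdata(fetch, url=USERDATA_URL, attempts=5, delay=0.01):
    """Retry a GET until the network comes up, mirroring the log's attempt #1/#2 lines."""
    last_err = None
    for attempt in range(1, attempts + 1):
        try:
            return fetch(url)
        except OSError as err:  # e.g. "connect: network is unreachable" on early attempts
            last_err = err
            time.sleep(delay)
    raise last_err

def config_digest(raw: bytes) -> str:
    """Compute the digest Ignition logs as 'parsing config with SHA512: ...'."""
    return hashlib.sha512(raw).hexdigest()
```

With a fake fetcher that fails once and then returns a payload, `fetch_userdata` returns on the second attempt, and `config_digest` yields the hex digest that would appear in the journal for that payload.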
Sep 9 05:31:50.529608 ignition[868]: disks: disks passed
Sep 9 05:31:50.534550 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 05:31:50.529663 ignition[868]: Ignition finished successfully
Sep 9 05:31:50.535831 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 05:31:50.537068 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 05:31:50.538156 systemd[1]: Reached target basic.target - Basic System.
Sep 9 05:31:50.540167 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 05:31:50.582947 systemd-fsck[877]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 9 05:31:50.588000 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 05:31:50.592106 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 05:31:50.736081 kernel: EXT4-fs (sda9): mounted filesystem d2d7815e-fa16-4396-ab9d-ac540c1d8856 r/w with ordered data mode. Quota mode: none.
Sep 9 05:31:50.736389 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 05:31:50.737377 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 05:31:50.739676 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 05:31:50.741572 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 05:31:50.749277 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 9 05:31:50.750679 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 05:31:50.750710 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 05:31:50.754486 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 05:31:50.758117 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 05:31:50.764056 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (885)
Sep 9 05:31:50.771342 kernel: BTRFS info (device sda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b
Sep 9 05:31:50.771380 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 9 05:31:50.781357 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 9 05:31:50.781405 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 05:31:50.784551 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 05:31:50.787753 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 05:31:50.817256 initrd-setup-root[913]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 05:31:50.819858 coreos-metadata[887]: Sep 09 05:31:50.819 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 9 05:31:50.821534 coreos-metadata[887]: Sep 09 05:31:50.820 INFO Fetch successful
Sep 9 05:31:50.821534 coreos-metadata[887]: Sep 09 05:31:50.820 INFO wrote hostname ci-4452-0-0-n-de00512edc to /sysroot/etc/hostname
Sep 9 05:31:50.824367 initrd-setup-root[920]: cut: /sysroot/etc/group: No such file or directory
Sep 9 05:31:50.825421 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 9 05:31:50.828493 initrd-setup-root[928]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 05:31:50.831618 initrd-setup-root[935]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 05:31:50.896065 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 05:31:50.897600 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 05:31:50.898483 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 05:31:50.919047 kernel: BTRFS info (device sda6): last unmount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b
Sep 9 05:31:50.930768 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 05:31:50.941637 ignition[1003]: INFO : Ignition 2.22.0
Sep 9 05:31:50.941637 ignition[1003]: INFO : Stage: mount
Sep 9 05:31:50.942907 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 05:31:50.942907 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 05:31:50.942907 ignition[1003]: INFO : mount: mount passed
Sep 9 05:31:50.942907 ignition[1003]: INFO : Ignition finished successfully
Sep 9 05:31:50.943290 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 05:31:50.946097 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 05:31:50.996120 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 05:31:50.997682 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 05:31:51.022050 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1015)
Sep 9 05:31:51.025244 kernel: BTRFS info (device sda6): first mount of filesystem d4e5a7a8-c50a-463e-827d-ca249a0b8b8b
Sep 9 05:31:51.025282 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 9 05:31:51.031439 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 9 05:31:51.031476 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 05:31:51.033926 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 05:31:51.035994 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 05:31:51.064118 ignition[1032]: INFO : Ignition 2.22.0
Sep 9 05:31:51.064118 ignition[1032]: INFO : Stage: files
Sep 9 05:31:51.065702 ignition[1032]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 05:31:51.065702 ignition[1032]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 05:31:51.065702 ignition[1032]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 05:31:51.067987 ignition[1032]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 05:31:51.067987 ignition[1032]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 05:31:51.070203 ignition[1032]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 05:31:51.071061 ignition[1032]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 05:31:51.071061 ignition[1032]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 05:31:51.070564 unknown[1032]: wrote ssh authorized keys file for user: core
Sep 9 05:31:51.073585 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 9 05:31:51.073585 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Sep 9 05:31:51.349536 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 05:31:51.615198 systemd-networkd[849]: eth0: Gained IPv6LL
Sep 9 05:31:51.807336 systemd-networkd[849]: eth1: Gained IPv6LL
Sep 9 05:31:52.701963 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Sep 9 05:31:52.701963 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 05:31:52.704695 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 05:31:52.714141 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 05:31:52.714141 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 05:31:52.714141 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Sep 9 05:31:53.265894 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 05:31:56.996732 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Sep 9 05:31:56.996732 ignition[1032]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 05:31:56.999169 ignition[1032]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 05:31:57.001401 ignition[1032]: INFO : files: files passed
Sep 9 05:31:57.001401 ignition[1032]: INFO : Ignition finished successfully
Sep 9 05:31:57.001642 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 05:31:57.007133 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 05:31:57.018187 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 05:31:57.021083 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 05:31:57.021209 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 05:31:57.033765 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 05:31:57.033765 initrd-setup-root-after-ignition[1062]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 05:31:57.035708 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 05:31:57.036917 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 05:31:57.038040 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 05:31:57.039703 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 05:31:57.098215 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 05:31:57.098312 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 05:31:57.099644 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 05:31:57.100879 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 05:31:57.102031 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 05:31:57.102634 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 05:31:57.124356 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 05:31:57.127069 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 05:31:57.143482 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 05:31:57.144365 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 05:31:57.145756 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 05:31:57.147169 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 05:31:57.147314 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 05:31:57.148845 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 05:31:57.149710 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 05:31:57.151184 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 05:31:57.152346 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 05:31:57.153515 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 05:31:57.154800 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 05:31:57.156184 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 05:31:57.157555 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 05:31:57.159053 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 05:31:57.160392 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 05:31:57.161717 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 05:31:57.162980 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 05:31:57.163140 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 05:31:57.164437 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:31:57.165284 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:31:57.166377 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 05:31:57.169522 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:31:57.170255 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 05:31:57.170387 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 05:31:57.176337 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 05:31:57.176484 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 05:31:57.177868 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 05:31:57.177998 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 05:31:57.179207 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 9 05:31:57.179336 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 9 05:31:57.183208 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 05:31:57.183945 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 05:31:57.184134 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 05:31:57.187808 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 05:31:57.188372 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 05:31:57.188516 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 05:31:57.189824 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 05:31:57.189955 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 05:31:57.199622 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 05:31:57.199714 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 05:31:57.214070 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 05:31:57.215869 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 05:31:57.216009 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 05:31:57.220877 ignition[1086]: INFO : Ignition 2.22.0
Sep 9 05:31:57.220877 ignition[1086]: INFO : Stage: umount
Sep 9 05:31:57.222045 ignition[1086]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 05:31:57.222045 ignition[1086]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 05:31:57.222045 ignition[1086]: INFO : umount: umount passed
Sep 9 05:31:57.222045 ignition[1086]: INFO : Ignition finished successfully
Sep 9 05:31:57.223263 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 05:31:57.223340 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 05:31:57.224245 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 05:31:57.224280 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 05:31:57.225090 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 05:31:57.225142 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 05:31:57.226033 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 05:31:57.226075 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 05:31:57.226982 systemd[1]: Stopped target network.target - Network.
Sep 9 05:31:57.227918 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 05:31:57.227952 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 05:31:57.228934 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 05:31:57.229949 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 05:31:57.234049 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:31:57.234915 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 05:31:57.235861 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 05:31:57.237004 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 05:31:57.237075 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 05:31:57.238247 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 05:31:57.238272 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 05:31:57.239252 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 05:31:57.239290 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 05:31:57.240343 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 05:31:57.240379 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 05:31:57.241415 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 05:31:57.241467 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 05:31:57.242486 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 05:31:57.243513 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 05:31:57.249126 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 05:31:57.249211 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 05:31:57.252599 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 05:31:57.252797 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 05:31:57.252826 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 05:31:57.254888 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 05:31:57.255066 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 05:31:57.255153 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 05:31:57.257233 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 05:31:57.257433 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 05:31:57.258264 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 05:31:57.258289 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:31:57.260216 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 05:31:57.261946 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 05:31:57.261985 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 05:31:57.264360 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 05:31:57.264395 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 05:31:57.265993 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 05:31:57.266039 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 05:31:57.269070 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 05:31:57.271086 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 05:31:57.274347 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 05:31:57.274475 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 05:31:57.275203 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 05:31:57.275247 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:31:57.276261 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 05:31:57.276284 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:31:57.276706 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 05:31:57.276753 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 05:31:57.278551 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 05:31:57.278583 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 05:31:57.279642 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 05:31:57.279676 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 05:31:57.282112 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 05:31:57.283037 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 05:31:57.283078 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 05:31:57.286109 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 05:31:57.286142 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 05:31:57.287593 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 05:31:57.287624 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 05:31:57.292530 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 05:31:57.293108 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 05:31:57.298242 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 05:31:57.298320 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 05:31:57.299553 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 05:31:57.301230 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 05:31:57.311838 systemd[1]: Switching root.
Sep 9 05:31:57.344430 systemd-journald[216]: Journal stopped
Sep 9 05:31:58.200587 systemd-journald[216]: Received SIGTERM from PID 1 (systemd).
Sep 9 05:31:58.200638 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 05:31:58.200657 kernel: SELinux: policy capability open_perms=1
Sep 9 05:31:58.200665 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 05:31:58.200676 kernel: SELinux: policy capability always_check_network=0
Sep 9 05:31:58.200686 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 05:31:58.200694 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 05:31:58.200702 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 05:31:58.200711 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 05:31:58.200718 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 05:31:58.200727 kernel: audit: type=1403 audit(1757395917.516:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 05:31:58.200736 systemd[1]: Successfully loaded SELinux policy in 56.692ms.
Sep 9 05:31:58.200747 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.071ms.
Sep 9 05:31:58.200756 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 05:31:58.200765 systemd[1]: Detected virtualization kvm.
Sep 9 05:31:58.200773 systemd[1]: Detected architecture x86-64.
Sep 9 05:31:58.200781 systemd[1]: Detected first boot.
Sep 9 05:31:58.200789 systemd[1]: Hostname set to .
Sep 9 05:31:58.200797 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 05:31:58.200806 zram_generator::config[1129]: No configuration found.
Sep 9 05:31:58.200815 kernel: Guest personality initialized and is inactive
Sep 9 05:31:58.200823 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 9 05:31:58.200830 kernel: Initialized host personality
Sep 9 05:31:58.200838 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 05:31:58.200846 systemd[1]: Populated /etc with preset unit settings.
Sep 9 05:31:58.200855 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 05:31:58.200863 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 05:31:58.200872 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 05:31:58.200882 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 05:31:58.200891 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 05:31:58.200901 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 05:31:58.200909 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 05:31:58.200918 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 05:31:58.200927 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 05:31:58.200936 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 05:31:58.200947 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 05:31:58.200956 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 05:31:58.200965 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 05:31:58.200973 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 05:31:58.200981 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 05:31:58.200991 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 05:31:58.201000 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 05:31:58.201009 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 05:31:58.204103 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 9 05:31:58.204133 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 05:31:58.204143 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 05:31:58.204152 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 05:31:58.204164 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 05:31:58.204173 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 05:31:58.204181 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 05:31:58.204189 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 05:31:58.204197 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 05:31:58.204206 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 05:31:58.204214 systemd[1]: Reached target swap.target - Swaps.
Sep 9 05:31:58.204222 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 05:31:58.204230 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 05:31:58.204239 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 05:31:58.204248 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 05:31:58.204257 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 05:31:58.204265 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 05:31:58.204273 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 05:31:58.204281 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 05:31:58.204290 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 05:31:58.204298 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 05:31:58.204307 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:58.204316 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 05:31:58.204325 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 05:31:58.204334 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 05:31:58.204343 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 05:31:58.204351 systemd[1]: Reached target machines.target - Containers.
Sep 9 05:31:58.204359 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 05:31:58.204368 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:31:58.204377 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 05:31:58.204385 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 05:31:58.204394 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 05:31:58.204403 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 05:31:58.204411 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 05:31:58.204419 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 05:31:58.204429 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 05:31:58.204437 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 05:31:58.204446 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 05:31:58.204454 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 05:31:58.204463 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 05:31:58.204471 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 05:31:58.204480 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:31:58.204488 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 05:31:58.204496 kernel: loop: module loaded
Sep 9 05:31:58.204504 kernel: fuse: init (API version 7.41)
Sep 9 05:31:58.204512 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 05:31:58.204521 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 05:31:58.204529 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 05:31:58.204539 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 05:31:58.204548 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 05:31:58.204557 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 05:31:58.204565 systemd[1]: Stopped verity-setup.service.
Sep 9 05:31:58.204573 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:58.204583 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 05:31:58.204591 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 05:31:58.204599 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 05:31:58.204610 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 05:31:58.204619 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 05:31:58.204627 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 05:31:58.204636 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 05:31:58.204663 systemd-journald[1210]: Collecting audit messages is disabled.
Sep 9 05:31:58.204683 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 05:31:58.204692 systemd-journald[1210]: Journal started
Sep 9 05:31:58.204712 systemd-journald[1210]: Runtime Journal (/run/log/journal/4db4780dc9494d929b70172a8c6ef8bf) is 4.8M, max 38.6M, 33.7M free.
Sep 9 05:31:57.957954 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 05:31:57.966971 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 9 05:31:57.967445 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 05:31:58.207095 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 05:31:58.209252 kernel: ACPI: bus type drm_connector registered
Sep 9 05:31:58.211417 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 05:31:58.211664 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 05:31:58.212493 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 05:31:58.212722 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 05:31:58.213416 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 05:31:58.213606 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 05:31:58.214315 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 05:31:58.214509 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 05:31:58.215420 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 05:31:58.215686 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 05:31:58.216439 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 05:31:58.216663 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 05:31:58.217437 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 05:31:58.218157 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 05:31:58.218839 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 05:31:58.219704 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 05:31:58.226594 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 05:31:58.230125 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 05:31:58.232503 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 05:31:58.233367 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 05:31:58.233450 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 05:31:58.234641 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 05:31:58.242170 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 05:31:58.242865 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:31:58.245127 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 05:31:58.246707 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 05:31:58.248103 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 05:31:58.248979 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 05:31:58.250104 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 05:31:58.252141 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 05:31:58.255983 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 05:31:58.259587 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 05:31:58.262571 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 05:31:58.263869 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 05:31:58.272187 systemd-journald[1210]: Time spent on flushing to /var/log/journal/4db4780dc9494d929b70172a8c6ef8bf is 25.907ms for 1165 entries.
Sep 9 05:31:58.272187 systemd-journald[1210]: System Journal (/var/log/journal/4db4780dc9494d929b70172a8c6ef8bf) is 8M, max 584.8M, 576.8M free.
Sep 9 05:31:58.311370 systemd-journald[1210]: Received client request to flush runtime journal.
Sep 9 05:31:58.311430 kernel: loop0: detected capacity change from 0 to 128016
Sep 9 05:31:58.273291 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 05:31:58.274298 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 05:31:58.276377 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 05:31:58.297358 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 05:31:58.313357 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 05:31:58.318091 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 05:31:58.327667 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 05:31:58.338305 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 05:31:58.339056 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 05:31:58.343173 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 05:31:58.358093 kernel: loop1: detected capacity change from 0 to 110984
Sep 9 05:31:58.368530 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Sep 9 05:31:58.368820 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Sep 9 05:31:58.375035 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 05:31:58.391050 kernel: loop2: detected capacity change from 0 to 224512
Sep 9 05:31:58.426843 kernel: loop3: detected capacity change from 0 to 8
Sep 9 05:31:58.440044 kernel: loop4: detected capacity change from 0 to 128016
Sep 9 05:31:58.460616 kernel: loop5: detected capacity change from 0 to 110984
Sep 9 05:31:58.480052 kernel: loop6: detected capacity change from 0 to 224512
Sep 9 05:31:58.501974 kernel: loop7: detected capacity change from 0 to 8
Sep 9 05:31:58.502307 (sd-merge)[1277]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 9 05:31:58.504368 (sd-merge)[1277]: Merged extensions into '/usr'.
Sep 9 05:31:58.508223 systemd[1]: Reload requested from client PID 1254 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 05:31:58.508321 systemd[1]: Reloading...
Sep 9 05:31:58.578097 zram_generator::config[1303]: No configuration found.
Sep 9 05:31:58.708468 ldconfig[1249]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 05:31:58.751916 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 05:31:58.752310 systemd[1]: Reloading finished in 243 ms.
Sep 9 05:31:58.763944 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 05:31:58.764838 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 05:31:58.773893 systemd[1]: Starting ensure-sysext.service...
Sep 9 05:31:58.775220 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 05:31:58.791507 systemd[1]: Reload requested from client PID 1346 ('systemctl') (unit ensure-sysext.service)...
Sep 9 05:31:58.791622 systemd[1]: Reloading...
Sep 9 05:31:58.797011 systemd-tmpfiles[1347]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 05:31:58.797276 systemd-tmpfiles[1347]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 05:31:58.797521 systemd-tmpfiles[1347]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 05:31:58.797764 systemd-tmpfiles[1347]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 05:31:58.798443 systemd-tmpfiles[1347]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 05:31:58.798690 systemd-tmpfiles[1347]: ACLs are not supported, ignoring.
Sep 9 05:31:58.798784 systemd-tmpfiles[1347]: ACLs are not supported, ignoring.
Sep 9 05:31:58.804009 systemd-tmpfiles[1347]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 05:31:58.804040 systemd-tmpfiles[1347]: Skipping /boot
Sep 9 05:31:58.809447 systemd-tmpfiles[1347]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 05:31:58.809507 systemd-tmpfiles[1347]: Skipping /boot
Sep 9 05:31:58.836095 zram_generator::config[1374]: No configuration found.
Sep 9 05:31:58.974214 systemd[1]: Reloading finished in 182 ms.
Sep 9 05:31:58.998202 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 05:31:59.002610 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 05:31:59.007143 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 05:31:59.009747 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 05:31:59.012737 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 05:31:59.016234 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 05:31:59.019218 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 05:31:59.021130 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 05:31:59.027461 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:59.028241 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:31:59.030367 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 05:31:59.036920 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 05:31:59.045358 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 05:31:59.047184 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:31:59.047293 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:31:59.047388 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:59.048355 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 05:31:59.048516 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 05:31:59.055219 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 05:31:59.060147 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:59.060311 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:31:59.062140 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 05:31:59.062640 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:31:59.062750 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:31:59.062856 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:59.068048 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:59.068287 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 05:31:59.072414 systemd-udevd[1423]: Using default interface naming scheme 'v255'.
Sep 9 05:31:59.076522 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 05:31:59.077204 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 05:31:59.077286 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 05:31:59.077400 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 9 05:31:59.078730 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 05:31:59.080303 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 05:31:59.082490 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 05:31:59.082623 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 05:31:59.087666 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 05:31:59.088063 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 05:31:59.088994 systemd[1]: Finished ensure-sysext.service.
Sep 9 05:31:59.089886 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 05:31:59.090009 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 05:31:59.091727 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 05:31:59.091780 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 05:31:59.095150 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 05:31:59.097921 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 05:31:59.104474 augenrules[1459]: No rules
Sep 9 05:31:59.105486 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 05:31:59.105651 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 05:31:59.107388 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 05:31:59.107505 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 05:31:59.114102 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 05:31:59.125927 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 05:31:59.127176 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 05:31:59.129373 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 05:31:59.129806 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 05:31:59.140868 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 05:31:59.213008 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 9 05:31:59.254053 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 05:31:59.295051 systemd-networkd[1472]: lo: Link UP
Sep 9 05:31:59.295381 systemd-networkd[1472]: lo: Gained carrier
Sep 9 05:31:59.298573 systemd-networkd[1472]: Enumeration completed
Sep 9 05:31:59.298725 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 05:31:59.298934 systemd-networkd[1472]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:31:59.298938 systemd-networkd[1472]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 05:31:59.299517 systemd-networkd[1472]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:31:59.299522 systemd-networkd[1472]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 05:31:59.299806 systemd-networkd[1472]: eth0: Link UP
Sep 9 05:31:59.299903 systemd-networkd[1472]: eth0: Gained carrier
Sep 9 05:31:59.299914 systemd-networkd[1472]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:31:59.303182 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 05:31:59.305181 systemd-networkd[1472]: eth1: Link UP
Sep 9 05:31:59.305772 systemd-networkd[1472]: eth1: Gained carrier
Sep 9 05:31:59.305837 systemd-networkd[1472]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 05:31:59.310184 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 05:31:59.337104 systemd-networkd[1472]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 9 05:31:59.349598 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 05:31:59.356101 systemd-networkd[1472]: eth0: DHCPv4 address 65.109.237.121/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 9 05:31:59.361103 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 05:31:59.366979 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 05:31:59.376407 systemd-resolved[1422]: Positive Trust Anchors:
Sep 9 05:31:59.379188 systemd-resolved[1422]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 05:31:59.379326 systemd-resolved[1422]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 05:31:59.383448 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 9 05:31:59.384978 systemd-resolved[1422]: Using system hostname 'ci-4452-0-0-n-de00512edc'.
Sep 9 05:31:59.385668 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 05:31:59.387376 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 05:31:59.387851 systemd[1]: Reached target network.target - Network.
Sep 9 05:31:59.388280 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 05:31:59.389093 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 05:31:59.390157 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 05:31:59.390783 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 05:31:59.392092 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 9 05:31:59.392642 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 05:31:59.393346 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 05:31:59.394223 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 05:31:59.395498 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 05:31:59.395529 systemd[1]: Reached target paths.target - Path Units.
Sep 9 05:31:59.395913 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 05:31:59.398163 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 05:31:59.404723 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 05:31:59.406056 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Sep 9 05:31:59.412288 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 05:31:59.413368 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 05:31:59.414240 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 05:31:59.421735 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 05:31:59.422955 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 05:31:59.426267 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 05:31:59.427342 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 05:31:59.435038 kernel: ACPI: button: Power Button [PWRF]
Sep 9 05:31:59.434731 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Sep 9 05:31:59.435470 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 05:31:59.435894 systemd[1]: Reached target basic.target - Basic System.
Sep 9 05:31:59.436804 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 05:31:59.436832 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 05:31:59.439113 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 05:31:59.442782 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 9 05:31:59.446137 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 05:31:59.473347 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Sep 9 05:31:59.473537 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 9 05:31:59.475115 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 05:31:59.479100 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 05:31:59.481578 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 05:31:59.485066 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Sep 9 05:31:59.485104 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Sep 9 05:31:59.483041 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 05:31:59.485453 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 9 05:31:59.490470 kernel: Console: switching to colour dummy device 80x25
Sep 9 05:31:59.490495 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 9 05:31:59.490512 kernel: [drm] features: -context_init
Sep 9 05:31:59.493162 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 05:31:59.493297 kernel: EDAC MC: Ver: 3.0.0
Sep 9 05:31:59.493638 kernel: [drm] number of scanouts: 1
Sep 9 05:31:59.494104 kernel: [drm] number of cap sets: 0
Sep 9 05:31:59.495141 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Sep 9 05:31:59.499470 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 05:31:59.503725 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Sep 9 05:31:59.503776 kernel: Console: switching to colour frame buffer device 160x50
Sep 9 05:31:59.509115 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 9 05:31:59.513972 jq[1542]: false
Sep 9 05:31:59.511297 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Sep 9 05:31:59.515206 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 05:31:59.525492 google_oslogin_nss_cache[1545]: oslogin_cache_refresh[1545]: Refreshing passwd entry cache
Sep 9 05:31:59.523777 oslogin_cache_refresh[1545]: Refreshing passwd entry cache
Sep 9 05:31:59.528044 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 05:31:59.534114 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 05:31:59.534773 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 05:31:59.537046 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 05:31:59.537969 google_oslogin_nss_cache[1545]: oslogin_cache_refresh[1545]: Failure getting users, quitting
Sep 9 05:31:59.537969 google_oslogin_nss_cache[1545]: oslogin_cache_refresh[1545]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 05:31:59.537969 google_oslogin_nss_cache[1545]: oslogin_cache_refresh[1545]: Refreshing group entry cache
Sep 9 05:31:59.537637 oslogin_cache_refresh[1545]: Failure getting users, quitting
Sep 9 05:31:59.537653 oslogin_cache_refresh[1545]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 9 05:31:59.537686 oslogin_cache_refresh[1545]: Refreshing group entry cache
Sep 9 05:31:59.538797 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 05:31:59.540212 google_oslogin_nss_cache[1545]: oslogin_cache_refresh[1545]: Failure getting groups, quitting
Sep 9 05:31:59.540261 oslogin_cache_refresh[1545]: Failure getting groups, quitting
Sep 9 05:31:59.540307 google_oslogin_nss_cache[1545]: oslogin_cache_refresh[1545]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 05:31:59.540635 oslogin_cache_refresh[1545]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 9 05:31:59.544114 extend-filesystems[1543]: Found /dev/sda6
Sep 9 05:31:59.545767 coreos-metadata[1528]: Sep 09 05:31:59.544 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Sep 9 05:31:59.545962 coreos-metadata[1528]: Sep 09 05:31:59.545 INFO Fetch successful
Sep 9 05:31:59.546127 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 05:31:59.549012 extend-filesystems[1543]: Found /dev/sda9
Sep 9 05:31:59.550516 coreos-metadata[1528]: Sep 09 05:31:59.547 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Sep 9 05:31:59.550516 coreos-metadata[1528]: Sep 09 05:31:59.547 INFO Fetch successful
Sep 9 05:31:59.553371 extend-filesystems[1543]: Checking size of /dev/sda9
Sep 9 05:31:59.559367 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 05:31:59.559668 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 05:31:59.559809 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 9 05:31:59.560009 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 9 05:31:59.560200 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 9 05:31:59.563720 systemd[1]: motdgen.service: Deactivated successfully. Sep 9 05:31:59.567416 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 9 05:31:59.574218 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 9 05:31:59.579414 extend-filesystems[1543]: Resized partition /dev/sda9 Sep 9 05:31:59.597570 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 9 05:31:59.576050 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 9 05:31:59.597692 jq[1567]: true Sep 9 05:31:59.597781 extend-filesystems[1580]: resize2fs 1.47.3 (8-Jul-2025) Sep 9 05:31:59.608285 (ntainerd)[1581]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 9 05:31:59.618955 dbus-daemon[1530]: [system] SELinux support is enabled Sep 9 05:31:59.619114 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 9 05:31:59.646889 jq[1594]: true Sep 9 05:31:59.646974 update_engine[1566]: I20250909 05:31:59.628232 1566 main.cc:92] Flatcar Update Engine starting Sep 9 05:31:59.646974 update_engine[1566]: I20250909 05:31:59.638770 1566 update_check_scheduler.cc:74] Next update check in 2m41s Sep 9 05:31:59.622474 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 9 05:31:59.622497 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Sep 9 05:31:59.623842 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 9 05:31:59.623858 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 9 05:31:59.637617 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:31:59.638564 systemd[1]: Started update-engine.service - Update Engine. Sep 9 05:31:59.641642 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 9 05:31:59.646688 systemd-timesyncd[1456]: Contacted time server 185.252.140.126:123 (0.flatcar.pool.ntp.org). Sep 9 05:31:59.646724 systemd-timesyncd[1456]: Initial clock synchronization to Tue 2025-09-09 05:31:59.595547 UTC. Sep 9 05:31:59.658713 tar[1576]: linux-amd64/LICENSE Sep 9 05:31:59.658959 tar[1576]: linux-amd64/helm Sep 9 05:31:59.723040 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 9 05:31:59.730345 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 9 05:31:59.732989 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 9 05:31:59.746457 bash[1619]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:31:59.746263 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 9 05:31:59.746579 extend-filesystems[1580]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 9 05:31:59.746579 extend-filesystems[1580]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 9 05:31:59.746579 extend-filesystems[1580]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 9 05:31:59.747713 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 9 05:31:59.751143 extend-filesystems[1543]: Resized filesystem in /dev/sda9 Sep 9 05:31:59.782801 systemd-logind[1561]: New seat seat0. 
Sep 9 05:31:59.784508 systemd-logind[1561]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 9 05:31:59.814625 systemd[1]: Started systemd-logind.service - User Login Management. Sep 9 05:31:59.817156 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 9 05:31:59.821588 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:31:59.830280 systemd[1]: Starting sshkeys.service... Sep 9 05:31:59.862913 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 9 05:31:59.868228 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 9 05:31:59.878362 sshd_keygen[1595]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 9 05:31:59.902151 locksmithd[1602]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 9 05:31:59.910409 containerd[1581]: time="2025-09-09T05:31:59Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 9 05:31:59.910409 containerd[1581]: time="2025-09-09T05:31:59.909689878Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 9 05:31:59.915907 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 9 05:31:59.920082 systemd[1]: Starting issuegen.service - Generate /run/issue... 
Sep 9 05:31:59.926740 systemd-logind[1561]: Watching system buttons on /dev/input/event3 (Power Button) Sep 9 05:31:59.927083 coreos-metadata[1638]: Sep 09 05:31:59.926 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 9 05:31:59.929819 coreos-metadata[1638]: Sep 09 05:31:59.929 INFO Fetch successful Sep 9 05:31:59.930941 unknown[1638]: wrote ssh authorized keys file for user: core Sep 9 05:31:59.933274 containerd[1581]: time="2025-09-09T05:31:59.932979630Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.357µs" Sep 9 05:31:59.934177 containerd[1581]: time="2025-09-09T05:31:59.933010267Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940080181Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940214553Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940229711Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940251522Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940298280Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940307877Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: 
time="2025-09-09T05:31:59.940443652Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940454713Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940463319Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940469761Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940527189Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941109 containerd[1581]: time="2025-09-09T05:31:59.940673393Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941331 containerd[1581]: time="2025-09-09T05:31:59.940696686Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 9 05:31:59.941331 containerd[1581]: time="2025-09-09T05:31:59.940704211Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 9 05:31:59.941331 containerd[1581]: time="2025-09-09T05:31:59.940725882Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 9 05:31:59.941331 containerd[1581]: 
time="2025-09-09T05:31:59.940891232Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 9 05:31:59.941331 containerd[1581]: time="2025-09-09T05:31:59.940933781Z" level=info msg="metadata content store policy set" policy=shared Sep 9 05:31:59.945304 systemd[1]: issuegen.service: Deactivated successfully. Sep 9 05:31:59.945594 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949177756Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949221137Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949233621Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949242658Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949253178Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949261924Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949273565Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949282743Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949290718Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949298753Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949306507Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949321997Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949405553Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 9 05:31:59.949813 containerd[1581]: time="2025-09-09T05:31:59.949424018Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949436701Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949465836Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949474983Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949484080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949493879Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949501513Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 9 
05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949511262Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949518905Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949527872Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949579409Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949590099Z" level=info msg="Start snapshots syncer" Sep 9 05:31:59.950080 containerd[1581]: time="2025-09-09T05:31:59.949607181Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 9 05:31:59.950248 containerd[1581]: time="2025-09-09T05:31:59.949785676Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 9 05:31:59.950248 containerd[1581]: time="2025-09-09T05:31:59.949823386Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 9 05:31:59.950334 containerd[1581]: time="2025-09-09T05:31:59.949880022Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 9 05:31:59.950334 containerd[1581]: time="2025-09-09T05:31:59.949949623Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 9 05:31:59.950334 containerd[1581]: time="2025-09-09T05:31:59.949975151Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 9 05:31:59.950334 containerd[1581]: time="2025-09-09T05:31:59.949989908Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 9 05:31:59.950334 containerd[1581]: time="2025-09-09T05:31:59.949999387Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 9 05:31:59.950334 containerd[1581]: time="2025-09-09T05:31:59.950007923Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 9 05:31:59.950413 containerd[1581]: time="2025-09-09T05:31:59.950015457Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 9 05:31:59.950427 containerd[1581]: time="2025-09-09T05:31:59.950413763Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 9 05:31:59.950445 containerd[1581]: time="2025-09-09T05:31:59.950435444Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 9 05:31:59.950702 containerd[1581]: time="2025-09-09T05:31:59.950687496Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 9 05:31:59.950974 containerd[1581]: time="2025-09-09T05:31:59.950961440Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 9 05:31:59.951076 containerd[1581]: time="2025-09-09T05:31:59.951063541Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:31:59.951291 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 9 05:31:59.952761 containerd[1581]: time="2025-09-09T05:31:59.952744052Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 9 05:31:59.952855 containerd[1581]: time="2025-09-09T05:31:59.952841364Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:31:59.952909 containerd[1581]: time="2025-09-09T05:31:59.952897389Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 9 05:31:59.952951 containerd[1581]: time="2025-09-09T05:31:59.952941802Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 9 05:31:59.953130 containerd[1581]: time="2025-09-09T05:31:59.953115117Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 9 05:31:59.953318 containerd[1581]: time="2025-09-09T05:31:59.953304463Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 9 05:31:59.953383 containerd[1581]: time="2025-09-09T05:31:59.953372831Z" level=info msg="runtime interface created" Sep 9 05:31:59.953690 containerd[1581]: time="2025-09-09T05:31:59.953532039Z" level=info msg="created NRI interface" Sep 9 05:31:59.953690 containerd[1581]: time="2025-09-09T05:31:59.953548640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 9 05:31:59.953690 containerd[1581]: time="2025-09-09T05:31:59.953560642Z" level=info msg="Connect containerd service" Sep 9 05:31:59.953690 containerd[1581]: time="2025-09-09T05:31:59.953589086Z" level=info msg="using experimental 
NRI integration - disable nri plugin to prevent this" Sep 9 05:31:59.956822 containerd[1581]: time="2025-09-09T05:31:59.956720627Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 05:31:59.963156 update-ssh-keys[1656]: Updated "/home/core/.ssh/authorized_keys" Sep 9 05:31:59.958768 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 9 05:31:59.961401 systemd[1]: Finished sshkeys.service. Sep 9 05:31:59.972814 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:31:59.972945 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:31:59.974735 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:31:59.979237 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:31:59.982634 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 9 05:31:59.994329 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 9 05:31:59.994479 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:32:00.007280 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 9 05:32:00.014873 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 9 05:32:00.022811 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 9 05:32:00.028214 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 9 05:32:00.029908 systemd[1]: Reached target getty.target - Login Prompts. 
Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114729707Z" level=info msg="Start subscribing containerd event" Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114784231Z" level=info msg="Start recovering state" Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114875673Z" level=info msg="Start event monitor" Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114886784Z" level=info msg="Start cni network conf syncer for default" Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114893238Z" level=info msg="Start streaming server" Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114899973Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114905967Z" level=info msg="runtime interface starting up..." Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114911462Z" level=info msg="starting plugins..." Sep 9 05:32:00.116419 containerd[1581]: time="2025-09-09T05:32:00.114922314Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 9 05:32:00.116728 containerd[1581]: time="2025-09-09T05:32:00.116659885Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 9 05:32:00.116824 containerd[1581]: time="2025-09-09T05:32:00.116811416Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 9 05:32:00.117041 containerd[1581]: time="2025-09-09T05:32:00.116923900Z" level=info msg="containerd successfully booted in 0.212125s" Sep 9 05:32:00.117006 systemd[1]: Started containerd.service - containerd container runtime. Sep 9 05:32:00.141911 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 9 05:32:00.231826 tar[1576]: linux-amd64/README.md Sep 9 05:32:00.246711 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Sep 9 05:32:00.447208 systemd-networkd[1472]: eth1: Gained IPv6LL Sep 9 05:32:00.449924 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 9 05:32:00.451538 systemd[1]: Reached target network-online.target - Network is Online. Sep 9 05:32:00.453621 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:00.455549 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 9 05:32:00.481412 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 9 05:32:00.703247 systemd-networkd[1472]: eth0: Gained IPv6LL Sep 9 05:32:01.371622 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:01.375590 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 9 05:32:01.377422 systemd[1]: Startup finished in 2.947s (kernel) + 9.871s (initrd) + 3.916s (userspace) = 16.735s. Sep 9 05:32:01.379349 (kubelet)[1710]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:32:02.007290 kubelet[1710]: E0909 05:32:02.007227 1710 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:32:02.009763 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:32:02.009901 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:32:02.010194 systemd[1]: kubelet.service: Consumed 1.002s CPU time, 263.5M memory peak. Sep 9 05:32:02.902840 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Sep 9 05:32:02.905212 systemd[1]: Started sshd@0-65.109.237.121:22-147.75.109.163:59676.service - OpenSSH per-connection server daemon (147.75.109.163:59676). Sep 9 05:32:03.900229 sshd[1723]: Accepted publickey for core from 147.75.109.163 port 59676 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:32:03.902476 sshd-session[1723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:32:03.908362 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 9 05:32:03.909836 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 9 05:32:03.919277 systemd-logind[1561]: New session 1 of user core. Sep 9 05:32:03.925401 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 9 05:32:03.927792 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 9 05:32:03.943980 (systemd)[1728]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 9 05:32:03.946447 systemd-logind[1561]: New session c1 of user core. Sep 9 05:32:04.078951 systemd[1728]: Queued start job for default target default.target. Sep 9 05:32:04.085861 systemd[1728]: Created slice app.slice - User Application Slice. Sep 9 05:32:04.085928 systemd[1728]: Reached target paths.target - Paths. Sep 9 05:32:04.085969 systemd[1728]: Reached target timers.target - Timers. Sep 9 05:32:04.086971 systemd[1728]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 9 05:32:04.096834 systemd[1728]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 9 05:32:04.096895 systemd[1728]: Reached target sockets.target - Sockets. Sep 9 05:32:04.096923 systemd[1728]: Reached target basic.target - Basic System. Sep 9 05:32:04.096947 systemd[1728]: Reached target default.target - Main User Target. Sep 9 05:32:04.096966 systemd[1728]: Startup finished in 144ms. 
Sep 9 05:32:04.097079 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 9 05:32:04.109139 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 9 05:32:04.798666 systemd[1]: Started sshd@1-65.109.237.121:22-147.75.109.163:59678.service - OpenSSH per-connection server daemon (147.75.109.163:59678). Sep 9 05:32:05.797278 sshd[1739]: Accepted publickey for core from 147.75.109.163 port 59678 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:32:05.798582 sshd-session[1739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:32:05.803830 systemd-logind[1561]: New session 2 of user core. Sep 9 05:32:05.809172 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 9 05:32:06.472950 sshd[1742]: Connection closed by 147.75.109.163 port 59678 Sep 9 05:32:06.473488 sshd-session[1739]: pam_unix(sshd:session): session closed for user core Sep 9 05:32:06.476935 systemd-logind[1561]: Session 2 logged out. Waiting for processes to exit. Sep 9 05:32:06.477082 systemd[1]: sshd@1-65.109.237.121:22-147.75.109.163:59678.service: Deactivated successfully. Sep 9 05:32:06.478536 systemd[1]: session-2.scope: Deactivated successfully. Sep 9 05:32:06.479965 systemd-logind[1561]: Removed session 2. Sep 9 05:32:06.678122 systemd[1]: Started sshd@2-65.109.237.121:22-147.75.109.163:59690.service - OpenSSH per-connection server daemon (147.75.109.163:59690). Sep 9 05:32:07.752857 sshd[1748]: Accepted publickey for core from 147.75.109.163 port 59690 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:32:07.754255 sshd-session[1748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:32:07.759636 systemd-logind[1561]: New session 3 of user core. Sep 9 05:32:07.769151 systemd[1]: Started session-3.scope - Session 3 of User core. 
Sep 9 05:32:08.490298 sshd[1751]: Connection closed by 147.75.109.163 port 59690 Sep 9 05:32:08.491347 sshd-session[1748]: pam_unix(sshd:session): session closed for user core Sep 9 05:32:08.496718 systemd[1]: sshd@2-65.109.237.121:22-147.75.109.163:59690.service: Deactivated successfully. Sep 9 05:32:08.499378 systemd[1]: session-3.scope: Deactivated successfully. Sep 9 05:32:08.501320 systemd-logind[1561]: Session 3 logged out. Waiting for processes to exit. Sep 9 05:32:08.503092 systemd-logind[1561]: Removed session 3. Sep 9 05:32:08.678790 systemd[1]: Started sshd@3-65.109.237.121:22-147.75.109.163:59706.service - OpenSSH per-connection server daemon (147.75.109.163:59706). Sep 9 05:32:09.791191 sshd[1757]: Accepted publickey for core from 147.75.109.163 port 59706 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:32:09.793814 sshd-session[1757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:32:09.800561 systemd-logind[1561]: New session 4 of user core. Sep 9 05:32:09.807251 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 9 05:32:10.530284 sshd[1760]: Connection closed by 147.75.109.163 port 59706 Sep 9 05:32:10.530838 sshd-session[1757]: pam_unix(sshd:session): session closed for user core Sep 9 05:32:10.533684 systemd[1]: sshd@3-65.109.237.121:22-147.75.109.163:59706.service: Deactivated successfully. Sep 9 05:32:10.535830 systemd[1]: session-4.scope: Deactivated successfully. Sep 9 05:32:10.537224 systemd-logind[1561]: Session 4 logged out. Waiting for processes to exit. Sep 9 05:32:10.539165 systemd-logind[1561]: Removed session 4. Sep 9 05:32:10.679124 systemd[1]: Started sshd@4-65.109.237.121:22-147.75.109.163:52124.service - OpenSSH per-connection server daemon (147.75.109.163:52124). 
Sep 9 05:32:11.653117 sshd[1766]: Accepted publickey for core from 147.75.109.163 port 52124 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:32:11.654354 sshd-session[1766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:32:11.659339 systemd-logind[1561]: New session 5 of user core. Sep 9 05:32:11.669144 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 9 05:32:12.025437 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 9 05:32:12.026724 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:12.163503 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:12.170595 (kubelet)[1778]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:32:12.176494 sudo[1777]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 9 05:32:12.176732 sudo[1777]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:32:12.186505 sudo[1777]: pam_unix(sudo:session): session closed for user root Sep 9 05:32:12.220539 kubelet[1778]: E0909 05:32:12.220486 1778 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:32:12.223985 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:32:12.224152 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:32:12.224378 systemd[1]: kubelet.service: Consumed 146ms CPU time, 109.3M memory peak. 
Sep 9 05:32:12.343754 sshd[1769]: Connection closed by 147.75.109.163 port 52124 Sep 9 05:32:12.344695 sshd-session[1766]: pam_unix(sshd:session): session closed for user core Sep 9 05:32:12.348037 systemd[1]: sshd@4-65.109.237.121:22-147.75.109.163:52124.service: Deactivated successfully. Sep 9 05:32:12.349367 systemd[1]: session-5.scope: Deactivated successfully. Sep 9 05:32:12.350237 systemd-logind[1561]: Session 5 logged out. Waiting for processes to exit. Sep 9 05:32:12.351461 systemd-logind[1561]: Removed session 5. Sep 9 05:32:12.511322 systemd[1]: Started sshd@5-65.109.237.121:22-147.75.109.163:52136.service - OpenSSH per-connection server daemon (147.75.109.163:52136). Sep 9 05:32:13.488429 sshd[1791]: Accepted publickey for core from 147.75.109.163 port 52136 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:32:13.489894 sshd-session[1791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:32:13.494799 systemd-logind[1561]: New session 6 of user core. Sep 9 05:32:13.497146 systemd[1]: Started session-6.scope - Session 6 of User core. Sep 9 05:32:14.003461 sudo[1796]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 9 05:32:14.003667 sudo[1796]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:32:14.007489 sudo[1796]: pam_unix(sudo:session): session closed for user root Sep 9 05:32:14.011320 sudo[1795]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 9 05:32:14.011511 sudo[1795]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:32:14.020196 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 9 05:32:14.049661 augenrules[1818]: No rules Sep 9 05:32:14.050523 systemd[1]: audit-rules.service: Deactivated successfully. 
Sep 9 05:32:14.050700 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 9 05:32:14.052015 sudo[1795]: pam_unix(sudo:session): session closed for user root Sep 9 05:32:14.209520 sshd[1794]: Connection closed by 147.75.109.163 port 52136 Sep 9 05:32:14.210051 sshd-session[1791]: pam_unix(sshd:session): session closed for user core Sep 9 05:32:14.212935 systemd[1]: sshd@5-65.109.237.121:22-147.75.109.163:52136.service: Deactivated successfully. Sep 9 05:32:14.214481 systemd[1]: session-6.scope: Deactivated successfully. Sep 9 05:32:14.215994 systemd-logind[1561]: Session 6 logged out. Waiting for processes to exit. Sep 9 05:32:14.216951 systemd-logind[1561]: Removed session 6. Sep 9 05:32:14.380229 systemd[1]: Started sshd@6-65.109.237.121:22-147.75.109.163:52146.service - OpenSSH per-connection server daemon (147.75.109.163:52146). Sep 9 05:32:15.359382 sshd[1827]: Accepted publickey for core from 147.75.109.163 port 52146 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:32:15.360559 sshd-session[1827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:32:15.364972 systemd-logind[1561]: New session 7 of user core. Sep 9 05:32:15.375178 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 9 05:32:15.875057 sudo[1831]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 9 05:32:15.875341 sudo[1831]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 9 05:32:16.213403 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Sep 9 05:32:16.232609 (dockerd)[1848]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 9 05:32:16.563720 dockerd[1848]: time="2025-09-09T05:32:16.563632859Z" level=info msg="Starting up" Sep 9 05:32:16.564748 dockerd[1848]: time="2025-09-09T05:32:16.564669028Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 9 05:32:16.575195 dockerd[1848]: time="2025-09-09T05:32:16.575112959Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 9 05:32:16.627299 dockerd[1848]: time="2025-09-09T05:32:16.627241821Z" level=info msg="Loading containers: start." Sep 9 05:32:16.642056 kernel: Initializing XFRM netlink socket Sep 9 05:32:16.911231 systemd-networkd[1472]: docker0: Link UP Sep 9 05:32:16.916703 dockerd[1848]: time="2025-09-09T05:32:16.916645928Z" level=info msg="Loading containers: done." 
Sep 9 05:32:16.935011 dockerd[1848]: time="2025-09-09T05:32:16.934957335Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 9 05:32:16.935159 dockerd[1848]: time="2025-09-09T05:32:16.935097853Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 9 05:32:16.935250 dockerd[1848]: time="2025-09-09T05:32:16.935215410Z" level=info msg="Initializing buildkit" Sep 9 05:32:16.967187 dockerd[1848]: time="2025-09-09T05:32:16.966994288Z" level=info msg="Completed buildkit initialization" Sep 9 05:32:16.975284 dockerd[1848]: time="2025-09-09T05:32:16.975223913Z" level=info msg="Daemon has completed initialization" Sep 9 05:32:16.975284 dockerd[1848]: time="2025-09-09T05:32:16.975289443Z" level=info msg="API listen on /run/docker.sock" Sep 9 05:32:16.975652 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 9 05:32:18.254485 containerd[1581]: time="2025-09-09T05:32:18.254427469Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\"" Sep 9 05:32:19.163537 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2626727672.mount: Deactivated successfully. 
Sep 9 05:32:20.575389 containerd[1581]: time="2025-09-09T05:32:20.574460674Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:20.575389 containerd[1581]: time="2025-09-09T05:32:20.575360056Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=28800781" Sep 9 05:32:20.576007 containerd[1581]: time="2025-09-09T05:32:20.575989455Z" level=info msg="ImageCreate event name:\"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:20.577918 containerd[1581]: time="2025-09-09T05:32:20.577892699Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:20.578575 containerd[1581]: time="2025-09-09T05:32:20.578556757Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"28797487\" in 2.324093231s" Sep 9 05:32:20.578649 containerd[1581]: time="2025-09-09T05:32:20.578637518Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:0d4edaa48e2f940c934e0f7cfd5209fc85e65ab5e842b980f41263d1764661f1\"" Sep 9 05:32:20.579281 containerd[1581]: time="2025-09-09T05:32:20.579259578Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\"" Sep 9 05:32:22.276001 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Sep 9 05:32:22.278264 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:22.388568 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:22.391324 containerd[1581]: time="2025-09-09T05:32:22.391276181Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:22.392946 containerd[1581]: time="2025-09-09T05:32:22.392908668Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=24784150" Sep 9 05:32:22.394042 containerd[1581]: time="2025-09-09T05:32:22.393950116Z" level=info msg="ImageCreate event name:\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:22.395253 (kubelet)[2123]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:32:22.397935 containerd[1581]: time="2025-09-09T05:32:22.397865441Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:22.399770 containerd[1581]: time="2025-09-09T05:32:22.399730961Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"26387322\" in 1.820444022s" Sep 9 05:32:22.399879 containerd[1581]: time="2025-09-09T05:32:22.399864925Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference 
\"sha256:b248d0b0c74ad8230e0bae0cbed477560e8a1e8c7ef5f29b7e75c1f273c8a091\"" Sep 9 05:32:22.400578 containerd[1581]: time="2025-09-09T05:32:22.400548467Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\"" Sep 9 05:32:22.431314 kubelet[2123]: E0909 05:32:22.431250 2123 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:32:22.433552 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:32:22.433664 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:32:22.433907 systemd[1]: kubelet.service: Consumed 110ms CPU time, 108.5M memory peak. Sep 9 05:32:23.920322 containerd[1581]: time="2025-09-09T05:32:23.920245732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:23.921850 containerd[1581]: time="2025-09-09T05:32:23.921770731Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=19175058" Sep 9 05:32:23.924652 containerd[1581]: time="2025-09-09T05:32:23.922996423Z" level=info msg="ImageCreate event name:\"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:23.926056 containerd[1581]: time="2025-09-09T05:32:23.926000133Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:23.927644 containerd[1581]: time="2025-09-09T05:32:23.927612902Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with 
image id \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"20778248\" in 1.526963991s" Sep 9 05:32:23.927765 containerd[1581]: time="2025-09-09T05:32:23.927745458Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:2ac266f06c9a5a3d0d20ae482dbccb54d3be454d5ca49f48b528bdf5bae3e908\"" Sep 9 05:32:23.928754 containerd[1581]: time="2025-09-09T05:32:23.928698798Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\"" Sep 9 05:32:25.168462 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount654048939.mount: Deactivated successfully. Sep 9 05:32:25.452986 containerd[1581]: time="2025-09-09T05:32:25.452896913Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:25.454057 containerd[1581]: time="2025-09-09T05:32:25.454037843Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=30897198" Sep 9 05:32:25.455871 containerd[1581]: time="2025-09-09T05:32:25.455484482Z" level=info msg="ImageCreate event name:\"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:25.458131 containerd[1581]: time="2025-09-09T05:32:25.458114668Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:25.458340 containerd[1581]: time="2025-09-09T05:32:25.458310479Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\", repo tag 
\"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"30896189\" in 1.529568576s" Sep 9 05:32:25.458531 containerd[1581]: time="2025-09-09T05:32:25.458343193Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:d7b94972d43c5d6ce8088a8bcd08614a5ecf2bf04166232c688adcd0b8ed4b12\"" Sep 9 05:32:25.460935 containerd[1581]: time="2025-09-09T05:32:25.460905759Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Sep 9 05:32:26.199898 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2655480731.mount: Deactivated successfully. Sep 9 05:32:26.872865 containerd[1581]: time="2025-09-09T05:32:26.872804187Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:26.873609 containerd[1581]: time="2025-09-09T05:32:26.873578975Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335" Sep 9 05:32:26.874247 containerd[1581]: time="2025-09-09T05:32:26.874209228Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:26.876865 containerd[1581]: time="2025-09-09T05:32:26.876072744Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:26.876865 containerd[1581]: time="2025-09-09T05:32:26.876750070Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.415820741s" Sep 9 05:32:26.876865 containerd[1581]: time="2025-09-09T05:32:26.876772140Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Sep 9 05:32:26.877299 containerd[1581]: time="2025-09-09T05:32:26.877211321Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 9 05:32:27.543032 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2988827911.mount: Deactivated successfully. Sep 9 05:32:27.547660 containerd[1581]: time="2025-09-09T05:32:27.547618457Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:32:27.548588 containerd[1581]: time="2025-09-09T05:32:27.548554606Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Sep 9 05:32:27.549940 containerd[1581]: time="2025-09-09T05:32:27.549382464Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:32:27.551933 containerd[1581]: time="2025-09-09T05:32:27.551911731Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 9 05:32:27.552513 containerd[1581]: time="2025-09-09T05:32:27.552483531Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag 
\"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 675.072026ms" Sep 9 05:32:27.552560 containerd[1581]: time="2025-09-09T05:32:27.552516937Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Sep 9 05:32:27.552960 containerd[1581]: time="2025-09-09T05:32:27.552912801Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Sep 9 05:32:28.311974 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount904807752.mount: Deactivated successfully. Sep 9 05:32:29.716881 containerd[1581]: time="2025-09-09T05:32:29.716817970Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:29.717686 containerd[1581]: time="2025-09-09T05:32:29.717652525Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57682132" Sep 9 05:32:29.718656 containerd[1581]: time="2025-09-09T05:32:29.718303013Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:29.720739 containerd[1581]: time="2025-09-09T05:32:29.720710989Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:29.721659 containerd[1581]: time="2025-09-09T05:32:29.721630148Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size 
\"57680541\" in 2.168361549s" Sep 9 05:32:29.721731 containerd[1581]: time="2025-09-09T05:32:29.721719738Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Sep 9 05:32:32.525430 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 9 05:32:32.528168 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:32.646626 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:32.649631 (kubelet)[2281]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 9 05:32:32.690690 kubelet[2281]: E0909 05:32:32.690650 2281 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 9 05:32:32.692371 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 9 05:32:32.692484 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 9 05:32:32.693174 systemd[1]: kubelet.service: Consumed 110ms CPU time, 109.2M memory peak. Sep 9 05:32:32.774470 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:32.774763 systemd[1]: kubelet.service: Consumed 110ms CPU time, 109.2M memory peak. Sep 9 05:32:32.780489 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:32.799438 systemd[1]: Reload requested from client PID 2295 ('systemctl') (unit session-7.scope)... Sep 9 05:32:32.799527 systemd[1]: Reloading... Sep 9 05:32:32.887092 zram_generator::config[2338]: No configuration found. Sep 9 05:32:33.056193 systemd[1]: Reloading finished in 256 ms. 
Sep 9 05:32:33.104674 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 9 05:32:33.104753 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 9 05:32:33.104942 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:33.104977 systemd[1]: kubelet.service: Consumed 73ms CPU time, 98.4M memory peak. Sep 9 05:32:33.106231 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:33.215005 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:33.220729 (kubelet)[2392]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:32:33.262706 kubelet[2392]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:32:33.262706 kubelet[2392]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 05:32:33.262706 kubelet[2392]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 05:32:33.262999 kubelet[2392]: I0909 05:32:33.262797 2392 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:32:33.657480 kubelet[2392]: I0909 05:32:33.657423 2392 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 05:32:33.657480 kubelet[2392]: I0909 05:32:33.657455 2392 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:32:33.657746 kubelet[2392]: I0909 05:32:33.657709 2392 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 05:32:33.690241 kubelet[2392]: I0909 05:32:33.689507 2392 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:32:33.693256 kubelet[2392]: E0909 05:32:33.692207 2392 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://65.109.237.121:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 65.109.237.121:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:32:33.704332 kubelet[2392]: I0909 05:32:33.704299 2392 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:32:33.711587 kubelet[2392]: I0909 05:32:33.711561 2392 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 05:32:33.715096 kubelet[2392]: I0909 05:32:33.715038 2392 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:32:33.715333 kubelet[2392]: I0909 05:32:33.715090 2392 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452-0-0-n-de00512edc","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:32:33.717696 kubelet[2392]: I0909 05:32:33.717652 2392 topology_manager.go:138] "Creating topology manager with 
none policy" Sep 9 05:32:33.717696 kubelet[2392]: I0909 05:32:33.717686 2392 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 05:32:33.719313 kubelet[2392]: I0909 05:32:33.719280 2392 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:32:33.724176 kubelet[2392]: I0909 05:32:33.724077 2392 kubelet.go:446] "Attempting to sync node with API server" Sep 9 05:32:33.724176 kubelet[2392]: I0909 05:32:33.724117 2392 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:32:33.724176 kubelet[2392]: I0909 05:32:33.724144 2392 kubelet.go:352] "Adding apiserver pod source" Sep 9 05:32:33.724176 kubelet[2392]: I0909 05:32:33.724158 2392 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:32:33.732418 kubelet[2392]: W0909 05:32:33.731668 2392 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://65.109.237.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452-0-0-n-de00512edc&limit=500&resourceVersion=0": dial tcp 65.109.237.121:6443: connect: connection refused Sep 9 05:32:33.732418 kubelet[2392]: E0909 05:32:33.731746 2392 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://65.109.237.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452-0-0-n-de00512edc&limit=500&resourceVersion=0\": dial tcp 65.109.237.121:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:32:33.732418 kubelet[2392]: W0909 05:32:33.732154 2392 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://65.109.237.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 65.109.237.121:6443: connect: connection refused Sep 9 05:32:33.732418 kubelet[2392]: E0909 05:32:33.732194 2392 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://65.109.237.121:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 65.109.237.121:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:32:33.734278 kubelet[2392]: I0909 05:32:33.734259 2392 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:32:33.737988 kubelet[2392]: I0909 05:32:33.737961 2392 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 05:32:33.738704 kubelet[2392]: W0909 05:32:33.738689 2392 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Sep 9 05:32:33.742117 kubelet[2392]: I0909 05:32:33.741571 2392 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 05:32:33.742117 kubelet[2392]: I0909 05:32:33.741626 2392 server.go:1287] "Started kubelet" Sep 9 05:32:33.743540 kubelet[2392]: I0909 05:32:33.742488 2392 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:32:33.745138 kubelet[2392]: I0909 05:32:33.744145 2392 server.go:479] "Adding debug handlers to kubelet server" Sep 9 05:32:33.750744 kubelet[2392]: I0909 05:32:33.750671 2392 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:32:33.751063 kubelet[2392]: I0909 05:32:33.751012 2392 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:32:33.751317 kubelet[2392]: I0909 05:32:33.751293 2392 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:32:33.755310 kubelet[2392]: I0909 05:32:33.755269 2392 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:32:33.756105 kubelet[2392]: E0909 
05:32:33.753242 2392 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://65.109.237.121:6443/api/v1/namespaces/default/events\": dial tcp 65.109.237.121:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4452-0-0-n-de00512edc.1863864f107a0ec8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4452-0-0-n-de00512edc,UID:ci-4452-0-0-n-de00512edc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4452-0-0-n-de00512edc,},FirstTimestamp:2025-09-09 05:32:33.741598408 +0000 UTC m=+0.517128022,LastTimestamp:2025-09-09 05:32:33.741598408 +0000 UTC m=+0.517128022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4452-0-0-n-de00512edc,}" Sep 9 05:32:33.760347 kubelet[2392]: I0909 05:32:33.760312 2392 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 05:32:33.760570 kubelet[2392]: E0909 05:32:33.760513 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:33.765077 kubelet[2392]: I0909 05:32:33.765015 2392 factory.go:221] Registration of the systemd container factory successfully Sep 9 05:32:33.766395 kubelet[2392]: I0909 05:32:33.765321 2392 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:32:33.766395 kubelet[2392]: E0909 05:32:33.765779 2392 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.109.237.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452-0-0-n-de00512edc?timeout=10s\": dial tcp 65.109.237.121:6443: connect: connection refused" interval="200ms" Sep 9 05:32:33.768504 kubelet[2392]: 
I0909 05:32:33.768479 2392 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:32:33.772234 kubelet[2392]: I0909 05:32:33.769001 2392 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 05:32:33.772234 kubelet[2392]: W0909 05:32:33.769429 2392 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://65.109.237.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 65.109.237.121:6443: connect: connection refused Sep 9 05:32:33.772234 kubelet[2392]: E0909 05:32:33.769476 2392 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://65.109.237.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 65.109.237.121:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:32:33.772234 kubelet[2392]: I0909 05:32:33.769737 2392 factory.go:221] Registration of the containerd container factory successfully Sep 9 05:32:33.780053 kubelet[2392]: I0909 05:32:33.779287 2392 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 05:32:33.780171 kubelet[2392]: I0909 05:32:33.780062 2392 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 05:32:33.780171 kubelet[2392]: I0909 05:32:33.780080 2392 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 05:32:33.780171 kubelet[2392]: I0909 05:32:33.780096 2392 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 9 05:32:33.780171 kubelet[2392]: I0909 05:32:33.780101 2392 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 05:32:33.780171 kubelet[2392]: E0909 05:32:33.780135 2392 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:32:33.788364 kubelet[2392]: W0909 05:32:33.788316 2392 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://65.109.237.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 65.109.237.121:6443: connect: connection refused Sep 9 05:32:33.788435 kubelet[2392]: E0909 05:32:33.788391 2392 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://65.109.237.121:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 65.109.237.121:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:32:33.792874 kubelet[2392]: E0909 05:32:33.792822 2392 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 05:32:33.798329 kubelet[2392]: I0909 05:32:33.798307 2392 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 05:32:33.798515 kubelet[2392]: I0909 05:32:33.798497 2392 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 05:32:33.798628 kubelet[2392]: I0909 05:32:33.798612 2392 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:32:33.800775 kubelet[2392]: I0909 05:32:33.800752 2392 policy_none.go:49] "None policy: Start" Sep 9 05:32:33.801285 kubelet[2392]: I0909 05:32:33.800920 2392 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 05:32:33.801285 kubelet[2392]: I0909 05:32:33.800946 2392 state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:32:33.807473 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 05:32:33.816605 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 9 05:32:33.820560 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 05:32:33.841894 kubelet[2392]: I0909 05:32:33.841857 2392 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:32:33.842254 kubelet[2392]: I0909 05:32:33.842061 2392 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:32:33.842254 kubelet[2392]: I0909 05:32:33.842077 2392 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:32:33.842491 kubelet[2392]: I0909 05:32:33.842474 2392 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:32:33.844791 kubelet[2392]: E0909 05:32:33.844734 2392 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 9 05:32:33.845013 kubelet[2392]: E0909 05:32:33.844967 2392 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:33.895008 systemd[1]: Created slice kubepods-burstable-podea2e3e2e3f0d27b731b4b6cda3d70855.slice - libcontainer container kubepods-burstable-podea2e3e2e3f0d27b731b4b6cda3d70855.slice. Sep 9 05:32:33.916043 kubelet[2392]: E0909 05:32:33.914289 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:33.919420 systemd[1]: Created slice kubepods-burstable-pod54be5da420ebbb69fe2357682547bd05.slice - libcontainer container kubepods-burstable-pod54be5da420ebbb69fe2357682547bd05.slice. Sep 9 05:32:33.923573 kubelet[2392]: E0909 05:32:33.923521 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:33.929090 systemd[1]: Created slice kubepods-burstable-podabf10bac82dabdf156ba0ebc027d4dc0.slice - libcontainer container kubepods-burstable-podabf10bac82dabdf156ba0ebc027d4dc0.slice. 
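The `Created slice kubepods-burstable-pod<uid>.slice` entries above show the systemd cgroup driver at work: each static pod gets a slice under its QoS-class parent. A minimal sketch of how those names are composed (a hypothetical helper, not kubelet source; the dash-to-underscore escaping of the UID is my assumption about systemd unit-name rules, hedged in the comment):

```python
def pod_slice_name(qos_class: str, pod_uid: str) -> str:
    """Reconstruct the systemd slice name for a pod cgroup, as seen in
    the log above (e.g. kubepods-burstable-podea2e3e2e....slice)."""
    # Guaranteed pods sit directly under kubepods.slice; burstable and
    # besteffort pods get a per-QoS parent slice, matching the
    # kubepods-burstable.slice / kubepods-besteffort.slice entries above.
    parent = "kubepods" if qos_class == "guaranteed" else f"kubepods-{qos_class}"
    # Assumption: dashes inside the UID are escaped to underscores so the
    # result is a valid systemd unit name (the UIDs in this log happen to
    # contain no dashes, so this is not observable here).
    return f"{parent}-pod{pod_uid.replace('-', '_')}.slice"

print(pod_slice_name("burstable", "ea2e3e2e3f0d27b731b4b6cda3d70855"))
```

The printed name matches the slice systemd reports creating for the kube-apiserver static pod a few entries below.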
Sep 9 05:32:33.931699 kubelet[2392]: E0909 05:32:33.931638 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:33.944898 kubelet[2392]: I0909 05:32:33.944831 2392 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:33.945558 kubelet[2392]: E0909 05:32:33.945474 2392 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.109.237.121:6443/api/v1/nodes\": dial tcp 65.109.237.121:6443: connect: connection refused" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:33.967516 kubelet[2392]: E0909 05:32:33.967454 2392 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.109.237.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452-0-0-n-de00512edc?timeout=10s\": dial tcp 65.109.237.121:6443: connect: connection refused" interval="400ms" Sep 9 05:32:34.071622 kubelet[2392]: I0909 05:32:34.071521 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ea2e3e2e3f0d27b731b4b6cda3d70855-ca-certs\") pod \"kube-apiserver-ci-4452-0-0-n-de00512edc\" (UID: \"ea2e3e2e3f0d27b731b4b6cda3d70855\") " pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.071622 kubelet[2392]: I0909 05:32:34.071580 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ea2e3e2e3f0d27b731b4b6cda3d70855-k8s-certs\") pod \"kube-apiserver-ci-4452-0-0-n-de00512edc\" (UID: \"ea2e3e2e3f0d27b731b4b6cda3d70855\") " pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.071622 kubelet[2392]: I0909 05:32:34.071610 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ea2e3e2e3f0d27b731b4b6cda3d70855-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452-0-0-n-de00512edc\" (UID: \"ea2e3e2e3f0d27b731b4b6cda3d70855\") " pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.072170 kubelet[2392]: I0909 05:32:34.071635 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.072170 kubelet[2392]: I0909 05:32:34.071677 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-ca-certs\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.072170 kubelet[2392]: I0909 05:32:34.071701 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-flexvolume-dir\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.072170 kubelet[2392]: I0909 05:32:34.071724 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-k8s-certs\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " 
pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.072170 kubelet[2392]: I0909 05:32:34.071770 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-kubeconfig\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.072577 kubelet[2392]: I0909 05:32:34.071793 2392 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/abf10bac82dabdf156ba0ebc027d4dc0-kubeconfig\") pod \"kube-scheduler-ci-4452-0-0-n-de00512edc\" (UID: \"abf10bac82dabdf156ba0ebc027d4dc0\") " pod="kube-system/kube-scheduler-ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.147564 kubelet[2392]: I0909 05:32:34.147522 2392 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.148053 kubelet[2392]: E0909 05:32:34.147976 2392 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.109.237.121:6443/api/v1/nodes\": dial tcp 65.109.237.121:6443: connect: connection refused" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.216009 containerd[1581]: time="2025-09-09T05:32:34.215863767Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452-0-0-n-de00512edc,Uid:ea2e3e2e3f0d27b731b4b6cda3d70855,Namespace:kube-system,Attempt:0,}" Sep 9 05:32:34.230130 containerd[1581]: time="2025-09-09T05:32:34.230003439Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452-0-0-n-de00512edc,Uid:54be5da420ebbb69fe2357682547bd05,Namespace:kube-system,Attempt:0,}" Sep 9 05:32:34.234271 containerd[1581]: time="2025-09-09T05:32:34.234223853Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4452-0-0-n-de00512edc,Uid:abf10bac82dabdf156ba0ebc027d4dc0,Namespace:kube-system,Attempt:0,}" Sep 9 05:32:34.322602 containerd[1581]: time="2025-09-09T05:32:34.322428003Z" level=info msg="connecting to shim 32e000aac59d38f320c350dada4b5717be5256f76e16fb6acc6554d240393352" address="unix:///run/containerd/s/f7f9e1e538a74654ff0e4e13c9cb53b350c8c55942ecaedd98d1b9d8d62af297" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:32:34.326864 containerd[1581]: time="2025-09-09T05:32:34.326202726Z" level=info msg="connecting to shim 507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6" address="unix:///run/containerd/s/64d49d464caec475b2c106644615aac6eea6c2985e0a6e1c3dd655a7dd2b1524" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:32:34.331088 containerd[1581]: time="2025-09-09T05:32:34.331063195Z" level=info msg="connecting to shim 43d8dc4b5194fec02a885ac98bda0561ef86e429c427b768135ab1ee1f6bfeeb" address="unix:///run/containerd/s/5f6309c74fc0d5f6e3c8b6c27a7a57dc69ae00f1c2bd93929f22f07933481844" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:32:34.368642 kubelet[2392]: E0909 05:32:34.368587 2392 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.109.237.121:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4452-0-0-n-de00512edc?timeout=10s\": dial tcp 65.109.237.121:6443: connect: connection refused" interval="800ms" Sep 9 05:32:34.414152 systemd[1]: Started cri-containerd-32e000aac59d38f320c350dada4b5717be5256f76e16fb6acc6554d240393352.scope - libcontainer container 32e000aac59d38f320c350dada4b5717be5256f76e16fb6acc6554d240393352. Sep 9 05:32:34.415689 systemd[1]: Started cri-containerd-43d8dc4b5194fec02a885ac98bda0561ef86e429c427b768135ab1ee1f6bfeeb.scope - libcontainer container 43d8dc4b5194fec02a885ac98bda0561ef86e429c427b768135ab1ee1f6bfeeb. 
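Note how the lease controller's "will retry" interval doubles across the entries above: `interval="200ms"` at 05:32:33.765, `"400ms"` at 05:32:33.967, `"800ms"` at 05:32:34.368. A minimal sketch of that doubling backoff, assuming a cap value; the 7s ceiling is my assumption and is not observable in this log:

```python
def lease_retry_intervals(attempts: int, base_ms: int = 200, cap_ms: int = 7000):
    """Yield the doubling retry intervals the kubelet logs while the
    apiserver is unreachable (200ms -> 400ms -> 800ms -> ...)."""
    interval = base_ms
    for _ in range(attempts):
        yield min(interval, cap_ms)
        interval *= 2  # each failed attempt doubles the wait, up to the cap

print(list(lease_retry_intervals(3)))
```

The first three values reproduce exactly the intervals logged above; later attempts would plateau at the (assumed) cap rather than growing without bound.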
Sep 9 05:32:34.417001 systemd[1]: Started cri-containerd-507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6.scope - libcontainer container 507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6. Sep 9 05:32:34.505447 containerd[1581]: time="2025-09-09T05:32:34.505352185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4452-0-0-n-de00512edc,Uid:ea2e3e2e3f0d27b731b4b6cda3d70855,Namespace:kube-system,Attempt:0,} returns sandbox id \"32e000aac59d38f320c350dada4b5717be5256f76e16fb6acc6554d240393352\"" Sep 9 05:32:34.505856 containerd[1581]: time="2025-09-09T05:32:34.505760236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4452-0-0-n-de00512edc,Uid:54be5da420ebbb69fe2357682547bd05,Namespace:kube-system,Attempt:0,} returns sandbox id \"507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6\"" Sep 9 05:32:34.511227 containerd[1581]: time="2025-09-09T05:32:34.510828352Z" level=info msg="CreateContainer within sandbox \"32e000aac59d38f320c350dada4b5717be5256f76e16fb6acc6554d240393352\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 05:32:34.512864 containerd[1581]: time="2025-09-09T05:32:34.512846964Z" level=info msg="CreateContainer within sandbox \"507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 05:32:34.514217 containerd[1581]: time="2025-09-09T05:32:34.514154179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4452-0-0-n-de00512edc,Uid:abf10bac82dabdf156ba0ebc027d4dc0,Namespace:kube-system,Attempt:0,} returns sandbox id \"43d8dc4b5194fec02a885ac98bda0561ef86e429c427b768135ab1ee1f6bfeeb\"" Sep 9 05:32:34.517220 containerd[1581]: time="2025-09-09T05:32:34.516757841Z" level=info msg="CreateContainer within sandbox \"43d8dc4b5194fec02a885ac98bda0561ef86e429c427b768135ab1ee1f6bfeeb\" for container 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 05:32:34.523386 containerd[1581]: time="2025-09-09T05:32:34.523353047Z" level=info msg="Container b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:32:34.525315 containerd[1581]: time="2025-09-09T05:32:34.525120234Z" level=info msg="Container 2816a01197e8189483942238d8a3b0fc2471695b233846a41ccd37929f09aa5a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:32:34.525544 containerd[1581]: time="2025-09-09T05:32:34.525513893Z" level=info msg="Container 81fc8cfa6a52a51231818dfb4ff95b445793400078a19e677c150ebef8218c63: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:32:34.535123 containerd[1581]: time="2025-09-09T05:32:34.535006813Z" level=info msg="CreateContainer within sandbox \"507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0\"" Sep 9 05:32:34.536665 containerd[1581]: time="2025-09-09T05:32:34.536633569Z" level=info msg="StartContainer for \"b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0\"" Sep 9 05:32:34.538046 containerd[1581]: time="2025-09-09T05:32:34.538000889Z" level=info msg="CreateContainer within sandbox \"32e000aac59d38f320c350dada4b5717be5256f76e16fb6acc6554d240393352\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"2816a01197e8189483942238d8a3b0fc2471695b233846a41ccd37929f09aa5a\"" Sep 9 05:32:34.538364 containerd[1581]: time="2025-09-09T05:32:34.538343357Z" level=info msg="connecting to shim b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0" address="unix:///run/containerd/s/64d49d464caec475b2c106644615aac6eea6c2985e0a6e1c3dd655a7dd2b1524" protocol=ttrpc version=3 Sep 9 05:32:34.540037 containerd[1581]: time="2025-09-09T05:32:34.538917800Z" level=info msg="StartContainer for 
\"2816a01197e8189483942238d8a3b0fc2471695b233846a41ccd37929f09aa5a\"" Sep 9 05:32:34.540037 containerd[1581]: time="2025-09-09T05:32:34.539760184Z" level=info msg="connecting to shim 2816a01197e8189483942238d8a3b0fc2471695b233846a41ccd37929f09aa5a" address="unix:///run/containerd/s/f7f9e1e538a74654ff0e4e13c9cb53b350c8c55942ecaedd98d1b9d8d62af297" protocol=ttrpc version=3 Sep 9 05:32:34.541632 containerd[1581]: time="2025-09-09T05:32:34.541577590Z" level=info msg="CreateContainer within sandbox \"43d8dc4b5194fec02a885ac98bda0561ef86e429c427b768135ab1ee1f6bfeeb\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"81fc8cfa6a52a51231818dfb4ff95b445793400078a19e677c150ebef8218c63\"" Sep 9 05:32:34.541876 containerd[1581]: time="2025-09-09T05:32:34.541859704Z" level=info msg="StartContainer for \"81fc8cfa6a52a51231818dfb4ff95b445793400078a19e677c150ebef8218c63\"" Sep 9 05:32:34.542584 containerd[1581]: time="2025-09-09T05:32:34.542566213Z" level=info msg="connecting to shim 81fc8cfa6a52a51231818dfb4ff95b445793400078a19e677c150ebef8218c63" address="unix:///run/containerd/s/5f6309c74fc0d5f6e3c8b6c27a7a57dc69ae00f1c2bd93929f22f07933481844" protocol=ttrpc version=3 Sep 9 05:32:34.550424 kubelet[2392]: I0909 05:32:34.550403 2392 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.551581 kubelet[2392]: E0909 05:32:34.551549 2392 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.109.237.121:6443/api/v1/nodes\": dial tcp 65.109.237.121:6443: connect: connection refused" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.557295 systemd[1]: Started cri-containerd-2816a01197e8189483942238d8a3b0fc2471695b233846a41ccd37929f09aa5a.scope - libcontainer container 2816a01197e8189483942238d8a3b0fc2471695b233846a41ccd37929f09aa5a. 
Sep 9 05:32:34.569349 systemd[1]: Started cri-containerd-b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0.scope - libcontainer container b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0. Sep 9 05:32:34.577247 systemd[1]: Started cri-containerd-81fc8cfa6a52a51231818dfb4ff95b445793400078a19e677c150ebef8218c63.scope - libcontainer container 81fc8cfa6a52a51231818dfb4ff95b445793400078a19e677c150ebef8218c63. Sep 9 05:32:34.628098 containerd[1581]: time="2025-09-09T05:32:34.628014463Z" level=info msg="StartContainer for \"2816a01197e8189483942238d8a3b0fc2471695b233846a41ccd37929f09aa5a\" returns successfully" Sep 9 05:32:34.632142 kubelet[2392]: W0909 05:32:34.631549 2392 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://65.109.237.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452-0-0-n-de00512edc&limit=500&resourceVersion=0": dial tcp 65.109.237.121:6443: connect: connection refused Sep 9 05:32:34.632142 kubelet[2392]: E0909 05:32:34.632099 2392 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://65.109.237.121:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4452-0-0-n-de00512edc&limit=500&resourceVersion=0\": dial tcp 65.109.237.121:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:32:34.637278 containerd[1581]: time="2025-09-09T05:32:34.637248276Z" level=info msg="StartContainer for \"b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0\" returns successfully" Sep 9 05:32:34.674422 containerd[1581]: time="2025-09-09T05:32:34.674381939Z" level=info msg="StartContainer for \"81fc8cfa6a52a51231818dfb4ff95b445793400078a19e677c150ebef8218c63\" returns successfully" Sep 9 05:32:34.806546 kubelet[2392]: E0909 05:32:34.806375 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.806750 kubelet[2392]: W0909 05:32:34.806719 2392 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://65.109.237.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 65.109.237.121:6443: connect: connection refused Sep 9 05:32:34.806833 kubelet[2392]: E0909 05:32:34.806817 2392 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://65.109.237.121:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 65.109.237.121:6443: connect: connection refused" logger="UnhandledError" Sep 9 05:32:34.809288 kubelet[2392]: E0909 05:32:34.809250 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:34.812601 kubelet[2392]: E0909 05:32:34.812484 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:35.354375 kubelet[2392]: I0909 05:32:35.354293 2392 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:35.817540 kubelet[2392]: E0909 05:32:35.817494 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:35.818324 kubelet[2392]: E0909 05:32:35.818111 2392 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:35.961951 kubelet[2392]: E0909 05:32:35.961913 2392 nodelease.go:49] "Failed to get 
node when trying to set owner ref to the node lease" err="nodes \"ci-4452-0-0-n-de00512edc\" not found" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:36.017005 kubelet[2392]: I0909 05:32:36.016888 2392 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:36.017005 kubelet[2392]: E0909 05:32:36.016918 2392 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4452-0-0-n-de00512edc\": node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.035235 kubelet[2392]: E0909 05:32:36.035206 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.137057 kubelet[2392]: E0909 05:32:36.135835 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.236806 kubelet[2392]: E0909 05:32:36.236708 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.336963 kubelet[2392]: E0909 05:32:36.336901 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.437329 kubelet[2392]: E0909 05:32:36.437139 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.538125 kubelet[2392]: E0909 05:32:36.538077 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.638920 kubelet[2392]: E0909 05:32:36.638873 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.739129 kubelet[2392]: E0909 05:32:36.738986 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node 
\"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.839222 kubelet[2392]: E0909 05:32:36.839182 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:36.939494 kubelet[2392]: E0909 05:32:36.939427 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:37.040233 kubelet[2392]: E0909 05:32:37.040186 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:37.140973 kubelet[2392]: E0909 05:32:37.140928 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:37.241121 kubelet[2392]: E0909 05:32:37.241079 2392 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4452-0-0-n-de00512edc\" not found" Sep 9 05:32:37.361723 kubelet[2392]: I0909 05:32:37.361615 2392 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:37.370192 kubelet[2392]: I0909 05:32:37.370162 2392 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452-0-0-n-de00512edc" Sep 9 05:32:37.374242 kubelet[2392]: I0909 05:32:37.374040 2392 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:37.734064 kubelet[2392]: I0909 05:32:37.733935 2392 apiserver.go:52] "Watching apiserver" Sep 9 05:32:37.769721 kubelet[2392]: I0909 05:32:37.769691 2392 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 05:32:38.175088 systemd[1]: Reload requested from client PID 2663 ('systemctl') (unit session-7.scope)... Sep 9 05:32:38.175110 systemd[1]: Reloading... 
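Almost every error above is the same root cause: `dial tcp 65.109.237.121:6443: connect: connection refused`, i.e. the kubelet racing the apiserver it is itself about to start as a static pod. A small sketch for tallying those dials per endpoint when triaging a log like this (the regex and function are mine, not from any kubelet or journalctl tooling):

```python
import re
from collections import Counter

# Matches the Go net dialer's refusal message as it appears in the log.
DIAL_REFUSED = re.compile(r"dial tcp ([0-9.]+:[0-9]+): connect: connection refused")

def tally_refused(lines):
    """Count connection-refused dials per target host:port."""
    counts = Counter()
    for line in lines:
        counts.update(DIAL_REFUSED.findall(line))
    return counts
```

Run over this section, every hit would land on `65.109.237.121:6443`, which is what tells you the failures are one unreachable apiserver rather than many distinct problems.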
Sep 9 05:32:38.258078 zram_generator::config[2704]: No configuration found. Sep 9 05:32:38.449449 systemd[1]: Reloading finished in 273 ms. Sep 9 05:32:38.486982 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:38.505579 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 05:32:38.505810 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:38.505881 systemd[1]: kubelet.service: Consumed 860ms CPU time, 127.9M memory peak. Sep 9 05:32:38.508581 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 05:32:38.642645 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 05:32:38.650711 (kubelet)[2758]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 05:32:38.712352 kubelet[2758]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 05:32:38.712352 kubelet[2758]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 05:32:38.712352 kubelet[2758]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 9 05:32:38.712352 kubelet[2758]: I0909 05:32:38.712152 2758 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 05:32:38.719243 kubelet[2758]: I0909 05:32:38.719209 2758 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 05:32:38.720047 kubelet[2758]: I0909 05:32:38.719332 2758 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 05:32:38.720047 kubelet[2758]: I0909 05:32:38.719549 2758 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 05:32:38.721964 kubelet[2758]: I0909 05:32:38.721945 2758 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 9 05:32:38.726861 kubelet[2758]: I0909 05:32:38.726577 2758 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 05:32:38.730549 kubelet[2758]: I0909 05:32:38.730520 2758 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 05:32:38.733822 kubelet[2758]: I0909 05:32:38.733807 2758 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 9 05:32:38.734126 kubelet[2758]: I0909 05:32:38.734094 2758 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 05:32:38.734481 kubelet[2758]: I0909 05:32:38.734188 2758 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4452-0-0-n-de00512edc","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 05:32:38.734614 kubelet[2758]: I0909 05:32:38.734604 2758 topology_manager.go:138] "Creating topology manager with 
none policy" Sep 9 05:32:38.734664 kubelet[2758]: I0909 05:32:38.734658 2758 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 05:32:38.734833 kubelet[2758]: I0909 05:32:38.734824 2758 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:32:38.735002 kubelet[2758]: I0909 05:32:38.734991 2758 kubelet.go:446] "Attempting to sync node with API server" Sep 9 05:32:38.735100 kubelet[2758]: I0909 05:32:38.735091 2758 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 05:32:38.735164 kubelet[2758]: I0909 05:32:38.735157 2758 kubelet.go:352] "Adding apiserver pod source" Sep 9 05:32:38.735308 kubelet[2758]: I0909 05:32:38.735301 2758 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 05:32:38.742991 kubelet[2758]: I0909 05:32:38.742974 2758 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 05:32:38.743497 kubelet[2758]: I0909 05:32:38.743486 2758 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 05:32:38.743869 kubelet[2758]: I0909 05:32:38.743858 2758 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 05:32:38.744083 kubelet[2758]: I0909 05:32:38.744061 2758 server.go:1287] "Started kubelet" Sep 9 05:32:38.747620 kubelet[2758]: I0909 05:32:38.747592 2758 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 05:32:38.749513 kubelet[2758]: I0909 05:32:38.748966 2758 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 05:32:38.749729 kubelet[2758]: I0909 05:32:38.749716 2758 server.go:479] "Adding debug handlers to kubelet server" Sep 9 05:32:38.751231 kubelet[2758]: I0909 05:32:38.751168 2758 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 05:32:38.751371 kubelet[2758]: I0909 05:32:38.751346 2758 server.go:243] "Starting to serve the podresources API" 
endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 05:32:38.757248 kubelet[2758]: I0909 05:32:38.757219 2758 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 05:32:38.759901 kubelet[2758]: I0909 05:32:38.759871 2758 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 05:32:38.763947 kubelet[2758]: I0909 05:32:38.763693 2758 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 05:32:38.764896 kubelet[2758]: I0909 05:32:38.764878 2758 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 05:32:38.764973 kubelet[2758]: I0909 05:32:38.764961 2758 reconciler.go:26] "Reconciler: start to sync state" Sep 9 05:32:38.765317 kubelet[2758]: I0909 05:32:38.765243 2758 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 05:32:38.765317 kubelet[2758]: I0909 05:32:38.765266 2758 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 05:32:38.765422 kubelet[2758]: I0909 05:32:38.765280 2758 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 9 05:32:38.765565 kubelet[2758]: I0909 05:32:38.765476 2758 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 05:32:38.765565 kubelet[2758]: E0909 05:32:38.765512 2758 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 05:32:38.770140 kubelet[2758]: I0909 05:32:38.770116 2758 factory.go:221] Registration of the systemd container factory successfully Sep 9 05:32:38.770210 kubelet[2758]: I0909 05:32:38.770188 2758 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 05:32:38.773693 kubelet[2758]: E0909 05:32:38.773674 2758 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 05:32:38.774157 kubelet[2758]: I0909 05:32:38.774105 2758 factory.go:221] Registration of the containerd container factory successfully Sep 9 05:32:38.820812 kubelet[2758]: I0909 05:32:38.820788 2758 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 05:32:38.820812 kubelet[2758]: I0909 05:32:38.820804 2758 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 05:32:38.820812 kubelet[2758]: I0909 05:32:38.820820 2758 state_mem.go:36] "Initialized new in-memory state store" Sep 9 05:32:38.820985 kubelet[2758]: I0909 05:32:38.820937 2758 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 05:32:38.820985 kubelet[2758]: I0909 05:32:38.820945 2758 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 05:32:38.820985 kubelet[2758]: I0909 05:32:38.820959 2758 policy_none.go:49] "None policy: Start" Sep 9 05:32:38.820985 kubelet[2758]: I0909 05:32:38.820966 2758 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 05:32:38.820985 kubelet[2758]: I0909 05:32:38.820973 2758 
state_mem.go:35] "Initializing new in-memory state store" Sep 9 05:32:38.821102 kubelet[2758]: I0909 05:32:38.821068 2758 state_mem.go:75] "Updated machine memory state" Sep 9 05:32:38.824767 kubelet[2758]: I0909 05:32:38.824744 2758 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 05:32:38.824881 kubelet[2758]: I0909 05:32:38.824856 2758 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 05:32:38.824914 kubelet[2758]: I0909 05:32:38.824875 2758 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 05:32:38.825442 kubelet[2758]: I0909 05:32:38.825395 2758 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 05:32:38.826437 kubelet[2758]: E0909 05:32:38.826381 2758 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 05:32:38.866698 kubelet[2758]: I0909 05:32:38.866648 2758 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.869432 kubelet[2758]: I0909 05:32:38.869400 2758 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.869689 kubelet[2758]: I0909 05:32:38.869658 2758 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.873236 kubelet[2758]: E0909 05:32:38.873204 2758 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452-0-0-n-de00512edc\" already exists" pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.875977 kubelet[2758]: E0909 05:32:38.875950 2758 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4452-0-0-n-de00512edc\" already exists" 
pod="kube-system/kube-scheduler-ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.877153 kubelet[2758]: E0909 05:32:38.877103 2758 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" already exists" pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.935121 kubelet[2758]: I0909 05:32:38.935070 2758 kubelet_node_status.go:75] "Attempting to register node" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.944098 kubelet[2758]: I0909 05:32:38.944064 2758 kubelet_node_status.go:124] "Node was previously registered" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:38.944226 kubelet[2758]: I0909 05:32:38.944154 2758 kubelet_node_status.go:78] "Successfully registered node" node="ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066162 kubelet[2758]: I0909 05:32:39.066098 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ea2e3e2e3f0d27b731b4b6cda3d70855-ca-certs\") pod \"kube-apiserver-ci-4452-0-0-n-de00512edc\" (UID: \"ea2e3e2e3f0d27b731b4b6cda3d70855\") " pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066162 kubelet[2758]: I0909 05:32:39.066142 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ea2e3e2e3f0d27b731b4b6cda3d70855-k8s-certs\") pod \"kube-apiserver-ci-4452-0-0-n-de00512edc\" (UID: \"ea2e3e2e3f0d27b731b4b6cda3d70855\") " pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066333 kubelet[2758]: I0909 05:32:39.066180 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ea2e3e2e3f0d27b731b4b6cda3d70855-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4452-0-0-n-de00512edc\" (UID: \"ea2e3e2e3f0d27b731b4b6cda3d70855\") " 
pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066333 kubelet[2758]: I0909 05:32:39.066209 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-flexvolume-dir\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066333 kubelet[2758]: I0909 05:32:39.066237 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066333 kubelet[2758]: I0909 05:32:39.066261 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-ca-certs\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066333 kubelet[2758]: I0909 05:32:39.066281 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-k8s-certs\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066515 kubelet[2758]: I0909 05:32:39.066305 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/54be5da420ebbb69fe2357682547bd05-kubeconfig\") pod \"kube-controller-manager-ci-4452-0-0-n-de00512edc\" (UID: \"54be5da420ebbb69fe2357682547bd05\") " pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.066515 kubelet[2758]: I0909 05:32:39.066329 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/abf10bac82dabdf156ba0ebc027d4dc0-kubeconfig\") pod \"kube-scheduler-ci-4452-0-0-n-de00512edc\" (UID: \"abf10bac82dabdf156ba0ebc027d4dc0\") " pod="kube-system/kube-scheduler-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.740391 kubelet[2758]: I0909 05:32:39.740340 2758 apiserver.go:52] "Watching apiserver" Sep 9 05:32:39.765327 kubelet[2758]: I0909 05:32:39.765284 2758 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 05:32:39.805916 kubelet[2758]: I0909 05:32:39.805879 2758 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.823483 kubelet[2758]: E0909 05:32:39.823299 2758 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4452-0-0-n-de00512edc\" already exists" pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" Sep 9 05:32:39.860915 kubelet[2758]: I0909 05:32:39.860864 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4452-0-0-n-de00512edc" podStartSLOduration=2.860848028 podStartE2EDuration="2.860848028s" podCreationTimestamp="2025-09-09 05:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:32:39.851301827 +0000 UTC m=+1.194489353" watchObservedRunningTime="2025-09-09 05:32:39.860848028 +0000 UTC m=+1.204035544" Sep 9 05:32:39.886123 kubelet[2758]: I0909 05:32:39.885766 2758 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4452-0-0-n-de00512edc" podStartSLOduration=2.8857507399999998 podStartE2EDuration="2.88575074s" podCreationTimestamp="2025-09-09 05:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:32:39.861360637 +0000 UTC m=+1.204548153" watchObservedRunningTime="2025-09-09 05:32:39.88575074 +0000 UTC m=+1.228938255" Sep 9 05:32:39.886688 kubelet[2758]: I0909 05:32:39.886089 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4452-0-0-n-de00512edc" podStartSLOduration=2.886080566 podStartE2EDuration="2.886080566s" podCreationTimestamp="2025-09-09 05:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:32:39.885144576 +0000 UTC m=+1.228332091" watchObservedRunningTime="2025-09-09 05:32:39.886080566 +0000 UTC m=+1.229268082" Sep 9 05:32:44.437876 kubelet[2758]: I0909 05:32:44.437822 2758 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 05:32:44.438979 kubelet[2758]: I0909 05:32:44.438762 2758 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 05:32:44.439062 containerd[1581]: time="2025-09-09T05:32:44.438313560Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 05:32:44.644325 update_engine[1566]: I20250909 05:32:44.644213 1566 update_attempter.cc:509] Updating boot flags... Sep 9 05:32:45.279632 systemd[1]: Created slice kubepods-besteffort-pod162d4b59_cafa_45be_844c_18ab49d773e8.slice - libcontainer container kubepods-besteffort-pod162d4b59_cafa_45be_844c_18ab49d773e8.slice. 
Sep 9 05:32:45.306380 kubelet[2758]: I0909 05:32:45.306240 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/162d4b59-cafa-45be-844c-18ab49d773e8-kube-proxy\") pod \"kube-proxy-lphp7\" (UID: \"162d4b59-cafa-45be-844c-18ab49d773e8\") " pod="kube-system/kube-proxy-lphp7" Sep 9 05:32:45.306380 kubelet[2758]: I0909 05:32:45.306277 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/162d4b59-cafa-45be-844c-18ab49d773e8-lib-modules\") pod \"kube-proxy-lphp7\" (UID: \"162d4b59-cafa-45be-844c-18ab49d773e8\") " pod="kube-system/kube-proxy-lphp7" Sep 9 05:32:45.306380 kubelet[2758]: I0909 05:32:45.306302 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/162d4b59-cafa-45be-844c-18ab49d773e8-xtables-lock\") pod \"kube-proxy-lphp7\" (UID: \"162d4b59-cafa-45be-844c-18ab49d773e8\") " pod="kube-system/kube-proxy-lphp7" Sep 9 05:32:45.306380 kubelet[2758]: I0909 05:32:45.306317 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btsd4\" (UniqueName: \"kubernetes.io/projected/162d4b59-cafa-45be-844c-18ab49d773e8-kube-api-access-btsd4\") pod \"kube-proxy-lphp7\" (UID: \"162d4b59-cafa-45be-844c-18ab49d773e8\") " pod="kube-system/kube-proxy-lphp7" Sep 9 05:32:45.542673 systemd[1]: Created slice kubepods-besteffort-poda3878727_d80e_4b59_8b29_c5cbb3836976.slice - libcontainer container kubepods-besteffort-poda3878727_d80e_4b59_8b29_c5cbb3836976.slice. 
Sep 9 05:32:45.592852 containerd[1581]: time="2025-09-09T05:32:45.592778124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lphp7,Uid:162d4b59-cafa-45be-844c-18ab49d773e8,Namespace:kube-system,Attempt:0,}" Sep 9 05:32:45.609734 kubelet[2758]: I0909 05:32:45.608974 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a3878727-d80e-4b59-8b29-c5cbb3836976-var-lib-calico\") pod \"tigera-operator-755d956888-6dlvc\" (UID: \"a3878727-d80e-4b59-8b29-c5cbb3836976\") " pod="tigera-operator/tigera-operator-755d956888-6dlvc" Sep 9 05:32:45.609734 kubelet[2758]: I0909 05:32:45.609423 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsp9v\" (UniqueName: \"kubernetes.io/projected/a3878727-d80e-4b59-8b29-c5cbb3836976-kube-api-access-gsp9v\") pod \"tigera-operator-755d956888-6dlvc\" (UID: \"a3878727-d80e-4b59-8b29-c5cbb3836976\") " pod="tigera-operator/tigera-operator-755d956888-6dlvc" Sep 9 05:32:45.619535 containerd[1581]: time="2025-09-09T05:32:45.619500060Z" level=info msg="connecting to shim 76aa37287fef3ec679ce9b2e59fb907397bfe81aa9871161af404d72d6b7ff13" address="unix:///run/containerd/s/ef7ba84624185adce2525591b65a2889576ac5a4c88cc331bfc1fadf5dd60179" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:32:45.650299 systemd[1]: Started cri-containerd-76aa37287fef3ec679ce9b2e59fb907397bfe81aa9871161af404d72d6b7ff13.scope - libcontainer container 76aa37287fef3ec679ce9b2e59fb907397bfe81aa9871161af404d72d6b7ff13. 
Sep 9 05:32:45.677320 containerd[1581]: time="2025-09-09T05:32:45.677280504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-lphp7,Uid:162d4b59-cafa-45be-844c-18ab49d773e8,Namespace:kube-system,Attempt:0,} returns sandbox id \"76aa37287fef3ec679ce9b2e59fb907397bfe81aa9871161af404d72d6b7ff13\"" Sep 9 05:32:45.682204 containerd[1581]: time="2025-09-09T05:32:45.682161757Z" level=info msg="CreateContainer within sandbox \"76aa37287fef3ec679ce9b2e59fb907397bfe81aa9871161af404d72d6b7ff13\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 05:32:45.694625 containerd[1581]: time="2025-09-09T05:32:45.694575380Z" level=info msg="Container abea4af063c10bf49e677501c96dd371ecc777573322936e3a4c39cc31b91fd3: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:32:45.697174 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3242591477.mount: Deactivated successfully. Sep 9 05:32:45.703660 containerd[1581]: time="2025-09-09T05:32:45.703632360Z" level=info msg="CreateContainer within sandbox \"76aa37287fef3ec679ce9b2e59fb907397bfe81aa9871161af404d72d6b7ff13\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"abea4af063c10bf49e677501c96dd371ecc777573322936e3a4c39cc31b91fd3\"" Sep 9 05:32:45.704710 containerd[1581]: time="2025-09-09T05:32:45.704676532Z" level=info msg="StartContainer for \"abea4af063c10bf49e677501c96dd371ecc777573322936e3a4c39cc31b91fd3\"" Sep 9 05:32:45.705906 containerd[1581]: time="2025-09-09T05:32:45.705881282Z" level=info msg="connecting to shim abea4af063c10bf49e677501c96dd371ecc777573322936e3a4c39cc31b91fd3" address="unix:///run/containerd/s/ef7ba84624185adce2525591b65a2889576ac5a4c88cc331bfc1fadf5dd60179" protocol=ttrpc version=3 Sep 9 05:32:45.734267 systemd[1]: Started cri-containerd-abea4af063c10bf49e677501c96dd371ecc777573322936e3a4c39cc31b91fd3.scope - libcontainer container abea4af063c10bf49e677501c96dd371ecc777573322936e3a4c39cc31b91fd3. 
Sep 9 05:32:45.778157 containerd[1581]: time="2025-09-09T05:32:45.778099250Z" level=info msg="StartContainer for \"abea4af063c10bf49e677501c96dd371ecc777573322936e3a4c39cc31b91fd3\" returns successfully" Sep 9 05:32:45.834120 kubelet[2758]: I0909 05:32:45.833662 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-lphp7" podStartSLOduration=0.833644547 podStartE2EDuration="833.644547ms" podCreationTimestamp="2025-09-09 05:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:32:45.833371316 +0000 UTC m=+7.176558832" watchObservedRunningTime="2025-09-09 05:32:45.833644547 +0000 UTC m=+7.176832063" Sep 9 05:32:45.845593 containerd[1581]: time="2025-09-09T05:32:45.845523417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-6dlvc,Uid:a3878727-d80e-4b59-8b29-c5cbb3836976,Namespace:tigera-operator,Attempt:0,}" Sep 9 05:32:45.869625 containerd[1581]: time="2025-09-09T05:32:45.869560859Z" level=info msg="connecting to shim f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0" address="unix:///run/containerd/s/ce089228e9cea4860f72bac773f0eb35f49c9e7870814f4c9d8ca14675dddc96" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:32:45.896246 systemd[1]: Started cri-containerd-f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0.scope - libcontainer container f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0. 
Sep 9 05:32:45.964909 containerd[1581]: time="2025-09-09T05:32:45.964816356Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-6dlvc,Uid:a3878727-d80e-4b59-8b29-c5cbb3836976,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0\"" Sep 9 05:32:45.971795 containerd[1581]: time="2025-09-09T05:32:45.971727784Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 05:32:47.945900 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2998506621.mount: Deactivated successfully. Sep 9 05:32:48.460220 containerd[1581]: time="2025-09-09T05:32:48.460166157Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:48.461228 containerd[1581]: time="2025-09-09T05:32:48.461072495Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609" Sep 9 05:32:48.461934 containerd[1581]: time="2025-09-09T05:32:48.461912346Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:48.463629 containerd[1581]: time="2025-09-09T05:32:48.463590857Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:32:48.464111 containerd[1581]: time="2025-09-09T05:32:48.464093177Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.492175195s" Sep 9 05:32:48.464177 
containerd[1581]: time="2025-09-09T05:32:48.464166505Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\"" Sep 9 05:32:48.466964 containerd[1581]: time="2025-09-09T05:32:48.466923385Z" level=info msg="CreateContainer within sandbox \"f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 05:32:48.473707 containerd[1581]: time="2025-09-09T05:32:48.472588931Z" level=info msg="Container 585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:32:48.482553 containerd[1581]: time="2025-09-09T05:32:48.482528174Z" level=info msg="CreateContainer within sandbox \"f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8\"" Sep 9 05:32:48.484041 containerd[1581]: time="2025-09-09T05:32:48.483997128Z" level=info msg="StartContainer for \"585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8\"" Sep 9 05:32:48.484577 containerd[1581]: time="2025-09-09T05:32:48.484552901Z" level=info msg="connecting to shim 585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8" address="unix:///run/containerd/s/ce089228e9cea4860f72bac773f0eb35f49c9e7870814f4c9d8ca14675dddc96" protocol=ttrpc version=3 Sep 9 05:32:48.504169 systemd[1]: Started cri-containerd-585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8.scope - libcontainer container 585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8. 
Sep 9 05:32:48.528954 containerd[1581]: time="2025-09-09T05:32:48.528883687Z" level=info msg="StartContainer for \"585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8\" returns successfully" Sep 9 05:32:48.853315 kubelet[2758]: I0909 05:32:48.852332 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-6dlvc" podStartSLOduration=1.356349131 podStartE2EDuration="3.85231631s" podCreationTimestamp="2025-09-09 05:32:45 +0000 UTC" firstStartedPulling="2025-09-09 05:32:45.968870903 +0000 UTC m=+7.312058409" lastFinishedPulling="2025-09-09 05:32:48.464838072 +0000 UTC m=+9.808025588" observedRunningTime="2025-09-09 05:32:48.851952502 +0000 UTC m=+10.195140039" watchObservedRunningTime="2025-09-09 05:32:48.85231631 +0000 UTC m=+10.195503836" Sep 9 05:32:54.054193 sudo[1831]: pam_unix(sudo:session): session closed for user root Sep 9 05:32:54.211095 sshd[1830]: Connection closed by 147.75.109.163 port 52146 Sep 9 05:32:54.212418 sshd-session[1827]: pam_unix(sshd:session): session closed for user core Sep 9 05:32:54.216385 systemd[1]: sshd@6-65.109.237.121:22-147.75.109.163:52146.service: Deactivated successfully. Sep 9 05:32:54.218505 systemd[1]: session-7.scope: Deactivated successfully. Sep 9 05:32:54.218765 systemd[1]: session-7.scope: Consumed 4.591s CPU time, 155.9M memory peak. Sep 9 05:32:54.220847 systemd-logind[1561]: Session 7 logged out. Waiting for processes to exit. Sep 9 05:32:54.222628 systemd-logind[1561]: Removed session 7. Sep 9 05:32:57.153190 systemd[1]: Created slice kubepods-besteffort-pod0b5d05be_be0f_4eb0_b845_3fdfd9e30cba.slice - libcontainer container kubepods-besteffort-pod0b5d05be_be0f_4eb0_b845_3fdfd9e30cba.slice. 
Sep 9 05:32:57.182838 kubelet[2758]: I0909 05:32:57.182789 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0b5d05be-be0f-4eb0-b845-3fdfd9e30cba-tigera-ca-bundle\") pod \"calico-typha-8d9bbc9f7-ldk9m\" (UID: \"0b5d05be-be0f-4eb0-b845-3fdfd9e30cba\") " pod="calico-system/calico-typha-8d9bbc9f7-ldk9m" Sep 9 05:32:57.182838 kubelet[2758]: I0909 05:32:57.182832 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/0b5d05be-be0f-4eb0-b845-3fdfd9e30cba-typha-certs\") pod \"calico-typha-8d9bbc9f7-ldk9m\" (UID: \"0b5d05be-be0f-4eb0-b845-3fdfd9e30cba\") " pod="calico-system/calico-typha-8d9bbc9f7-ldk9m" Sep 9 05:32:57.183248 kubelet[2758]: I0909 05:32:57.182853 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c2sv\" (UniqueName: \"kubernetes.io/projected/0b5d05be-be0f-4eb0-b845-3fdfd9e30cba-kube-api-access-4c2sv\") pod \"calico-typha-8d9bbc9f7-ldk9m\" (UID: \"0b5d05be-be0f-4eb0-b845-3fdfd9e30cba\") " pod="calico-system/calico-typha-8d9bbc9f7-ldk9m" Sep 9 05:32:57.464219 containerd[1581]: time="2025-09-09T05:32:57.464095533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8d9bbc9f7-ldk9m,Uid:0b5d05be-be0f-4eb0-b845-3fdfd9e30cba,Namespace:calico-system,Attempt:0,}" Sep 9 05:32:57.492492 containerd[1581]: time="2025-09-09T05:32:57.492032719Z" level=info msg="connecting to shim ef2be9aff683a9893bc9ee78977be630e4ccffd341c7949c1a55a405e0cb7bc9" address="unix:///run/containerd/s/3d44f451a7b24c066bb1d30469612aa0889d08d34cea9ef41ded81a065931783" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:32:57.528364 systemd[1]: Started cri-containerd-ef2be9aff683a9893bc9ee78977be630e4ccffd341c7949c1a55a405e0cb7bc9.scope - libcontainer container 
ef2be9aff683a9893bc9ee78977be630e4ccffd341c7949c1a55a405e0cb7bc9. Sep 9 05:32:57.547152 systemd[1]: Created slice kubepods-besteffort-pode3c6de88_59d9_4aab_ac8d_d1850c925705.slice - libcontainer container kubepods-besteffort-pode3c6de88_59d9_4aab_ac8d_d1850c925705.slice. Sep 9 05:32:57.586279 kubelet[2758]: I0909 05:32:57.586225 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-lib-modules\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586279 kubelet[2758]: I0909 05:32:57.586271 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-xtables-lock\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586279 kubelet[2758]: I0909 05:32:57.586288 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-policysync\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586484 kubelet[2758]: I0909 05:32:57.586301 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-cni-log-dir\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586484 kubelet[2758]: I0909 05:32:57.586316 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-flexvol-driver-host\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586484 kubelet[2758]: I0909 05:32:57.586331 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e3c6de88-59d9-4aab-ac8d-d1850c925705-node-certs\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586484 kubelet[2758]: I0909 05:32:57.586342 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-var-run-calico\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586484 kubelet[2758]: I0909 05:32:57.586355 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-cni-bin-dir\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586587 kubelet[2758]: I0909 05:32:57.586367 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-var-lib-calico\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586587 kubelet[2758]: I0909 05:32:57.586391 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw2kh\" (UniqueName: 
\"kubernetes.io/projected/e3c6de88-59d9-4aab-ac8d-d1850c925705-kube-api-access-dw2kh\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586587 kubelet[2758]: I0909 05:32:57.586404 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e3c6de88-59d9-4aab-ac8d-d1850c925705-cni-net-dir\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.586587 kubelet[2758]: I0909 05:32:57.586416 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3c6de88-59d9-4aab-ac8d-d1850c925705-tigera-ca-bundle\") pod \"calico-node-wxpcs\" (UID: \"e3c6de88-59d9-4aab-ac8d-d1850c925705\") " pod="calico-system/calico-node-wxpcs" Sep 9 05:32:57.608973 containerd[1581]: time="2025-09-09T05:32:57.608918209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-8d9bbc9f7-ldk9m,Uid:0b5d05be-be0f-4eb0-b845-3fdfd9e30cba,Namespace:calico-system,Attempt:0,} returns sandbox id \"ef2be9aff683a9893bc9ee78977be630e4ccffd341c7949c1a55a405e0cb7bc9\"" Sep 9 05:32:57.610965 containerd[1581]: time="2025-09-09T05:32:57.610931515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 05:32:57.699383 kubelet[2758]: E0909 05:32:57.698129 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.699383 kubelet[2758]: W0909 05:32:57.698161 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.703365 kubelet[2758]: E0909 05:32:57.703129 2758 plugins.go:695] "Error dynamically 
probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.705823 kubelet[2758]: E0909 05:32:57.705448 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.705920 kubelet[2758]: W0909 05:32:57.705835 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.705920 kubelet[2758]: E0909 05:32:57.705863 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.832485 kubelet[2758]: E0909 05:32:57.831802 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qrqmt" podUID="c1606d3b-7ca4-49f1-8973-eb24dc83693b" Sep 9 05:32:57.853127 containerd[1581]: time="2025-09-09T05:32:57.852869946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wxpcs,Uid:e3c6de88-59d9-4aab-ac8d-d1850c925705,Namespace:calico-system,Attempt:0,}" Sep 9 05:32:57.872261 containerd[1581]: time="2025-09-09T05:32:57.872228778Z" level=info msg="connecting to shim 64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894" address="unix:///run/containerd/s/949320806d6b47ace0ad1fd02480b30b7ae30e7d890f69cd7333156a8512f7ac" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:32:57.876716 kubelet[2758]: E0909 05:32:57.876678 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.876893 kubelet[2758]: W0909 05:32:57.876827 2758 
driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.876893 kubelet[2758]: E0909 05:32:57.876852 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.877155 kubelet[2758]: E0909 05:32:57.877139 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.877232 kubelet[2758]: W0909 05:32:57.877211 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.877289 kubelet[2758]: E0909 05:32:57.877280 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.877537 kubelet[2758]: E0909 05:32:57.877481 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.877537 kubelet[2758]: W0909 05:32:57.877489 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.877537 kubelet[2758]: E0909 05:32:57.877498 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.879043 kubelet[2758]: E0909 05:32:57.878961 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.879043 kubelet[2758]: W0909 05:32:57.878971 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.879043 kubelet[2758]: E0909 05:32:57.878979 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.879288 kubelet[2758]: E0909 05:32:57.879278 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.880319 kubelet[2758]: W0909 05:32:57.880215 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.880319 kubelet[2758]: E0909 05:32:57.880230 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.880547 kubelet[2758]: E0909 05:32:57.880442 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.880547 kubelet[2758]: W0909 05:32:57.880451 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.880547 kubelet[2758]: E0909 05:32:57.880459 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.880833 kubelet[2758]: E0909 05:32:57.880706 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.880833 kubelet[2758]: W0909 05:32:57.880715 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.880833 kubelet[2758]: E0909 05:32:57.880723 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.881062 kubelet[2758]: E0909 05:32:57.880965 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.881062 kubelet[2758]: W0909 05:32:57.880974 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.881062 kubelet[2758]: E0909 05:32:57.880981 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.881435 kubelet[2758]: E0909 05:32:57.881341 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.881435 kubelet[2758]: W0909 05:32:57.881352 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.881435 kubelet[2758]: E0909 05:32:57.881360 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.881749 kubelet[2758]: E0909 05:32:57.881548 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.881749 kubelet[2758]: W0909 05:32:57.881557 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.881749 kubelet[2758]: E0909 05:32:57.881564 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.881749 kubelet[2758]: E0909 05:32:57.881708 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.881749 kubelet[2758]: W0909 05:32:57.881714 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.881749 kubelet[2758]: E0909 05:32:57.881720 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.882014 kubelet[2758]: E0909 05:32:57.881956 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.882014 kubelet[2758]: W0909 05:32:57.881964 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.882014 kubelet[2758]: E0909 05:32:57.881971 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.882269 kubelet[2758]: E0909 05:32:57.882231 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.882269 kubelet[2758]: W0909 05:32:57.882239 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.882269 kubelet[2758]: E0909 05:32:57.882245 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.882496 kubelet[2758]: E0909 05:32:57.882455 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.882496 kubelet[2758]: W0909 05:32:57.882464 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.882496 kubelet[2758]: E0909 05:32:57.882471 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.882764 kubelet[2758]: E0909 05:32:57.882690 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.882764 kubelet[2758]: W0909 05:32:57.882699 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.882764 kubelet[2758]: E0909 05:32:57.882709 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.882970 kubelet[2758]: E0909 05:32:57.882918 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.882970 kubelet[2758]: W0909 05:32:57.882926 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.882970 kubelet[2758]: E0909 05:32:57.882933 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.883252 kubelet[2758]: E0909 05:32:57.883177 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.883252 kubelet[2758]: W0909 05:32:57.883185 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.883252 kubelet[2758]: E0909 05:32:57.883192 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.883460 kubelet[2758]: E0909 05:32:57.883420 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.883460 kubelet[2758]: W0909 05:32:57.883428 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.883460 kubelet[2758]: E0909 05:32:57.883435 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.883668 kubelet[2758]: E0909 05:32:57.883626 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.883668 kubelet[2758]: W0909 05:32:57.883634 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.883668 kubelet[2758]: E0909 05:32:57.883641 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.883927 kubelet[2758]: E0909 05:32:57.883846 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.883927 kubelet[2758]: W0909 05:32:57.883857 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.883927 kubelet[2758]: E0909 05:32:57.883867 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.889208 kubelet[2758]: E0909 05:32:57.889144 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.889208 kubelet[2758]: W0909 05:32:57.889185 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.889208 kubelet[2758]: E0909 05:32:57.889206 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.889296 kubelet[2758]: I0909 05:32:57.889237 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c1606d3b-7ca4-49f1-8973-eb24dc83693b-kubelet-dir\") pod \"csi-node-driver-qrqmt\" (UID: \"c1606d3b-7ca4-49f1-8973-eb24dc83693b\") " pod="calico-system/csi-node-driver-qrqmt" Sep 9 05:32:57.889724 kubelet[2758]: E0909 05:32:57.889712 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.890032 kubelet[2758]: W0909 05:32:57.889984 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.890032 kubelet[2758]: E0909 05:32:57.890002 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.890333 kubelet[2758]: I0909 05:32:57.890287 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c1606d3b-7ca4-49f1-8973-eb24dc83693b-varrun\") pod \"csi-node-driver-qrqmt\" (UID: \"c1606d3b-7ca4-49f1-8973-eb24dc83693b\") " pod="calico-system/csi-node-driver-qrqmt" Sep 9 05:32:57.890538 kubelet[2758]: E0909 05:32:57.890529 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.890678 kubelet[2758]: W0909 05:32:57.890584 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.890678 kubelet[2758]: E0909 05:32:57.890638 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.890979 kubelet[2758]: E0909 05:32:57.890955 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.891106 kubelet[2758]: W0909 05:32:57.890964 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.891106 kubelet[2758]: E0909 05:32:57.891059 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.891480 kubelet[2758]: E0909 05:32:57.891384 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.891480 kubelet[2758]: W0909 05:32:57.891393 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.891480 kubelet[2758]: E0909 05:32:57.891401 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.892246 kubelet[2758]: I0909 05:32:57.891417 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c1606d3b-7ca4-49f1-8973-eb24dc83693b-registration-dir\") pod \"csi-node-driver-qrqmt\" (UID: \"c1606d3b-7ca4-49f1-8973-eb24dc83693b\") " pod="calico-system/csi-node-driver-qrqmt" Sep 9 05:32:57.892246 kubelet[2758]: E0909 05:32:57.892127 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.892246 kubelet[2758]: W0909 05:32:57.892135 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.892246 kubelet[2758]: E0909 05:32:57.892151 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.892472 kubelet[2758]: E0909 05:32:57.892409 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.892472 kubelet[2758]: W0909 05:32:57.892418 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.892680 kubelet[2758]: E0909 05:32:57.892628 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.892680 kubelet[2758]: W0909 05:32:57.892638 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.892680 kubelet[2758]: E0909 05:32:57.892647 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.892680 kubelet[2758]: I0909 05:32:57.892660 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c1606d3b-7ca4-49f1-8973-eb24dc83693b-socket-dir\") pod \"csi-node-driver-qrqmt\" (UID: \"c1606d3b-7ca4-49f1-8973-eb24dc83693b\") " pod="calico-system/csi-node-driver-qrqmt" Sep 9 05:32:57.892902 kubelet[2758]: E0909 05:32:57.892432 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.893157 systemd[1]: Started cri-containerd-64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894.scope - libcontainer container 64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894. 
Sep 9 05:32:57.893585 kubelet[2758]: E0909 05:32:57.893522 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.893585 kubelet[2758]: W0909 05:32:57.893533 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.893585 kubelet[2758]: E0909 05:32:57.893547 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.894048 kubelet[2758]: E0909 05:32:57.894012 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.894127 kubelet[2758]: W0909 05:32:57.894108 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.894294 kubelet[2758]: E0909 05:32:57.894165 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.894415 kubelet[2758]: E0909 05:32:57.894395 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.894415 kubelet[2758]: W0909 05:32:57.894405 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.894536 kubelet[2758]: E0909 05:32:57.894503 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.894731 kubelet[2758]: E0909 05:32:57.894712 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.894731 kubelet[2758]: W0909 05:32:57.894721 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.894876 kubelet[2758]: E0909 05:32:57.894855 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.895043 kubelet[2758]: E0909 05:32:57.895011 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.895111 kubelet[2758]: W0909 05:32:57.895102 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.895154 kubelet[2758]: E0909 05:32:57.895147 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.895235 kubelet[2758]: I0909 05:32:57.895220 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpsq\" (UniqueName: \"kubernetes.io/projected/c1606d3b-7ca4-49f1-8973-eb24dc83693b-kube-api-access-gwpsq\") pod \"csi-node-driver-qrqmt\" (UID: \"c1606d3b-7ca4-49f1-8973-eb24dc83693b\") " pod="calico-system/csi-node-driver-qrqmt" Sep 9 05:32:57.895511 kubelet[2758]: E0909 05:32:57.895501 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.895585 kubelet[2758]: W0909 05:32:57.895577 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.895663 kubelet[2758]: E0909 05:32:57.895654 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.895941 kubelet[2758]: E0909 05:32:57.895932 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.896075 kubelet[2758]: W0909 05:32:57.896008 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.896075 kubelet[2758]: E0909 05:32:57.896057 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.918843 containerd[1581]: time="2025-09-09T05:32:57.918765485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wxpcs,Uid:e3c6de88-59d9-4aab-ac8d-d1850c925705,Namespace:calico-system,Attempt:0,} returns sandbox id \"64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894\"" Sep 9 05:32:57.996350 kubelet[2758]: E0909 05:32:57.996229 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.996837 kubelet[2758]: W0909 05:32:57.996441 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.996837 kubelet[2758]: E0909 05:32:57.996465 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.996837 kubelet[2758]: E0909 05:32:57.996768 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.996837 kubelet[2758]: W0909 05:32:57.996781 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.996837 kubelet[2758]: E0909 05:32:57.996795 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.997269 kubelet[2758]: E0909 05:32:57.997122 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.997269 kubelet[2758]: W0909 05:32:57.997132 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.997563 kubelet[2758]: E0909 05:32:57.997541 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.997839 kubelet[2758]: E0909 05:32:57.997821 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.997903 kubelet[2758]: W0909 05:32:57.997860 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.998045 kubelet[2758]: E0909 05:32:57.997957 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.998544 kubelet[2758]: E0909 05:32:57.998441 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.998668 kubelet[2758]: W0909 05:32:57.998653 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.998828 kubelet[2758]: E0909 05:32:57.998720 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.999008 kubelet[2758]: E0909 05:32:57.998979 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.999008 kubelet[2758]: W0909 05:32:57.998995 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.999175 kubelet[2758]: E0909 05:32:57.999014 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:57.999388 kubelet[2758]: E0909 05:32:57.999371 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.999388 kubelet[2758]: W0909 05:32:57.999386 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.999567 kubelet[2758]: E0909 05:32:57.999410 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:57.999897 kubelet[2758]: E0909 05:32:57.999880 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:57.999897 kubelet[2758]: W0909 05:32:57.999893 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:57.999966 kubelet[2758]: E0909 05:32:57.999917 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.000716 kubelet[2758]: E0909 05:32:58.000449 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.000716 kubelet[2758]: W0909 05:32:58.000464 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.000716 kubelet[2758]: E0909 05:32:58.000494 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.000808 kubelet[2758]: E0909 05:32:58.000742 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.000808 kubelet[2758]: W0909 05:32:58.000750 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.001212 kubelet[2758]: E0909 05:32:58.000959 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.001212 kubelet[2758]: W0909 05:32:58.000974 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.001212 kubelet[2758]: E0909 05:32:58.001007 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.001212 kubelet[2758]: E0909 05:32:58.001091 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.001727 kubelet[2758]: E0909 05:32:58.001241 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.001727 kubelet[2758]: W0909 05:32:58.001262 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.001727 kubelet[2758]: E0909 05:32:58.001323 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.001727 kubelet[2758]: E0909 05:32:58.001548 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.001727 kubelet[2758]: W0909 05:32:58.001557 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.001727 kubelet[2758]: E0909 05:32:58.001604 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.002274 kubelet[2758]: E0909 05:32:58.001792 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.002274 kubelet[2758]: W0909 05:32:58.001826 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.002274 kubelet[2758]: E0909 05:32:58.001836 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.002274 kubelet[2758]: E0909 05:32:58.002162 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.002274 kubelet[2758]: W0909 05:32:58.002170 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.002372 kubelet[2758]: E0909 05:32:58.002351 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.002741 kubelet[2758]: E0909 05:32:58.002439 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.002741 kubelet[2758]: W0909 05:32:58.002448 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.002741 kubelet[2758]: E0909 05:32:58.002607 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.002741 kubelet[2758]: E0909 05:32:58.002686 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.002741 kubelet[2758]: W0909 05:32:58.002694 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.003360 kubelet[2758]: E0909 05:32:58.002893 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.003360 kubelet[2758]: E0909 05:32:58.002941 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.003360 kubelet[2758]: W0909 05:32:58.002948 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.003360 kubelet[2758]: E0909 05:32:58.003322 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.004096 kubelet[2758]: E0909 05:32:58.004078 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.004096 kubelet[2758]: W0909 05:32:58.004093 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.004278 kubelet[2758]: E0909 05:32:58.004117 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.004301 kubelet[2758]: E0909 05:32:58.004287 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.004322 kubelet[2758]: W0909 05:32:58.004300 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.004584 kubelet[2758]: E0909 05:32:58.004397 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.004584 kubelet[2758]: E0909 05:32:58.004462 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.004584 kubelet[2758]: W0909 05:32:58.004469 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.004584 kubelet[2758]: E0909 05:32:58.004499 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.004584 kubelet[2758]: E0909 05:32:58.004582 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.004994 kubelet[2758]: W0909 05:32:58.004589 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.004994 kubelet[2758]: E0909 05:32:58.004691 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.004994 kubelet[2758]: E0909 05:32:58.004799 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.004994 kubelet[2758]: W0909 05:32:58.004807 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.004994 kubelet[2758]: E0909 05:32:58.004827 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.005325 kubelet[2758]: E0909 05:32:58.005065 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.005325 kubelet[2758]: W0909 05:32:58.005077 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.005325 kubelet[2758]: E0909 05:32:58.005102 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:58.005520 kubelet[2758]: E0909 05:32:58.005502 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.005556 kubelet[2758]: W0909 05:32:58.005520 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.005556 kubelet[2758]: E0909 05:32:58.005533 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:32:58.025534 kubelet[2758]: E0909 05:32:58.025496 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:32:58.025534 kubelet[2758]: W0909 05:32:58.025518 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:32:58.025534 kubelet[2758]: E0909 05:32:58.025533 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:32:59.363589 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1052460508.mount: Deactivated successfully. Sep 9 05:32:59.766643 kubelet[2758]: E0909 05:32:59.766386 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qrqmt" podUID="c1606d3b-7ca4-49f1-8973-eb24dc83693b" Sep 9 05:33:00.403675 containerd[1581]: time="2025-09-09T05:33:00.403621664Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:00.404829 containerd[1581]: time="2025-09-09T05:33:00.404794858Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389" Sep 9 05:33:00.406041 containerd[1581]: time="2025-09-09T05:33:00.405640905Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:00.408952 containerd[1581]: time="2025-09-09T05:33:00.408933462Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:00.410456 containerd[1581]: time="2025-09-09T05:33:00.410437266Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.799474253s" Sep 9 05:33:00.410534 containerd[1581]: time="2025-09-09T05:33:00.410522791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\"" Sep 9 05:33:00.413066 containerd[1581]: time="2025-09-09T05:33:00.413053502Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 05:33:00.428204 containerd[1581]: time="2025-09-09T05:33:00.428161510Z" level=info msg="CreateContainer within sandbox \"ef2be9aff683a9893bc9ee78977be630e4ccffd341c7949c1a55a405e0cb7bc9\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 05:33:00.435809 containerd[1581]: time="2025-09-09T05:33:00.435745980Z" level=info msg="Container d94dd1bc2ddcebabc01814d89729444ebef25ed0986fa338cb781c71611284d9: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:00.445000 containerd[1581]: time="2025-09-09T05:33:00.444945346Z" level=info msg="CreateContainer within sandbox \"ef2be9aff683a9893bc9ee78977be630e4ccffd341c7949c1a55a405e0cb7bc9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d94dd1bc2ddcebabc01814d89729444ebef25ed0986fa338cb781c71611284d9\"" Sep 9 05:33:00.446156 containerd[1581]: time="2025-09-09T05:33:00.446133947Z" level=info msg="StartContainer for 
\"d94dd1bc2ddcebabc01814d89729444ebef25ed0986fa338cb781c71611284d9\"" Sep 9 05:33:00.447036 containerd[1581]: time="2025-09-09T05:33:00.446892085Z" level=info msg="connecting to shim d94dd1bc2ddcebabc01814d89729444ebef25ed0986fa338cb781c71611284d9" address="unix:///run/containerd/s/3d44f451a7b24c066bb1d30469612aa0889d08d34cea9ef41ded81a065931783" protocol=ttrpc version=3 Sep 9 05:33:00.468245 systemd[1]: Started cri-containerd-d94dd1bc2ddcebabc01814d89729444ebef25ed0986fa338cb781c71611284d9.scope - libcontainer container d94dd1bc2ddcebabc01814d89729444ebef25ed0986fa338cb781c71611284d9. Sep 9 05:33:00.522684 containerd[1581]: time="2025-09-09T05:33:00.522640385Z" level=info msg="StartContainer for \"d94dd1bc2ddcebabc01814d89729444ebef25ed0986fa338cb781c71611284d9\" returns successfully" Sep 9 05:33:00.907314 kubelet[2758]: E0909 05:33:00.907257 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.907314 kubelet[2758]: W0909 05:33:00.907289 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.908421 kubelet[2758]: E0909 05:33:00.908365 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.908615 kubelet[2758]: E0909 05:33:00.908577 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.908615 kubelet[2758]: W0909 05:33:00.908595 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.908615 kubelet[2758]: E0909 05:33:00.908607 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.908926 kubelet[2758]: E0909 05:33:00.908731 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.908926 kubelet[2758]: W0909 05:33:00.908738 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.908926 kubelet[2758]: E0909 05:33:00.908747 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.908926 kubelet[2758]: E0909 05:33:00.908903 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.908926 kubelet[2758]: W0909 05:33:00.908910 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.908926 kubelet[2758]: E0909 05:33:00.908918 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.909505 kubelet[2758]: E0909 05:33:00.909117 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.909505 kubelet[2758]: W0909 05:33:00.909127 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.909505 kubelet[2758]: E0909 05:33:00.909156 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.909505 kubelet[2758]: E0909 05:33:00.909303 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.909505 kubelet[2758]: W0909 05:33:00.909312 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.909505 kubelet[2758]: E0909 05:33:00.909320 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.909505 kubelet[2758]: E0909 05:33:00.909461 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.909505 kubelet[2758]: W0909 05:33:00.909469 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.909505 kubelet[2758]: E0909 05:33:00.909476 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.910154 kubelet[2758]: E0909 05:33:00.909585 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.910154 kubelet[2758]: W0909 05:33:00.909592 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.910154 kubelet[2758]: E0909 05:33:00.909599 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.910154 kubelet[2758]: E0909 05:33:00.909720 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.910154 kubelet[2758]: W0909 05:33:00.909728 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.910154 kubelet[2758]: E0909 05:33:00.909735 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.910154 kubelet[2758]: E0909 05:33:00.909844 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.910154 kubelet[2758]: W0909 05:33:00.909851 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.910154 kubelet[2758]: E0909 05:33:00.909858 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.910154 kubelet[2758]: E0909 05:33:00.910065 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.910902 kubelet[2758]: W0909 05:33:00.910074 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.910902 kubelet[2758]: E0909 05:33:00.910083 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.910902 kubelet[2758]: E0909 05:33:00.910300 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.910902 kubelet[2758]: W0909 05:33:00.910310 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.910902 kubelet[2758]: E0909 05:33:00.910318 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.910902 kubelet[2758]: E0909 05:33:00.910479 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.910902 kubelet[2758]: W0909 05:33:00.910487 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.910902 kubelet[2758]: E0909 05:33:00.910497 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.910902 kubelet[2758]: E0909 05:33:00.910611 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.910902 kubelet[2758]: W0909 05:33:00.910618 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.911284 kubelet[2758]: E0909 05:33:00.910625 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.911284 kubelet[2758]: E0909 05:33:00.910745 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.911284 kubelet[2758]: W0909 05:33:00.910753 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.911284 kubelet[2758]: E0909 05:33:00.910760 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.921461 kubelet[2758]: E0909 05:33:00.921417 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.921461 kubelet[2758]: W0909 05:33:00.921446 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.921461 kubelet[2758]: E0909 05:33:00.921471 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.921979 kubelet[2758]: E0909 05:33:00.921823 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.921979 kubelet[2758]: W0909 05:33:00.921837 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.921979 kubelet[2758]: E0909 05:33:00.921869 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.922594 kubelet[2758]: E0909 05:33:00.922068 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.922594 kubelet[2758]: W0909 05:33:00.922082 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.922594 kubelet[2758]: E0909 05:33:00.922104 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.922594 kubelet[2758]: E0909 05:33:00.922334 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.922594 kubelet[2758]: W0909 05:33:00.922346 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.922594 kubelet[2758]: E0909 05:33:00.922365 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.922594 kubelet[2758]: E0909 05:33:00.922548 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.922594 kubelet[2758]: W0909 05:33:00.922560 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.922594 kubelet[2758]: E0909 05:33:00.922568 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.923226 kubelet[2758]: E0909 05:33:00.922695 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.923226 kubelet[2758]: W0909 05:33:00.922702 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.923226 kubelet[2758]: E0909 05:33:00.922709 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.923226 kubelet[2758]: E0909 05:33:00.922991 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.923226 kubelet[2758]: W0909 05:33:00.923001 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.923226 kubelet[2758]: E0909 05:33:00.923049 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.923451 kubelet[2758]: E0909 05:33:00.923426 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.923451 kubelet[2758]: W0909 05:33:00.923440 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.923584 kubelet[2758]: E0909 05:33:00.923501 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.923584 kubelet[2758]: E0909 05:33:00.923579 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.923698 kubelet[2758]: W0909 05:33:00.923587 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.923698 kubelet[2758]: E0909 05:33:00.923618 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.923847 kubelet[2758]: E0909 05:33:00.923832 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.923847 kubelet[2758]: W0909 05:33:00.923844 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.923971 kubelet[2758]: E0909 05:33:00.923862 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.924009 kubelet[2758]: E0909 05:33:00.924001 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.924088 kubelet[2758]: W0909 05:33:00.924009 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.924088 kubelet[2758]: E0909 05:33:00.924063 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.924196 kubelet[2758]: E0909 05:33:00.924183 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.924196 kubelet[2758]: W0909 05:33:00.924191 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.924264 kubelet[2758]: E0909 05:33:00.924206 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.924441 kubelet[2758]: E0909 05:33:00.924414 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.924441 kubelet[2758]: W0909 05:33:00.924430 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.924525 kubelet[2758]: E0909 05:33:00.924445 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.925057 kubelet[2758]: E0909 05:33:00.924976 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.925057 kubelet[2758]: W0909 05:33:00.924994 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.925057 kubelet[2758]: E0909 05:33:00.925003 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.925653 kubelet[2758]: E0909 05:33:00.925628 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.925653 kubelet[2758]: W0909 05:33:00.925643 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.925734 kubelet[2758]: E0909 05:33:00.925658 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.925977 kubelet[2758]: E0909 05:33:00.925803 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.925977 kubelet[2758]: W0909 05:33:00.925810 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.925977 kubelet[2758]: E0909 05:33:00.925818 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:00.926164 kubelet[2758]: E0909 05:33:00.926111 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.926164 kubelet[2758]: W0909 05:33:00.926120 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.926164 kubelet[2758]: E0909 05:33:00.926132 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:00.926304 kubelet[2758]: E0909 05:33:00.926277 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:00.926304 kubelet[2758]: W0909 05:33:00.926291 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:00.926304 kubelet[2758]: E0909 05:33:00.926299 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.766082 kubelet[2758]: E0909 05:33:01.765899 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qrqmt" podUID="c1606d3b-7ca4-49f1-8973-eb24dc83693b" Sep 9 05:33:01.865504 kubelet[2758]: I0909 05:33:01.865448 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:01.919592 kubelet[2758]: E0909 05:33:01.919549 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.919592 kubelet[2758]: W0909 05:33:01.919577 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.919592 kubelet[2758]: E0909 05:33:01.919598 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.920089 kubelet[2758]: E0909 05:33:01.919740 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920089 kubelet[2758]: W0909 05:33:01.919748 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920089 kubelet[2758]: E0909 05:33:01.919757 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.920089 kubelet[2758]: E0909 05:33:01.919931 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920089 kubelet[2758]: W0909 05:33:01.919940 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920089 kubelet[2758]: E0909 05:33:01.919979 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.920270 kubelet[2758]: E0909 05:33:01.920152 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920270 kubelet[2758]: W0909 05:33:01.920161 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920270 kubelet[2758]: E0909 05:33:01.920169 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.920366 kubelet[2758]: E0909 05:33:01.920347 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920366 kubelet[2758]: W0909 05:33:01.920355 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920366 kubelet[2758]: E0909 05:33:01.920363 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.920548 kubelet[2758]: E0909 05:33:01.920525 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920548 kubelet[2758]: W0909 05:33:01.920540 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920635 kubelet[2758]: E0909 05:33:01.920552 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.920716 kubelet[2758]: E0909 05:33:01.920687 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920716 kubelet[2758]: W0909 05:33:01.920704 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920716 kubelet[2758]: E0909 05:33:01.920714 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.920848 kubelet[2758]: E0909 05:33:01.920838 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920848 kubelet[2758]: W0909 05:33:01.920848 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920897 kubelet[2758]: E0909 05:33:01.920855 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.920992 kubelet[2758]: E0909 05:33:01.920971 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.920992 kubelet[2758]: W0909 05:33:01.920983 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.920992 kubelet[2758]: E0909 05:33:01.920991 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.921127 kubelet[2758]: E0909 05:33:01.921119 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.921155 kubelet[2758]: W0909 05:33:01.921128 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.921155 kubelet[2758]: E0909 05:33:01.921135 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.921383 kubelet[2758]: E0909 05:33:01.921270 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.921383 kubelet[2758]: W0909 05:33:01.921296 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.921383 kubelet[2758]: E0909 05:33:01.921316 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.921584 kubelet[2758]: E0909 05:33:01.921561 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.921584 kubelet[2758]: W0909 05:33:01.921577 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.921584 kubelet[2758]: E0909 05:33:01.921584 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.921768 kubelet[2758]: E0909 05:33:01.921722 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.921768 kubelet[2758]: W0909 05:33:01.921733 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.921768 kubelet[2758]: E0909 05:33:01.921740 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.921915 kubelet[2758]: E0909 05:33:01.921893 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.921915 kubelet[2758]: W0909 05:33:01.921910 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.921967 kubelet[2758]: E0909 05:33:01.921917 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.922084 kubelet[2758]: E0909 05:33:01.922071 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.922084 kubelet[2758]: W0909 05:33:01.922082 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.922146 kubelet[2758]: E0909 05:33:01.922089 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.931607 kubelet[2758]: E0909 05:33:01.931582 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.931698 kubelet[2758]: W0909 05:33:01.931610 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.931698 kubelet[2758]: E0909 05:33:01.931631 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.931913 kubelet[2758]: E0909 05:33:01.931856 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.931913 kubelet[2758]: W0909 05:33:01.931888 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.931913 kubelet[2758]: E0909 05:33:01.931902 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.932260 kubelet[2758]: E0909 05:33:01.932113 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.932260 kubelet[2758]: W0909 05:33:01.932126 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.932260 kubelet[2758]: E0909 05:33:01.932140 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.932820 kubelet[2758]: E0909 05:33:01.932372 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.932820 kubelet[2758]: W0909 05:33:01.932385 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.932820 kubelet[2758]: E0909 05:33:01.932416 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.932820 kubelet[2758]: E0909 05:33:01.932624 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.932820 kubelet[2758]: W0909 05:33:01.932636 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.932820 kubelet[2758]: E0909 05:33:01.932665 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.933465 kubelet[2758]: E0909 05:33:01.932878 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.933465 kubelet[2758]: W0909 05:33:01.932889 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.933465 kubelet[2758]: E0909 05:33:01.932907 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.933465 kubelet[2758]: E0909 05:33:01.933152 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.933465 kubelet[2758]: W0909 05:33:01.933195 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.933465 kubelet[2758]: E0909 05:33:01.933223 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.934106 kubelet[2758]: E0909 05:33:01.933864 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.934106 kubelet[2758]: W0909 05:33:01.933895 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.934106 kubelet[2758]: E0909 05:33:01.933917 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.934274 kubelet[2758]: E0909 05:33:01.934258 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.934400 kubelet[2758]: W0909 05:33:01.934361 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.934871 kubelet[2758]: E0909 05:33:01.934578 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.934871 kubelet[2758]: W0909 05:33:01.934588 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.934871 kubelet[2758]: E0909 05:33:01.934741 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.934871 kubelet[2758]: W0909 05:33:01.934752 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], 
error: executable file not found in $PATH, output: "" Sep 9 05:33:01.934871 kubelet[2758]: E0909 05:33:01.934766 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.934871 kubelet[2758]: E0909 05:33:01.934799 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.935185 kubelet[2758]: E0909 05:33:01.934988 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.935185 kubelet[2758]: W0909 05:33:01.935000 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.935185 kubelet[2758]: E0909 05:33:01.935012 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.935299 kubelet[2758]: E0909 05:33:01.935210 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.935299 kubelet[2758]: W0909 05:33:01.935220 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.935299 kubelet[2758]: E0909 05:33:01.935231 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.935605 kubelet[2758]: E0909 05:33:01.935461 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.935605 kubelet[2758]: E0909 05:33:01.935584 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.935605 kubelet[2758]: W0909 05:33:01.935596 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.935831 kubelet[2758]: E0909 05:33:01.935619 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.935831 kubelet[2758]: E0909 05:33:01.935805 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.935831 kubelet[2758]: W0909 05:33:01.935816 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.935831 kubelet[2758]: E0909 05:33:01.935828 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.936179 kubelet[2758]: E0909 05:33:01.936006 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.936179 kubelet[2758]: W0909 05:33:01.936035 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.936179 kubelet[2758]: E0909 05:33:01.936060 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:01.936383 kubelet[2758]: E0909 05:33:01.936253 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.936383 kubelet[2758]: W0909 05:33:01.936270 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.936383 kubelet[2758]: E0909 05:33:01.936281 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 05:33:01.937008 kubelet[2758]: E0909 05:33:01.936953 2758 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 05:33:01.937008 kubelet[2758]: W0909 05:33:01.936971 2758 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 05:33:01.937008 kubelet[2758]: E0909 05:33:01.936984 2758 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 05:33:02.376108 containerd[1581]: time="2025-09-09T05:33:02.376011080Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:02.385525 containerd[1581]: time="2025-09-09T05:33:02.385498262Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 9 05:33:02.386455 containerd[1581]: time="2025-09-09T05:33:02.386419164Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:02.387937 containerd[1581]: time="2025-09-09T05:33:02.387866926Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:02.388523 containerd[1581]: time="2025-09-09T05:33:02.388310114Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.975181516s" Sep 9 05:33:02.388523 containerd[1581]: time="2025-09-09T05:33:02.388345831Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 9 05:33:02.390227 containerd[1581]: time="2025-09-09T05:33:02.390201777Z" level=info msg="CreateContainer within sandbox \"64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 05:33:02.402046 containerd[1581]: time="2025-09-09T05:33:02.400818412Z" level=info msg="Container 1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:02.407569 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2497654547.mount: Deactivated successfully. 
Sep 9 05:33:02.412303 containerd[1581]: time="2025-09-09T05:33:02.412246929Z" level=info msg="CreateContainer within sandbox \"64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d\"" Sep 9 05:33:02.413758 containerd[1581]: time="2025-09-09T05:33:02.413697075Z" level=info msg="StartContainer for \"1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d\"" Sep 9 05:33:02.415382 containerd[1581]: time="2025-09-09T05:33:02.415304349Z" level=info msg="connecting to shim 1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d" address="unix:///run/containerd/s/949320806d6b47ace0ad1fd02480b30b7ae30e7d890f69cd7333156a8512f7ac" protocol=ttrpc version=3 Sep 9 05:33:02.439148 systemd[1]: Started cri-containerd-1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d.scope - libcontainer container 1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d. Sep 9 05:33:02.552356 containerd[1581]: time="2025-09-09T05:33:02.552277722Z" level=info msg="StartContainer for \"1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d\" returns successfully" Sep 9 05:33:02.556000 systemd[1]: cri-containerd-1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d.scope: Deactivated successfully. 
Sep 9 05:33:02.562347 containerd[1581]: time="2025-09-09T05:33:02.562317084Z" level=info msg="received exit event container_id:\"1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d\" id:\"1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d\" pid:3483 exited_at:{seconds:1757395982 nanos:558644231}" Sep 9 05:33:02.576745 containerd[1581]: time="2025-09-09T05:33:02.576717113Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d\" id:\"1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d\" pid:3483 exited_at:{seconds:1757395982 nanos:558644231}" Sep 9 05:33:02.586326 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1204960a2c32146acf74d7eec8f1c3edc771e3341ecb1d73155c6d7379873b4d-rootfs.mount: Deactivated successfully. Sep 9 05:33:02.870776 containerd[1581]: time="2025-09-09T05:33:02.870683684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 05:33:02.886557 kubelet[2758]: I0909 05:33:02.886513 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-8d9bbc9f7-ldk9m" podStartSLOduration=3.085512761 podStartE2EDuration="5.886496443s" podCreationTimestamp="2025-09-09 05:32:57 +0000 UTC" firstStartedPulling="2025-09-09 05:32:57.610519281 +0000 UTC m=+18.953706797" lastFinishedPulling="2025-09-09 05:33:00.411502962 +0000 UTC m=+21.754690479" observedRunningTime="2025-09-09 05:33:00.88060507 +0000 UTC m=+22.223792605" watchObservedRunningTime="2025-09-09 05:33:02.886496443 +0000 UTC m=+24.229683969" Sep 9 05:33:03.767408 kubelet[2758]: E0909 05:33:03.767362 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qrqmt" podUID="c1606d3b-7ca4-49f1-8973-eb24dc83693b" 
Sep 9 05:33:05.766579 kubelet[2758]: E0909 05:33:05.766508 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-qrqmt" podUID="c1606d3b-7ca4-49f1-8973-eb24dc83693b" Sep 9 05:33:06.957768 containerd[1581]: time="2025-09-09T05:33:06.957716497Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:06.958749 containerd[1581]: time="2025-09-09T05:33:06.958716704Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 9 05:33:06.960036 containerd[1581]: time="2025-09-09T05:33:06.959475282Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:06.961248 containerd[1581]: time="2025-09-09T05:33:06.961222467Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:06.961742 containerd[1581]: time="2025-09-09T05:33:06.961719569Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.090948265s" Sep 9 05:33:06.961780 containerd[1581]: time="2025-09-09T05:33:06.961743084Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" 
Sep 9 05:33:06.964538 containerd[1581]: time="2025-09-09T05:33:06.964520009Z" level=info msg="CreateContainer within sandbox \"64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 05:33:06.974062 containerd[1581]: time="2025-09-09T05:33:06.973141798Z" level=info msg="Container aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:06.976311 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1177106600.mount: Deactivated successfully. Sep 9 05:33:06.983813 containerd[1581]: time="2025-09-09T05:33:06.983779059Z" level=info msg="CreateContainer within sandbox \"64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a\"" Sep 9 05:33:06.985295 containerd[1581]: time="2025-09-09T05:33:06.984210421Z" level=info msg="StartContainer for \"aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a\"" Sep 9 05:33:06.985295 containerd[1581]: time="2025-09-09T05:33:06.985225836Z" level=info msg="connecting to shim aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a" address="unix:///run/containerd/s/949320806d6b47ace0ad1fd02480b30b7ae30e7d890f69cd7333156a8512f7ac" protocol=ttrpc version=3 Sep 9 05:33:07.009175 systemd[1]: Started cri-containerd-aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a.scope - libcontainer container aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a. Sep 9 05:33:07.050321 containerd[1581]: time="2025-09-09T05:33:07.050249054Z" level=info msg="StartContainer for \"aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a\" returns successfully" Sep 9 05:33:07.420322 systemd[1]: cri-containerd-aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a.scope: Deactivated successfully. 
Sep 9 05:33:07.420547 systemd[1]: cri-containerd-aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a.scope: Consumed 358ms CPU time, 167.1M memory peak, 11.6M read from disk, 171.3M written to disk. Sep 9 05:33:07.447756 containerd[1581]: time="2025-09-09T05:33:07.447678152Z" level=info msg="received exit event container_id:\"aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a\" id:\"aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a\" pid:3538 exited_at:{seconds:1757395987 nanos:447115868}" Sep 9 05:33:07.457465 containerd[1581]: time="2025-09-09T05:33:07.457406872Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a\" id:\"aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a\" pid:3538 exited_at:{seconds:1757395987 nanos:447115868}" Sep 9 05:33:07.484580 kubelet[2758]: I0909 05:33:07.484365 2758 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 05:33:07.492282 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-aa62550d15e5c8a797b6f31780b2e0a96172b17b974587672df83a131a4cb35a-rootfs.mount: Deactivated successfully. Sep 9 05:33:07.539299 systemd[1]: Created slice kubepods-burstable-pod319ebac8_412e_4d4a_90af_84d99ce90213.slice - libcontainer container kubepods-burstable-pod319ebac8_412e_4d4a_90af_84d99ce90213.slice. Sep 9 05:33:07.552497 systemd[1]: Created slice kubepods-burstable-pod02708d09_983e_47b5_b5d8_8af08b1e4a26.slice - libcontainer container kubepods-burstable-pod02708d09_983e_47b5_b5d8_8af08b1e4a26.slice. Sep 9 05:33:07.563826 systemd[1]: Created slice kubepods-besteffort-pod6ede1a02_20db_4ba2_a9be_569e19e43266.slice - libcontainer container kubepods-besteffort-pod6ede1a02_20db_4ba2_a9be_569e19e43266.slice. 
Sep 9 05:33:07.571731 systemd[1]: Created slice kubepods-besteffort-pod96451131_9615_4466_8637_9110e935815d.slice - libcontainer container kubepods-besteffort-pod96451131_9615_4466_8637_9110e935815d.slice. Sep 9 05:33:07.579433 kubelet[2758]: I0909 05:33:07.579415 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/319ebac8-412e-4d4a-90af-84d99ce90213-config-volume\") pod \"coredns-668d6bf9bc-6nz5j\" (UID: \"319ebac8-412e-4d4a-90af-84d99ce90213\") " pod="kube-system/coredns-668d6bf9bc-6nz5j" Sep 9 05:33:07.579778 kubelet[2758]: I0909 05:33:07.579670 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/02708d09-983e-47b5-b5d8-8af08b1e4a26-config-volume\") pod \"coredns-668d6bf9bc-kbfs2\" (UID: \"02708d09-983e-47b5-b5d8-8af08b1e4a26\") " pod="kube-system/coredns-668d6bf9bc-kbfs2" Sep 9 05:33:07.579778 kubelet[2758]: I0909 05:33:07.579690 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ede1a02-20db-4ba2-a9be-569e19e43266-tigera-ca-bundle\") pod \"calico-kube-controllers-7cf8d45fc7-g66ps\" (UID: \"6ede1a02-20db-4ba2-a9be-569e19e43266\") " pod="calico-system/calico-kube-controllers-7cf8d45fc7-g66ps" Sep 9 05:33:07.579778 kubelet[2758]: I0909 05:33:07.579705 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqgcw\" (UniqueName: \"kubernetes.io/projected/3afed2c6-cb7f-44a3-a525-6523bbddd214-kube-api-access-vqgcw\") pod \"calico-apiserver-6b8c85d7cc-mbm69\" (UID: \"3afed2c6-cb7f-44a3-a525-6523bbddd214\") " pod="calico-apiserver/calico-apiserver-6b8c85d7cc-mbm69" Sep 9 05:33:07.579464 systemd[1]: Created slice kubepods-besteffort-poda438f529_ea8a_4969_a2a9_1d9ed1fd6a79.slice - libcontainer 
container kubepods-besteffort-poda438f529_ea8a_4969_a2a9_1d9ed1fd6a79.slice. Sep 9 05:33:07.580530 kubelet[2758]: I0909 05:33:07.580248 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a438f529-ea8a-4969-a2a9-1d9ed1fd6a79-calico-apiserver-certs\") pod \"calico-apiserver-dfd4dff7c-shntr\" (UID: \"a438f529-ea8a-4969-a2a9-1d9ed1fd6a79\") " pod="calico-apiserver/calico-apiserver-dfd4dff7c-shntr" Sep 9 05:33:07.580530 kubelet[2758]: I0909 05:33:07.580267 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24mnb\" (UniqueName: \"kubernetes.io/projected/a438f529-ea8a-4969-a2a9-1d9ed1fd6a79-kube-api-access-24mnb\") pod \"calico-apiserver-dfd4dff7c-shntr\" (UID: \"a438f529-ea8a-4969-a2a9-1d9ed1fd6a79\") " pod="calico-apiserver/calico-apiserver-dfd4dff7c-shntr" Sep 9 05:33:07.580530 kubelet[2758]: I0909 05:33:07.580281 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6tpt\" (UniqueName: \"kubernetes.io/projected/ed22ad11-18c4-4799-9a41-dea68afd7844-kube-api-access-v6tpt\") pod \"whisker-58f7bc4dbf-rmklt\" (UID: \"ed22ad11-18c4-4799-9a41-dea68afd7844\") " pod="calico-system/whisker-58f7bc4dbf-rmklt" Sep 9 05:33:07.581306 kubelet[2758]: I0909 05:33:07.580914 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86lps\" (UniqueName: \"kubernetes.io/projected/74db9c19-3a76-4b16-941f-533dc3919843-kube-api-access-86lps\") pod \"goldmane-54d579b49d-gd4t2\" (UID: \"74db9c19-3a76-4b16-941f-533dc3919843\") " pod="calico-system/goldmane-54d579b49d-gd4t2" Sep 9 05:33:07.581306 kubelet[2758]: I0909 05:33:07.580940 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflvn\" (UniqueName: 
\"kubernetes.io/projected/96451131-9615-4466-8637-9110e935815d-kube-api-access-dflvn\") pod \"calico-apiserver-6b8c85d7cc-csjg4\" (UID: \"96451131-9615-4466-8637-9110e935815d\") " pod="calico-apiserver/calico-apiserver-6b8c85d7cc-csjg4" Sep 9 05:33:07.581306 kubelet[2758]: I0909 05:33:07.580953 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btlkg\" (UniqueName: \"kubernetes.io/projected/319ebac8-412e-4d4a-90af-84d99ce90213-kube-api-access-btlkg\") pod \"coredns-668d6bf9bc-6nz5j\" (UID: \"319ebac8-412e-4d4a-90af-84d99ce90213\") " pod="kube-system/coredns-668d6bf9bc-6nz5j" Sep 9 05:33:07.581499 kubelet[2758]: I0909 05:33:07.581414 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74db9c19-3a76-4b16-941f-533dc3919843-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-gd4t2\" (UID: \"74db9c19-3a76-4b16-941f-533dc3919843\") " pod="calico-system/goldmane-54d579b49d-gd4t2" Sep 9 05:33:07.582134 kubelet[2758]: I0909 05:33:07.581438 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/96451131-9615-4466-8637-9110e935815d-calico-apiserver-certs\") pod \"calico-apiserver-6b8c85d7cc-csjg4\" (UID: \"96451131-9615-4466-8637-9110e935815d\") " pod="calico-apiserver/calico-apiserver-6b8c85d7cc-csjg4" Sep 9 05:33:07.582134 kubelet[2758]: I0909 05:33:07.582096 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3afed2c6-cb7f-44a3-a525-6523bbddd214-calico-apiserver-certs\") pod \"calico-apiserver-6b8c85d7cc-mbm69\" (UID: \"3afed2c6-cb7f-44a3-a525-6523bbddd214\") " pod="calico-apiserver/calico-apiserver-6b8c85d7cc-mbm69" Sep 9 05:33:07.582230 kubelet[2758]: I0909 05:33:07.582219 2758 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwcr6\" (UniqueName: \"kubernetes.io/projected/02708d09-983e-47b5-b5d8-8af08b1e4a26-kube-api-access-pwcr6\") pod \"coredns-668d6bf9bc-kbfs2\" (UID: \"02708d09-983e-47b5-b5d8-8af08b1e4a26\") " pod="kube-system/coredns-668d6bf9bc-kbfs2" Sep 9 05:33:07.582319 kubelet[2758]: I0909 05:33:07.582306 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74db9c19-3a76-4b16-941f-533dc3919843-config\") pod \"goldmane-54d579b49d-gd4t2\" (UID: \"74db9c19-3a76-4b16-941f-533dc3919843\") " pod="calico-system/goldmane-54d579b49d-gd4t2" Sep 9 05:33:07.582461 kubelet[2758]: I0909 05:33:07.582385 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-ca-bundle\") pod \"whisker-58f7bc4dbf-rmklt\" (UID: \"ed22ad11-18c4-4799-9a41-dea68afd7844\") " pod="calico-system/whisker-58f7bc4dbf-rmklt" Sep 9 05:33:07.582461 kubelet[2758]: I0909 05:33:07.582401 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/74db9c19-3a76-4b16-941f-533dc3919843-goldmane-key-pair\") pod \"goldmane-54d579b49d-gd4t2\" (UID: \"74db9c19-3a76-4b16-941f-533dc3919843\") " pod="calico-system/goldmane-54d579b49d-gd4t2" Sep 9 05:33:07.582461 kubelet[2758]: I0909 05:33:07.582416 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-backend-key-pair\") pod \"whisker-58f7bc4dbf-rmklt\" (UID: \"ed22ad11-18c4-4799-9a41-dea68afd7844\") " pod="calico-system/whisker-58f7bc4dbf-rmklt" Sep 9 05:33:07.582636 kubelet[2758]: 
I0909 05:33:07.582546 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4lft\" (UniqueName: \"kubernetes.io/projected/6ede1a02-20db-4ba2-a9be-569e19e43266-kube-api-access-r4lft\") pod \"calico-kube-controllers-7cf8d45fc7-g66ps\" (UID: \"6ede1a02-20db-4ba2-a9be-569e19e43266\") " pod="calico-system/calico-kube-controllers-7cf8d45fc7-g66ps" Sep 9 05:33:07.587382 systemd[1]: Created slice kubepods-besteffort-pod3afed2c6_cb7f_44a3_a525_6523bbddd214.slice - libcontainer container kubepods-besteffort-pod3afed2c6_cb7f_44a3_a525_6523bbddd214.slice. Sep 9 05:33:07.599452 systemd[1]: Created slice kubepods-besteffort-pod74db9c19_3a76_4b16_941f_533dc3919843.slice - libcontainer container kubepods-besteffort-pod74db9c19_3a76_4b16_941f_533dc3919843.slice. Sep 9 05:33:07.607668 systemd[1]: Created slice kubepods-besteffort-poded22ad11_18c4_4799_9a41_dea68afd7844.slice - libcontainer container kubepods-besteffort-poded22ad11_18c4_4799_9a41_dea68afd7844.slice. Sep 9 05:33:07.771679 systemd[1]: Created slice kubepods-besteffort-podc1606d3b_7ca4_49f1_8973_eb24dc83693b.slice - libcontainer container kubepods-besteffort-podc1606d3b_7ca4_49f1_8973_eb24dc83693b.slice. 
Sep 9 05:33:07.781571 containerd[1581]: time="2025-09-09T05:33:07.781532425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qrqmt,Uid:c1606d3b-7ca4-49f1-8973-eb24dc83693b,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:07.850848 containerd[1581]: time="2025-09-09T05:33:07.850812827Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6nz5j,Uid:319ebac8-412e-4d4a-90af-84d99ce90213,Namespace:kube-system,Attempt:0,}" Sep 9 05:33:07.858487 containerd[1581]: time="2025-09-09T05:33:07.858442107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbfs2,Uid:02708d09-983e-47b5-b5d8-8af08b1e4a26,Namespace:kube-system,Attempt:0,}" Sep 9 05:33:07.873282 containerd[1581]: time="2025-09-09T05:33:07.873230513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cf8d45fc7-g66ps,Uid:6ede1a02-20db-4ba2-a9be-569e19e43266,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:07.877865 containerd[1581]: time="2025-09-09T05:33:07.877658408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-csjg4,Uid:96451131-9615-4466-8637-9110e935815d,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:33:07.886659 containerd[1581]: time="2025-09-09T05:33:07.886634148Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dfd4dff7c-shntr,Uid:a438f529-ea8a-4969-a2a9-1d9ed1fd6a79,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:33:07.895151 containerd[1581]: time="2025-09-09T05:33:07.895082980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-mbm69,Uid:3afed2c6-cb7f-44a3-a525-6523bbddd214,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:33:07.901589 containerd[1581]: time="2025-09-09T05:33:07.901561252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 9 05:33:07.905823 containerd[1581]: time="2025-09-09T05:33:07.905712832Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-54d579b49d-gd4t2,Uid:74db9c19-3a76-4b16-941f-533dc3919843,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:07.914368 containerd[1581]: time="2025-09-09T05:33:07.914292778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58f7bc4dbf-rmklt,Uid:ed22ad11-18c4-4799-9a41-dea68afd7844,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:08.038319 containerd[1581]: time="2025-09-09T05:33:08.038276018Z" level=error msg="Failed to destroy network for sandbox \"39c6690b2bc48ed50cb98deefb26c9001888d7a7a6a13b08950ce73ce25d21af\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.042333 containerd[1581]: time="2025-09-09T05:33:08.041116695Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6nz5j,Uid:319ebac8-412e-4d4a-90af-84d99ce90213,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c6690b2bc48ed50cb98deefb26c9001888d7a7a6a13b08950ce73ce25d21af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.041596 systemd[1]: run-netns-cni\x2d719e09b0\x2d05e8\x2d4507\x2d5da9\x2d127bc293af43.mount: Deactivated successfully. 
Sep 9 05:33:08.045532 kubelet[2758]: E0909 05:33:08.045293 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c6690b2bc48ed50cb98deefb26c9001888d7a7a6a13b08950ce73ce25d21af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.045532 kubelet[2758]: E0909 05:33:08.045505 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c6690b2bc48ed50cb98deefb26c9001888d7a7a6a13b08950ce73ce25d21af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6nz5j" Sep 9 05:33:08.045532 kubelet[2758]: E0909 05:33:08.045528 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"39c6690b2bc48ed50cb98deefb26c9001888d7a7a6a13b08950ce73ce25d21af\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-6nz5j" Sep 9 05:33:08.046775 kubelet[2758]: E0909 05:33:08.045679 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-6nz5j_kube-system(319ebac8-412e-4d4a-90af-84d99ce90213)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-6nz5j_kube-system(319ebac8-412e-4d4a-90af-84d99ce90213)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"39c6690b2bc48ed50cb98deefb26c9001888d7a7a6a13b08950ce73ce25d21af\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-6nz5j" podUID="319ebac8-412e-4d4a-90af-84d99ce90213" Sep 9 05:33:08.069073 containerd[1581]: time="2025-09-09T05:33:08.068996659Z" level=error msg="Failed to destroy network for sandbox \"f40161c5142710b5ac847d0177ebf2398c7efb3ac6cdce29eec2df3799858ecc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.071762 systemd[1]: run-netns-cni\x2d8e7f7fe7\x2d88ac\x2dca9d\x2d4512\x2d739c4ca0bec1.mount: Deactivated successfully. Sep 9 05:33:08.074500 containerd[1581]: time="2025-09-09T05:33:08.073894118Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dfd4dff7c-shntr,Uid:a438f529-ea8a-4969-a2a9-1d9ed1fd6a79,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f40161c5142710b5ac847d0177ebf2398c7efb3ac6cdce29eec2df3799858ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.074631 kubelet[2758]: E0909 05:33:08.074134 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f40161c5142710b5ac847d0177ebf2398c7efb3ac6cdce29eec2df3799858ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.074631 kubelet[2758]: E0909 05:33:08.074211 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f40161c5142710b5ac847d0177ebf2398c7efb3ac6cdce29eec2df3799858ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dfd4dff7c-shntr" Sep 9 05:33:08.074631 kubelet[2758]: E0909 05:33:08.074230 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f40161c5142710b5ac847d0177ebf2398c7efb3ac6cdce29eec2df3799858ecc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-dfd4dff7c-shntr" Sep 9 05:33:08.074716 kubelet[2758]: E0909 05:33:08.074273 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-dfd4dff7c-shntr_calico-apiserver(a438f529-ea8a-4969-a2a9-1d9ed1fd6a79)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-dfd4dff7c-shntr_calico-apiserver(a438f529-ea8a-4969-a2a9-1d9ed1fd6a79)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f40161c5142710b5ac847d0177ebf2398c7efb3ac6cdce29eec2df3799858ecc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-dfd4dff7c-shntr" podUID="a438f529-ea8a-4969-a2a9-1d9ed1fd6a79" Sep 9 05:33:08.091062 containerd[1581]: time="2025-09-09T05:33:08.089168804Z" level=error msg="Failed to destroy network for sandbox \"8bd6b602c4d68cb56f5441329c7a661dd7368aae90275988e557010342deabb1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Sep 9 05:33:08.092497 systemd[1]: run-netns-cni\x2d1d2d694b\x2d37db\x2da0a7\x2d822e\x2d9c9668d513f5.mount: Deactivated successfully. Sep 9 05:33:08.094794 containerd[1581]: time="2025-09-09T05:33:08.094761034Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-csjg4,Uid:96451131-9615-4466-8637-9110e935815d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8bd6b602c4d68cb56f5441329c7a661dd7368aae90275988e557010342deabb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.095169 kubelet[2758]: E0909 05:33:08.095138 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8bd6b602c4d68cb56f5441329c7a661dd7368aae90275988e557010342deabb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.095314 kubelet[2758]: E0909 05:33:08.095299 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8bd6b602c4d68cb56f5441329c7a661dd7368aae90275988e557010342deabb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-csjg4" Sep 9 05:33:08.096178 kubelet[2758]: E0909 05:33:08.095803 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8bd6b602c4d68cb56f5441329c7a661dd7368aae90275988e557010342deabb1\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-csjg4" Sep 9 05:33:08.096178 kubelet[2758]: E0909 05:33:08.095864 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b8c85d7cc-csjg4_calico-apiserver(96451131-9615-4466-8637-9110e935815d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b8c85d7cc-csjg4_calico-apiserver(96451131-9615-4466-8637-9110e935815d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8bd6b602c4d68cb56f5441329c7a661dd7368aae90275988e557010342deabb1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-csjg4" podUID="96451131-9615-4466-8637-9110e935815d" Sep 9 05:33:08.122072 containerd[1581]: time="2025-09-09T05:33:08.121142914Z" level=error msg="Failed to destroy network for sandbox \"9a64005cdc3e6391414aed98ad2b5f22a62d33c8e8f8278b835474004ed55368\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.125670 systemd[1]: run-netns-cni\x2d111eb580\x2d407b\x2d93bb\x2dc6d1\x2dd97a7f198130.mount: Deactivated successfully. 
Sep 9 05:33:08.126748 containerd[1581]: time="2025-09-09T05:33:08.126720055Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbfs2,Uid:02708d09-983e-47b5-b5d8-8af08b1e4a26,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a64005cdc3e6391414aed98ad2b5f22a62d33c8e8f8278b835474004ed55368\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.127391 kubelet[2758]: E0909 05:33:08.127063 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a64005cdc3e6391414aed98ad2b5f22a62d33c8e8f8278b835474004ed55368\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.127391 kubelet[2758]: E0909 05:33:08.127121 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a64005cdc3e6391414aed98ad2b5f22a62d33c8e8f8278b835474004ed55368\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-kbfs2" Sep 9 05:33:08.127391 kubelet[2758]: E0909 05:33:08.127142 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a64005cdc3e6391414aed98ad2b5f22a62d33c8e8f8278b835474004ed55368\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-kbfs2" 
Sep 9 05:33:08.127554 kubelet[2758]: E0909 05:33:08.127179 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-kbfs2_kube-system(02708d09-983e-47b5-b5d8-8af08b1e4a26)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-kbfs2_kube-system(02708d09-983e-47b5-b5d8-8af08b1e4a26)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9a64005cdc3e6391414aed98ad2b5f22a62d33c8e8f8278b835474004ed55368\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-kbfs2" podUID="02708d09-983e-47b5-b5d8-8af08b1e4a26" Sep 9 05:33:08.137230 containerd[1581]: time="2025-09-09T05:33:08.137193953Z" level=error msg="Failed to destroy network for sandbox \"3f5f04b6eb7e34d9464d7fd194e97f48225349b4f579288fc69043d8790f8673\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.138518 containerd[1581]: time="2025-09-09T05:33:08.138491494Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qrqmt,Uid:c1606d3b-7ca4-49f1-8973-eb24dc83693b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f5f04b6eb7e34d9464d7fd194e97f48225349b4f579288fc69043d8790f8673\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.139151 kubelet[2758]: E0909 05:33:08.139119 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3f5f04b6eb7e34d9464d7fd194e97f48225349b4f579288fc69043d8790f8673\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.140444 kubelet[2758]: E0909 05:33:08.140121 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f5f04b6eb7e34d9464d7fd194e97f48225349b4f579288fc69043d8790f8673\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qrqmt" Sep 9 05:33:08.140444 kubelet[2758]: E0909 05:33:08.140166 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f5f04b6eb7e34d9464d7fd194e97f48225349b4f579288fc69043d8790f8673\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-qrqmt" Sep 9 05:33:08.140444 kubelet[2758]: E0909 05:33:08.140208 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-qrqmt_calico-system(c1606d3b-7ca4-49f1-8973-eb24dc83693b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-qrqmt_calico-system(c1606d3b-7ca4-49f1-8973-eb24dc83693b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f5f04b6eb7e34d9464d7fd194e97f48225349b4f579288fc69043d8790f8673\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-qrqmt" 
podUID="c1606d3b-7ca4-49f1-8973-eb24dc83693b" Sep 9 05:33:08.147174 containerd[1581]: time="2025-09-09T05:33:08.147140532Z" level=error msg="Failed to destroy network for sandbox \"5882900226842be12822c991884d87c3af9af8f00223258816e6b9ea45f22ea5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.148547 containerd[1581]: time="2025-09-09T05:33:08.148504065Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cf8d45fc7-g66ps,Uid:6ede1a02-20db-4ba2-a9be-569e19e43266,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5882900226842be12822c991884d87c3af9af8f00223258816e6b9ea45f22ea5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.149216 kubelet[2758]: E0909 05:33:08.148771 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5882900226842be12822c991884d87c3af9af8f00223258816e6b9ea45f22ea5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.149216 kubelet[2758]: E0909 05:33:08.148810 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5882900226842be12822c991884d87c3af9af8f00223258816e6b9ea45f22ea5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cf8d45fc7-g66ps" Sep 9 05:33:08.149216 
kubelet[2758]: E0909 05:33:08.148829 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5882900226842be12822c991884d87c3af9af8f00223258816e6b9ea45f22ea5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cf8d45fc7-g66ps" Sep 9 05:33:08.150114 kubelet[2758]: E0909 05:33:08.148863 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cf8d45fc7-g66ps_calico-system(6ede1a02-20db-4ba2-a9be-569e19e43266)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cf8d45fc7-g66ps_calico-system(6ede1a02-20db-4ba2-a9be-569e19e43266)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5882900226842be12822c991884d87c3af9af8f00223258816e6b9ea45f22ea5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cf8d45fc7-g66ps" podUID="6ede1a02-20db-4ba2-a9be-569e19e43266" Sep 9 05:33:08.160034 containerd[1581]: time="2025-09-09T05:33:08.159992347Z" level=error msg="Failed to destroy network for sandbox \"b629133276d8a1c3feab8d773ba68537450b5220de7f27c41d6c7c0086a4dd51\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.162269 containerd[1581]: time="2025-09-09T05:33:08.161992312Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58f7bc4dbf-rmklt,Uid:ed22ad11-18c4-4799-9a41-dea68afd7844,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"b629133276d8a1c3feab8d773ba68537450b5220de7f27c41d6c7c0086a4dd51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.162701 kubelet[2758]: E0909 05:33:08.162663 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b629133276d8a1c3feab8d773ba68537450b5220de7f27c41d6c7c0086a4dd51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.162757 kubelet[2758]: E0909 05:33:08.162734 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b629133276d8a1c3feab8d773ba68537450b5220de7f27c41d6c7c0086a4dd51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58f7bc4dbf-rmklt" Sep 9 05:33:08.162781 kubelet[2758]: E0909 05:33:08.162752 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b629133276d8a1c3feab8d773ba68537450b5220de7f27c41d6c7c0086a4dd51\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58f7bc4dbf-rmklt" Sep 9 05:33:08.162848 kubelet[2758]: E0909 05:33:08.162810 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-58f7bc4dbf-rmklt_calico-system(ed22ad11-18c4-4799-9a41-dea68afd7844)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"whisker-58f7bc4dbf-rmklt_calico-system(ed22ad11-18c4-4799-9a41-dea68afd7844)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b629133276d8a1c3feab8d773ba68537450b5220de7f27c41d6c7c0086a4dd51\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-58f7bc4dbf-rmklt" podUID="ed22ad11-18c4-4799-9a41-dea68afd7844" Sep 9 05:33:08.170354 containerd[1581]: time="2025-09-09T05:33:08.170247799Z" level=error msg="Failed to destroy network for sandbox \"def2c57cb0211174b1803e6537568a96c07bcc21e9e9046f29c80029d6a33684\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.170721 containerd[1581]: time="2025-09-09T05:33:08.170683598Z" level=error msg="Failed to destroy network for sandbox \"5b41370a019a1eecae4da9ab0041b223062e8b8fcac353e676d477729b820df6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.171447 containerd[1581]: time="2025-09-09T05:33:08.171419416Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-gd4t2,Uid:74db9c19-3a76-4b16-941f-533dc3919843,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"def2c57cb0211174b1803e6537568a96c07bcc21e9e9046f29c80029d6a33684\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.171574 kubelet[2758]: E0909 05:33:08.171553 2758 log.go:32] "RunPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"def2c57cb0211174b1803e6537568a96c07bcc21e9e9046f29c80029d6a33684\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.171639 kubelet[2758]: E0909 05:33:08.171591 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"def2c57cb0211174b1803e6537568a96c07bcc21e9e9046f29c80029d6a33684\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-gd4t2" Sep 9 05:33:08.171639 kubelet[2758]: E0909 05:33:08.171611 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"def2c57cb0211174b1803e6537568a96c07bcc21e9e9046f29c80029d6a33684\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-gd4t2" Sep 9 05:33:08.171717 kubelet[2758]: E0909 05:33:08.171638 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-gd4t2_calico-system(74db9c19-3a76-4b16-941f-533dc3919843)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-gd4t2_calico-system(74db9c19-3a76-4b16-941f-533dc3919843)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"def2c57cb0211174b1803e6537568a96c07bcc21e9e9046f29c80029d6a33684\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-gd4t2" podUID="74db9c19-3a76-4b16-941f-533dc3919843" Sep 9 05:33:08.172660 kubelet[2758]: E0909 05:33:08.172417 2758 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b41370a019a1eecae4da9ab0041b223062e8b8fcac353e676d477729b820df6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.172660 kubelet[2758]: E0909 05:33:08.172445 2758 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b41370a019a1eecae4da9ab0041b223062e8b8fcac353e676d477729b820df6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-mbm69" Sep 9 05:33:08.172660 kubelet[2758]: E0909 05:33:08.172461 2758 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b41370a019a1eecae4da9ab0041b223062e8b8fcac353e676d477729b820df6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-mbm69" Sep 9 05:33:08.172729 containerd[1581]: time="2025-09-09T05:33:08.172258455Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-mbm69,Uid:3afed2c6-cb7f-44a3-a525-6523bbddd214,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b41370a019a1eecae4da9ab0041b223062e8b8fcac353e676d477729b820df6\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 05:33:08.172770 kubelet[2758]: E0909 05:33:08.172513 2758 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b8c85d7cc-mbm69_calico-apiserver(3afed2c6-cb7f-44a3-a525-6523bbddd214)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b8c85d7cc-mbm69_calico-apiserver(3afed2c6-cb7f-44a3-a525-6523bbddd214)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b41370a019a1eecae4da9ab0041b223062e8b8fcac353e676d477729b820df6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-mbm69" podUID="3afed2c6-cb7f-44a3-a525-6523bbddd214" Sep 9 05:33:08.975002 systemd[1]: run-netns-cni\x2debff9ecd\x2d366b\x2d9b0d\x2da150\x2d648d781934e9.mount: Deactivated successfully. Sep 9 05:33:08.975145 systemd[1]: run-netns-cni\x2da275d291\x2dcded\x2da462\x2da67e\x2d06a065826dbd.mount: Deactivated successfully. Sep 9 05:33:08.975256 systemd[1]: run-netns-cni\x2debb5b604\x2db162\x2da720\x2def50\x2deb6b49ffecad.mount: Deactivated successfully. Sep 9 05:33:08.975298 systemd[1]: run-netns-cni\x2d197d006e\x2d8353\x2dcc38\x2dd26f\x2d79db81c44a2d.mount: Deactivated successfully. Sep 9 05:33:08.975339 systemd[1]: run-netns-cni\x2d56e73f13\x2dc75c\x2db6f4\x2d07e8\x2df8c51fc7c71c.mount: Deactivated successfully. Sep 9 05:33:13.570065 kubelet[2758]: I0909 05:33:13.570005 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:15.188252 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2045319353.mount: Deactivated successfully. 
Sep 9 05:33:15.221568 containerd[1581]: time="2025-09-09T05:33:15.221521867Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:15.225057 containerd[1581]: time="2025-09-09T05:33:15.225036456Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 9 05:33:15.232311 containerd[1581]: time="2025-09-09T05:33:15.232291145Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:15.234085 containerd[1581]: time="2025-09-09T05:33:15.234054115Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:15.234462 containerd[1581]: time="2025-09-09T05:33:15.234442879Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 7.331779661s" Sep 9 05:33:15.234541 containerd[1581]: time="2025-09-09T05:33:15.234529159Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 9 05:33:15.259577 containerd[1581]: time="2025-09-09T05:33:15.259534928Z" level=info msg="CreateContainer within sandbox \"64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 05:33:15.269167 containerd[1581]: time="2025-09-09T05:33:15.269137015Z" level=info msg="Container 
4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:15.286870 containerd[1581]: time="2025-09-09T05:33:15.286764785Z" level=info msg="CreateContainer within sandbox \"64750215f45a45a90466fd340c71a163fc98bd4d6b81b600ac9377e040d91894\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\"" Sep 9 05:33:15.287649 containerd[1581]: time="2025-09-09T05:33:15.287588548Z" level=info msg="StartContainer for \"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\"" Sep 9 05:33:15.288929 containerd[1581]: time="2025-09-09T05:33:15.288866356Z" level=info msg="connecting to shim 4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d" address="unix:///run/containerd/s/949320806d6b47ace0ad1fd02480b30b7ae30e7d890f69cd7333156a8512f7ac" protocol=ttrpc version=3 Sep 9 05:33:15.399314 systemd[1]: Started cri-containerd-4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d.scope - libcontainer container 4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d. Sep 9 05:33:15.479686 containerd[1581]: time="2025-09-09T05:33:15.479575040Z" level=info msg="StartContainer for \"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\" returns successfully" Sep 9 05:33:15.538814 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 05:33:15.539812 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 05:33:15.739357 kubelet[2758]: I0909 05:33:15.739247 2758 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-ca-bundle\") pod \"ed22ad11-18c4-4799-9a41-dea68afd7844\" (UID: \"ed22ad11-18c4-4799-9a41-dea68afd7844\") " Sep 9 05:33:15.740356 kubelet[2758]: I0909 05:33:15.740329 2758 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6tpt\" (UniqueName: \"kubernetes.io/projected/ed22ad11-18c4-4799-9a41-dea68afd7844-kube-api-access-v6tpt\") pod \"ed22ad11-18c4-4799-9a41-dea68afd7844\" (UID: \"ed22ad11-18c4-4799-9a41-dea68afd7844\") " Sep 9 05:33:15.740456 kubelet[2758]: I0909 05:33:15.740381 2758 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-backend-key-pair\") pod \"ed22ad11-18c4-4799-9a41-dea68afd7844\" (UID: \"ed22ad11-18c4-4799-9a41-dea68afd7844\") " Sep 9 05:33:15.747046 kubelet[2758]: I0909 05:33:15.746982 2758 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "ed22ad11-18c4-4799-9a41-dea68afd7844" (UID: "ed22ad11-18c4-4799-9a41-dea68afd7844"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 05:33:15.751223 kubelet[2758]: I0909 05:33:15.751188 2758 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed22ad11-18c4-4799-9a41-dea68afd7844-kube-api-access-v6tpt" (OuterVolumeSpecName: "kube-api-access-v6tpt") pod "ed22ad11-18c4-4799-9a41-dea68afd7844" (UID: "ed22ad11-18c4-4799-9a41-dea68afd7844"). InnerVolumeSpecName "kube-api-access-v6tpt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 05:33:15.751437 kubelet[2758]: I0909 05:33:15.751403 2758 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ed22ad11-18c4-4799-9a41-dea68afd7844" (UID: "ed22ad11-18c4-4799-9a41-dea68afd7844"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 05:33:15.841546 kubelet[2758]: I0909 05:33:15.841507 2758 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-backend-key-pair\") on node \"ci-4452-0-0-n-de00512edc\" DevicePath \"\"" Sep 9 05:33:15.841746 kubelet[2758]: I0909 05:33:15.841718 2758 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed22ad11-18c4-4799-9a41-dea68afd7844-whisker-ca-bundle\") on node \"ci-4452-0-0-n-de00512edc\" DevicePath \"\"" Sep 9 05:33:15.841746 kubelet[2758]: I0909 05:33:15.841732 2758 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v6tpt\" (UniqueName: \"kubernetes.io/projected/ed22ad11-18c4-4799-9a41-dea68afd7844-kube-api-access-v6tpt\") on node \"ci-4452-0-0-n-de00512edc\" DevicePath \"\"" Sep 9 05:33:15.932829 systemd[1]: Removed slice kubepods-besteffort-poded22ad11_18c4_4799_9a41_dea68afd7844.slice - libcontainer container kubepods-besteffort-poded22ad11_18c4_4799_9a41_dea68afd7844.slice. 
Sep 9 05:33:15.949008 kubelet[2758]: I0909 05:33:15.948946 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wxpcs" podStartSLOduration=1.6340246440000001 podStartE2EDuration="18.948927823s" podCreationTimestamp="2025-09-09 05:32:57 +0000 UTC" firstStartedPulling="2025-09-09 05:32:57.920526626 +0000 UTC m=+19.263714142" lastFinishedPulling="2025-09-09 05:33:15.235429795 +0000 UTC m=+36.578617321" observedRunningTime="2025-09-09 05:33:15.947376657 +0000 UTC m=+37.290564183" watchObservedRunningTime="2025-09-09 05:33:15.948927823 +0000 UTC m=+37.292115358" Sep 9 05:33:16.048379 systemd[1]: Created slice kubepods-besteffort-pod3c71d9d5_f0a9_4797_b0b4_f96728da5c42.slice - libcontainer container kubepods-besteffort-pod3c71d9d5_f0a9_4797_b0b4_f96728da5c42.slice. Sep 9 05:33:16.144123 kubelet[2758]: I0909 05:33:16.144051 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh9xr\" (UniqueName: \"kubernetes.io/projected/3c71d9d5-f0a9-4797-b0b4-f96728da5c42-kube-api-access-bh9xr\") pod \"whisker-684cb6f94d-n5znq\" (UID: \"3c71d9d5-f0a9-4797-b0b4-f96728da5c42\") " pod="calico-system/whisker-684cb6f94d-n5znq" Sep 9 05:33:16.144123 kubelet[2758]: I0909 05:33:16.144103 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3c71d9d5-f0a9-4797-b0b4-f96728da5c42-whisker-ca-bundle\") pod \"whisker-684cb6f94d-n5znq\" (UID: \"3c71d9d5-f0a9-4797-b0b4-f96728da5c42\") " pod="calico-system/whisker-684cb6f94d-n5znq" Sep 9 05:33:16.144123 kubelet[2758]: I0909 05:33:16.144126 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3c71d9d5-f0a9-4797-b0b4-f96728da5c42-whisker-backend-key-pair\") pod \"whisker-684cb6f94d-n5znq\" (UID: 
\"3c71d9d5-f0a9-4797-b0b4-f96728da5c42\") " pod="calico-system/whisker-684cb6f94d-n5znq" Sep 9 05:33:16.182644 systemd[1]: var-lib-kubelet-pods-ed22ad11\x2d18c4\x2d4799\x2d9a41\x2ddea68afd7844-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dv6tpt.mount: Deactivated successfully. Sep 9 05:33:16.182763 systemd[1]: var-lib-kubelet-pods-ed22ad11\x2d18c4\x2d4799\x2d9a41\x2ddea68afd7844-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 05:33:16.354701 containerd[1581]: time="2025-09-09T05:33:16.354184020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-684cb6f94d-n5znq,Uid:3c71d9d5-f0a9-4797-b0b4-f96728da5c42,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:16.772270 kubelet[2758]: I0909 05:33:16.771951 2758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed22ad11-18c4-4799-9a41-dea68afd7844" path="/var/lib/kubelet/pods/ed22ad11-18c4-4799-9a41-dea68afd7844/volumes" Sep 9 05:33:16.928432 systemd-networkd[1472]: cali1acd00ce8e9: Link UP Sep 9 05:33:16.928998 systemd-networkd[1472]: cali1acd00ce8e9: Gained carrier Sep 9 05:33:16.929673 kubelet[2758]: I0909 05:33:16.929105 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:16.970386 containerd[1581]: 2025-09-09 05:33:16.550 [INFO][3903] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 05:33:16.970386 containerd[1581]: 2025-09-09 05:33:16.604 [INFO][3903] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0 whisker-684cb6f94d- calico-system 3c71d9d5-f0a9-4797-b0b4-f96728da5c42 912 0 2025-09-09 05:33:16 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:684cb6f94d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s 
ci-4452-0-0-n-de00512edc whisker-684cb6f94d-n5znq eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1acd00ce8e9 [] [] }} ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-" Sep 9 05:33:16.970386 containerd[1581]: 2025-09-09 05:33:16.604 [INFO][3903] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" Sep 9 05:33:16.970386 containerd[1581]: 2025-09-09 05:33:16.831 [INFO][3915] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" HandleID="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Workload="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.835 [INFO][3915] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" HandleID="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Workload="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e750), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-n-de00512edc", "pod":"whisker-684cb6f94d-n5znq", "timestamp":"2025-09-09 05:33:16.83125466 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:16.971072 
containerd[1581]: 2025-09-09 05:33:16.835 [INFO][3915] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.836 [INFO][3915] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.836 [INFO][3915] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.856 [INFO][3915] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.881 [INFO][3915] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.889 [INFO][3915] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.894 [INFO][3915] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.971072 containerd[1581]: 2025-09-09 05:33:16.898 [INFO][3915] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.973186 containerd[1581]: 2025-09-09 05:33:16.898 [INFO][3915] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.973186 containerd[1581]: 2025-09-09 05:33:16.900 [INFO][3915] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2 Sep 9 05:33:16.973186 containerd[1581]: 2025-09-09 05:33:16.909 [INFO][3915] ipam/ipam.go 1243: Writing block in order to claim IPs 
block=192.168.21.64/26 handle="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.973186 containerd[1581]: 2025-09-09 05:33:16.915 [INFO][3915] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.65/26] block=192.168.21.64/26 handle="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.973186 containerd[1581]: 2025-09-09 05:33:16.915 [INFO][3915] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.65/26] handle="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:16.973186 containerd[1581]: 2025-09-09 05:33:16.915 [INFO][3915] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:16.973186 containerd[1581]: 2025-09-09 05:33:16.915 [INFO][3915] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.65/26] IPv6=[] ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" HandleID="k8s-pod-network.a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Workload="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" Sep 9 05:33:16.973303 containerd[1581]: 2025-09-09 05:33:16.918 [INFO][3903] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0", GenerateName:"whisker-684cb6f94d-", Namespace:"calico-system", SelfLink:"", UID:"3c71d9d5-f0a9-4797-b0b4-f96728da5c42", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, 
time.September, 9, 5, 33, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"684cb6f94d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"whisker-684cb6f94d-n5znq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.21.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1acd00ce8e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:16.973303 containerd[1581]: 2025-09-09 05:33:16.918 [INFO][3903] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.65/32] ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" Sep 9 05:33:16.973369 containerd[1581]: 2025-09-09 05:33:16.918 [INFO][3903] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1acd00ce8e9 ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" Sep 9 05:33:16.973369 containerd[1581]: 2025-09-09 05:33:16.936 [INFO][3903] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" 
Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" Sep 9 05:33:16.973704 containerd[1581]: 2025-09-09 05:33:16.938 [INFO][3903] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0", GenerateName:"whisker-684cb6f94d-", Namespace:"calico-system", SelfLink:"", UID:"3c71d9d5-f0a9-4797-b0b4-f96728da5c42", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 33, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"684cb6f94d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2", Pod:"whisker-684cb6f94d-n5znq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.21.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1acd00ce8e9", MAC:"ca:31:56:e3:0a:b9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), 
QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:16.973755 containerd[1581]: 2025-09-09 05:33:16.954 [INFO][3903] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" Namespace="calico-system" Pod="whisker-684cb6f94d-n5znq" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-whisker--684cb6f94d--n5znq-eth0" Sep 9 05:33:17.218420 containerd[1581]: time="2025-09-09T05:33:17.217679517Z" level=info msg="connecting to shim a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2" address="unix:///run/containerd/s/c56a5b9b03d2d8a1e1d0feedb88d4534a52c7454831b106c9c28b2c564936963" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:17.270603 systemd[1]: Started cri-containerd-a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2.scope - libcontainer container a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2. Sep 9 05:33:17.409823 containerd[1581]: time="2025-09-09T05:33:17.409792225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-684cb6f94d-n5znq,Uid:3c71d9d5-f0a9-4797-b0b4-f96728da5c42,Namespace:calico-system,Attempt:0,} returns sandbox id \"a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2\"" Sep 9 05:33:17.412862 containerd[1581]: time="2025-09-09T05:33:17.412845118Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 05:33:17.604880 systemd-networkd[1472]: vxlan.calico: Link UP Sep 9 05:33:17.604886 systemd-networkd[1472]: vxlan.calico: Gained carrier Sep 9 05:33:18.271245 systemd-networkd[1472]: cali1acd00ce8e9: Gained IPv6LL Sep 9 05:33:18.848416 systemd-networkd[1472]: vxlan.calico: Gained IPv6LL Sep 9 05:33:19.292059 containerd[1581]: time="2025-09-09T05:33:19.291998348Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:19.293128 containerd[1581]: 
time="2025-09-09T05:33:19.293071777Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 9 05:33:19.294050 containerd[1581]: time="2025-09-09T05:33:19.293535891Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:19.295137 containerd[1581]: time="2025-09-09T05:33:19.295092881Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:19.295914 containerd[1581]: time="2025-09-09T05:33:19.295492225Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.882538273s" Sep 9 05:33:19.295914 containerd[1581]: time="2025-09-09T05:33:19.295525396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 9 05:33:19.298647 containerd[1581]: time="2025-09-09T05:33:19.298606864Z" level=info msg="CreateContainer within sandbox \"a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 9 05:33:19.310050 containerd[1581]: time="2025-09-09T05:33:19.308111003Z" level=info msg="Container 5ae380e905da49dc0dc99cc9f897ce36858a655ab4c78811004dd800182445e0: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:19.310364 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1898790758.mount: Deactivated successfully. 
Sep 9 05:33:19.328867 containerd[1581]: time="2025-09-09T05:33:19.328815446Z" level=info msg="CreateContainer within sandbox \"a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"5ae380e905da49dc0dc99cc9f897ce36858a655ab4c78811004dd800182445e0\"" Sep 9 05:33:19.330218 containerd[1581]: time="2025-09-09T05:33:19.330135275Z" level=info msg="StartContainer for \"5ae380e905da49dc0dc99cc9f897ce36858a655ab4c78811004dd800182445e0\"" Sep 9 05:33:19.331345 containerd[1581]: time="2025-09-09T05:33:19.331325832Z" level=info msg="connecting to shim 5ae380e905da49dc0dc99cc9f897ce36858a655ab4c78811004dd800182445e0" address="unix:///run/containerd/s/c56a5b9b03d2d8a1e1d0feedb88d4534a52c7454831b106c9c28b2c564936963" protocol=ttrpc version=3 Sep 9 05:33:19.356199 systemd[1]: Started cri-containerd-5ae380e905da49dc0dc99cc9f897ce36858a655ab4c78811004dd800182445e0.scope - libcontainer container 5ae380e905da49dc0dc99cc9f897ce36858a655ab4c78811004dd800182445e0. 
Sep 9 05:33:19.410911 containerd[1581]: time="2025-09-09T05:33:19.410796165Z" level=info msg="StartContainer for \"5ae380e905da49dc0dc99cc9f897ce36858a655ab4c78811004dd800182445e0\" returns successfully" Sep 9 05:33:19.415120 containerd[1581]: time="2025-09-09T05:33:19.414921017Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 9 05:33:19.767152 containerd[1581]: time="2025-09-09T05:33:19.766862007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cf8d45fc7-g66ps,Uid:6ede1a02-20db-4ba2-a9be-569e19e43266,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:19.894366 systemd-networkd[1472]: califd515f9a270: Link UP Sep 9 05:33:19.895801 systemd-networkd[1472]: califd515f9a270: Gained carrier Sep 9 05:33:19.913546 containerd[1581]: 2025-09-09 05:33:19.804 [INFO][4212] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0 calico-kube-controllers-7cf8d45fc7- calico-system 6ede1a02-20db-4ba2-a9be-569e19e43266 838 0 2025-09-09 05:32:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7cf8d45fc7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc calico-kube-controllers-7cf8d45fc7-g66ps eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] califd515f9a270 [] [] }} ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-" Sep 9 05:33:19.913546 containerd[1581]: 2025-09-09 05:33:19.804 [INFO][4212] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" Sep 9 05:33:19.913546 containerd[1581]: 2025-09-09 05:33:19.831 [INFO][4225] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" HandleID="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.832 [INFO][4225] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" HandleID="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ac710), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-n-de00512edc", "pod":"calico-kube-controllers-7cf8d45fc7-g66ps", "timestamp":"2025-09-09 05:33:19.831830266 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.832 [INFO][4225] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.832 [INFO][4225] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.832 [INFO][4225] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.838 [INFO][4225] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.843 [INFO][4225] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.849 [INFO][4225] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.853 [INFO][4225] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.918731 containerd[1581]: 2025-09-09 05:33:19.857 [INFO][4225] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.922265 containerd[1581]: 2025-09-09 05:33:19.857 [INFO][4225] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.922265 containerd[1581]: 2025-09-09 05:33:19.862 [INFO][4225] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304 Sep 9 05:33:19.922265 containerd[1581]: 2025-09-09 05:33:19.872 [INFO][4225] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.922265 containerd[1581]: 2025-09-09 05:33:19.885 [INFO][4225] ipam/ipam.go 1256: Successfully claimed 
IPs: [192.168.21.66/26] block=192.168.21.64/26 handle="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.922265 containerd[1581]: 2025-09-09 05:33:19.885 [INFO][4225] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.66/26] handle="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:19.922265 containerd[1581]: 2025-09-09 05:33:19.885 [INFO][4225] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:19.922265 containerd[1581]: 2025-09-09 05:33:19.885 [INFO][4225] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.66/26] IPv6=[] ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" HandleID="k8s-pod-network.7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" Sep 9 05:33:19.924610 containerd[1581]: 2025-09-09 05:33:19.890 [INFO][4212] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0", GenerateName:"calico-kube-controllers-7cf8d45fc7-", Namespace:"calico-system", SelfLink:"", UID:"6ede1a02-20db-4ba2-a9be-569e19e43266", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"7cf8d45fc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"calico-kube-controllers-7cf8d45fc7-g66ps", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califd515f9a270", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:19.924720 containerd[1581]: 2025-09-09 05:33:19.890 [INFO][4212] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.66/32] ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" Sep 9 05:33:19.924720 containerd[1581]: 2025-09-09 05:33:19.890 [INFO][4212] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califd515f9a270 ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" Sep 9 05:33:19.924720 containerd[1581]: 2025-09-09 05:33:19.896 [INFO][4212] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" 
Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" Sep 9 05:33:19.924815 containerd[1581]: 2025-09-09 05:33:19.897 [INFO][4212] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0", GenerateName:"calico-kube-controllers-7cf8d45fc7-", Namespace:"calico-system", SelfLink:"", UID:"6ede1a02-20db-4ba2-a9be-569e19e43266", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cf8d45fc7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304", Pod:"calico-kube-controllers-7cf8d45fc7-g66ps", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.21.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califd515f9a270", MAC:"56:50:54:de:d8:e1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:19.924888 containerd[1581]: 2025-09-09 05:33:19.906 [INFO][4212] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" Namespace="calico-system" Pod="calico-kube-controllers-7cf8d45fc7-g66ps" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--kube--controllers--7cf8d45fc7--g66ps-eth0" Sep 9 05:33:19.950479 containerd[1581]: time="2025-09-09T05:33:19.949829286Z" level=info msg="connecting to shim 7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304" address="unix:///run/containerd/s/9b25593bdbc0f43a8a123ccd29e8116b14f55c294273f545d24fc987c86080a7" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:19.983185 systemd[1]: Started cri-containerd-7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304.scope - libcontainer container 7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304. 
Sep 9 05:33:20.046982 containerd[1581]: time="2025-09-09T05:33:20.046901562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cf8d45fc7-g66ps,Uid:6ede1a02-20db-4ba2-a9be-569e19e43266,Namespace:calico-system,Attempt:0,} returns sandbox id \"7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304\"" Sep 9 05:33:20.774905 containerd[1581]: time="2025-09-09T05:33:20.774799231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qrqmt,Uid:c1606d3b-7ca4-49f1-8973-eb24dc83693b,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:20.775454 containerd[1581]: time="2025-09-09T05:33:20.775389050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-gd4t2,Uid:74db9c19-3a76-4b16-941f-533dc3919843,Namespace:calico-system,Attempt:0,}" Sep 9 05:33:20.775855 containerd[1581]: time="2025-09-09T05:33:20.775797811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-csjg4,Uid:96451131-9615-4466-8637-9110e935815d,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:33:20.960044 systemd-networkd[1472]: cali143759898e4: Link UP Sep 9 05:33:20.961522 systemd-networkd[1472]: cali143759898e4: Gained carrier Sep 9 05:33:20.978269 containerd[1581]: 2025-09-09 05:33:20.854 [INFO][4284] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0 goldmane-54d579b49d- calico-system 74db9c19-3a76-4b16-941f-533dc3919843 839 0 2025-09-09 05:32:57 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc goldmane-54d579b49d-gd4t2 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali143759898e4 [] [] }} 
ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-" Sep 9 05:33:20.978269 containerd[1581]: 2025-09-09 05:33:20.855 [INFO][4284] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" Sep 9 05:33:20.978269 containerd[1581]: 2025-09-09 05:33:20.918 [INFO][4319] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" HandleID="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Workload="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.918 [INFO][4319] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" HandleID="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Workload="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000335990), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-n-de00512edc", "pod":"goldmane-54d579b49d-gd4t2", "timestamp":"2025-09-09 05:33:20.918747378 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.918 [INFO][4319] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.918 [INFO][4319] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.919 [INFO][4319] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.925 [INFO][4319] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.929 [INFO][4319] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.934 [INFO][4319] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.936 [INFO][4319] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978548 containerd[1581]: 2025-09-09 05:33:20.938 [INFO][4319] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978804 containerd[1581]: 2025-09-09 05:33:20.938 [INFO][4319] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978804 containerd[1581]: 2025-09-09 05:33:20.940 [INFO][4319] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2 Sep 9 05:33:20.978804 containerd[1581]: 2025-09-09 05:33:20.945 [INFO][4319] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" 
host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978804 containerd[1581]: 2025-09-09 05:33:20.952 [INFO][4319] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.67/26] block=192.168.21.64/26 handle="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978804 containerd[1581]: 2025-09-09 05:33:20.952 [INFO][4319] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.67/26] handle="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:20.978804 containerd[1581]: 2025-09-09 05:33:20.953 [INFO][4319] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:20.978804 containerd[1581]: 2025-09-09 05:33:20.953 [INFO][4319] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.67/26] IPv6=[] ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" HandleID="k8s-pod-network.e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Workload="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" Sep 9 05:33:20.978953 containerd[1581]: 2025-09-09 05:33:20.956 [INFO][4284] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"74db9c19-3a76-4b16-941f-533dc3919843", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"goldmane-54d579b49d-gd4t2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali143759898e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:20.979005 containerd[1581]: 2025-09-09 05:33:20.956 [INFO][4284] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.67/32] ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" Sep 9 05:33:20.979005 containerd[1581]: 2025-09-09 05:33:20.956 [INFO][4284] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali143759898e4 ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" Sep 9 05:33:20.979005 containerd[1581]: 2025-09-09 05:33:20.962 [INFO][4284] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" 
WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" Sep 9 05:33:20.979097 containerd[1581]: 2025-09-09 05:33:20.962 [INFO][4284] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"74db9c19-3a76-4b16-941f-533dc3919843", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2", Pod:"goldmane-54d579b49d-gd4t2", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.21.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali143759898e4", MAC:"0e:cd:5f:f5:61:49", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 
05:33:20.979164 containerd[1581]: 2025-09-09 05:33:20.973 [INFO][4284] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" Namespace="calico-system" Pod="goldmane-54d579b49d-gd4t2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-goldmane--54d579b49d--gd4t2-eth0" Sep 9 05:33:21.002048 containerd[1581]: time="2025-09-09T05:33:21.001778120Z" level=info msg="connecting to shim e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2" address="unix:///run/containerd/s/7062fd463fdb3dbe67d6ef96bc583d6d46f21b6016ff6f610b3e7868a6078e70" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:21.024200 systemd[1]: Started cri-containerd-e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2.scope - libcontainer container e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2. Sep 9 05:33:21.071339 systemd-networkd[1472]: calie8410a9c0ca: Link UP Sep 9 05:33:21.074487 systemd-networkd[1472]: calie8410a9c0ca: Gained carrier Sep 9 05:33:21.094833 containerd[1581]: 2025-09-09 05:33:20.874 [INFO][4292] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0 csi-node-driver- calico-system c1606d3b-7ca4-49f1-8973-eb24dc83693b 732 0 2025-09-09 05:32:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc csi-node-driver-qrqmt eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie8410a9c0ca [] [] }} ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" 
WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-" Sep 9 05:33:21.094833 containerd[1581]: 2025-09-09 05:33:20.874 [INFO][4292] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" Sep 9 05:33:21.094833 containerd[1581]: 2025-09-09 05:33:20.919 [INFO][4325] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" HandleID="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Workload="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:20.919 [INFO][4325] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" HandleID="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Workload="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4452-0-0-n-de00512edc", "pod":"csi-node-driver-qrqmt", "timestamp":"2025-09-09 05:33:20.919184893 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:20.919 [INFO][4325] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:20.953 [INFO][4325] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:20.953 [INFO][4325] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:21.026 [INFO][4325] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:21.032 [INFO][4325] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:21.040 [INFO][4325] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:21.043 [INFO][4325] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095162 containerd[1581]: 2025-09-09 05:33:21.046 [INFO][4325] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095493 containerd[1581]: 2025-09-09 05:33:21.046 [INFO][4325] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095493 containerd[1581]: 2025-09-09 05:33:21.048 [INFO][4325] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1 Sep 9 05:33:21.095493 containerd[1581]: 2025-09-09 05:33:21.052 [INFO][4325] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095493 containerd[1581]: 2025-09-09 05:33:21.058 [INFO][4325] ipam/ipam.go 1256: Successfully claimed 
IPs: [192.168.21.68/26] block=192.168.21.64/26 handle="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095493 containerd[1581]: 2025-09-09 05:33:21.058 [INFO][4325] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.68/26] handle="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.095493 containerd[1581]: 2025-09-09 05:33:21.059 [INFO][4325] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:21.095493 containerd[1581]: 2025-09-09 05:33:21.060 [INFO][4325] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.68/26] IPv6=[] ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" HandleID="k8s-pod-network.e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Workload="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" Sep 9 05:33:21.096153 containerd[1581]: 2025-09-09 05:33:21.064 [INFO][4292] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c1606d3b-7ca4-49f1-8973-eb24dc83693b", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"csi-node-driver-qrqmt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie8410a9c0ca", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:21.096335 containerd[1581]: 2025-09-09 05:33:21.064 [INFO][4292] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.68/32] ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" Sep 9 05:33:21.096335 containerd[1581]: 2025-09-09 05:33:21.064 [INFO][4292] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8410a9c0ca ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" Sep 9 05:33:21.096335 containerd[1581]: 2025-09-09 05:33:21.074 [INFO][4292] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" Sep 9 05:33:21.096419 containerd[1581]: 2025-09-09 
05:33:21.074 [INFO][4292] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c1606d3b-7ca4-49f1-8973-eb24dc83693b", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1", Pod:"csi-node-driver-qrqmt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.21.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie8410a9c0ca", MAC:"5a:0b:3f:a1:64:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:21.096511 containerd[1581]: 2025-09-09 05:33:21.091 
[INFO][4292] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" Namespace="calico-system" Pod="csi-node-driver-qrqmt" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-csi--node--driver--qrqmt-eth0" Sep 9 05:33:21.116137 containerd[1581]: time="2025-09-09T05:33:21.116082323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-gd4t2,Uid:74db9c19-3a76-4b16-941f-533dc3919843,Namespace:calico-system,Attempt:0,} returns sandbox id \"e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2\"" Sep 9 05:33:21.137113 containerd[1581]: time="2025-09-09T05:33:21.137011790Z" level=info msg="connecting to shim e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1" address="unix:///run/containerd/s/4d8aab99556e505d14ed9f5a75a25e26a2009026794550561a2b36bb08d1e149" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:21.153520 systemd-networkd[1472]: califd515f9a270: Gained IPv6LL Sep 9 05:33:21.192306 systemd[1]: Started cri-containerd-e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1.scope - libcontainer container e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1. 
Sep 9 05:33:21.201599 systemd-networkd[1472]: califc1bbf96d7c: Link UP Sep 9 05:33:21.203166 systemd-networkd[1472]: califc1bbf96d7c: Gained carrier Sep 9 05:33:21.226776 containerd[1581]: 2025-09-09 05:33:20.881 [INFO][4298] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0 calico-apiserver-6b8c85d7cc- calico-apiserver 96451131-9615-4466-8637-9110e935815d 840 0 2025-09-09 05:32:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b8c85d7cc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc calico-apiserver-6b8c85d7cc-csjg4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califc1bbf96d7c [] [] }} ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-" Sep 9 05:33:21.226776 containerd[1581]: 2025-09-09 05:33:20.881 [INFO][4298] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:21.226776 containerd[1581]: 2025-09-09 05:33:20.921 [INFO][4327] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 
05:33:21.226980 containerd[1581]: 2025-09-09 05:33:20.921 [INFO][4327] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5730), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452-0-0-n-de00512edc", "pod":"calico-apiserver-6b8c85d7cc-csjg4", "timestamp":"2025-09-09 05:33:20.921468246 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:20.922 [INFO][4327] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:21.059 [INFO][4327] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:21.059 [INFO][4327] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:21.127 [INFO][4327] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:21.139 [INFO][4327] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:21.147 [INFO][4327] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:21.155 [INFO][4327] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.226980 containerd[1581]: 2025-09-09 05:33:21.169 [INFO][4327] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.227651 containerd[1581]: 2025-09-09 05:33:21.169 [INFO][4327] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.227651 containerd[1581]: 2025-09-09 05:33:21.172 [INFO][4327] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f Sep 9 05:33:21.227651 containerd[1581]: 2025-09-09 05:33:21.186 [INFO][4327] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.227651 containerd[1581]: 2025-09-09 05:33:21.193 [INFO][4327] ipam/ipam.go 1256: Successfully claimed 
IPs: [192.168.21.69/26] block=192.168.21.64/26 handle="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.227651 containerd[1581]: 2025-09-09 05:33:21.194 [INFO][4327] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.69/26] handle="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.227651 containerd[1581]: 2025-09-09 05:33:21.195 [INFO][4327] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:21.227651 containerd[1581]: 2025-09-09 05:33:21.195 [INFO][4327] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.69/26] IPv6=[] ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:21.227797 containerd[1581]: 2025-09-09 05:33:21.197 [INFO][4298] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0", GenerateName:"calico-apiserver-6b8c85d7cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"96451131-9615-4466-8637-9110e935815d", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6b8c85d7cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"calico-apiserver-6b8c85d7cc-csjg4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califc1bbf96d7c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:21.227855 containerd[1581]: 2025-09-09 05:33:21.198 [INFO][4298] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.69/32] ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:21.227855 containerd[1581]: 2025-09-09 05:33:21.198 [INFO][4298] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc1bbf96d7c ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:21.227855 containerd[1581]: 2025-09-09 05:33:21.203 [INFO][4298] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" 
WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:21.227943 containerd[1581]: 2025-09-09 05:33:21.204 [INFO][4298] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0", GenerateName:"calico-apiserver-6b8c85d7cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"96451131-9615-4466-8637-9110e935815d", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b8c85d7cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f", Pod:"calico-apiserver-6b8c85d7cc-csjg4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califc1bbf96d7c", MAC:"06:10:b0:5f:ec:34", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:21.227990 containerd[1581]: 2025-09-09 05:33:21.217 [INFO][4298] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-csjg4" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:21.256068 containerd[1581]: time="2025-09-09T05:33:21.254880282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-qrqmt,Uid:c1606d3b-7ca4-49f1-8973-eb24dc83693b,Namespace:calico-system,Attempt:0,} returns sandbox id \"e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1\"" Sep 9 05:33:21.265670 containerd[1581]: time="2025-09-09T05:33:21.265578330Z" level=info msg="connecting to shim bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" address="unix:///run/containerd/s/cb4492f114b86bc40033f9e871d0e045b2102b90cd1522612fe9b388ed59d479" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:21.291155 systemd[1]: Started cri-containerd-bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f.scope - libcontainer container bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f. 
Sep 9 05:33:21.347950 containerd[1581]: time="2025-09-09T05:33:21.347771758Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-csjg4,Uid:96451131-9615-4466-8637-9110e935815d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\"" Sep 9 05:33:21.767281 containerd[1581]: time="2025-09-09T05:33:21.767046973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6nz5j,Uid:319ebac8-412e-4d4a-90af-84d99ce90213,Namespace:kube-system,Attempt:0,}" Sep 9 05:33:21.885117 systemd-networkd[1472]: cali201afd8947d: Link UP Sep 9 05:33:21.886063 systemd-networkd[1472]: cali201afd8947d: Gained carrier Sep 9 05:33:21.914541 containerd[1581]: 2025-09-09 05:33:21.808 [INFO][4504] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0 coredns-668d6bf9bc- kube-system 319ebac8-412e-4d4a-90af-84d99ce90213 829 0 2025-09-09 05:32:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc coredns-668d6bf9bc-6nz5j eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali201afd8947d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-" Sep 9 05:33:21.914541 containerd[1581]: 2025-09-09 05:33:21.808 [INFO][4504] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" 
WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" Sep 9 05:33:21.914541 containerd[1581]: 2025-09-09 05:33:21.837 [INFO][4514] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" HandleID="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Workload="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.837 [INFO][4514] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" HandleID="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Workload="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f2b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452-0-0-n-de00512edc", "pod":"coredns-668d6bf9bc-6nz5j", "timestamp":"2025-09-09 05:33:21.836985703 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.837 [INFO][4514] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.837 [INFO][4514] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.837 [INFO][4514] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.843 [INFO][4514] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.849 [INFO][4514] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.855 [INFO][4514] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.857 [INFO][4514] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.914919 containerd[1581]: 2025-09-09 05:33:21.859 [INFO][4514] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.916354 containerd[1581]: 2025-09-09 05:33:21.859 [INFO][4514] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.916354 containerd[1581]: 2025-09-09 05:33:21.861 [INFO][4514] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab Sep 9 05:33:21.916354 containerd[1581]: 2025-09-09 05:33:21.867 [INFO][4514] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.916354 containerd[1581]: 2025-09-09 05:33:21.873 [INFO][4514] ipam/ipam.go 1256: Successfully claimed 
IPs: [192.168.21.70/26] block=192.168.21.64/26 handle="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.916354 containerd[1581]: 2025-09-09 05:33:21.873 [INFO][4514] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.70/26] handle="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:21.916354 containerd[1581]: 2025-09-09 05:33:21.873 [INFO][4514] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:21.916354 containerd[1581]: 2025-09-09 05:33:21.874 [INFO][4514] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.70/26] IPv6=[] ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" HandleID="k8s-pod-network.68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Workload="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" Sep 9 05:33:21.916475 containerd[1581]: 2025-09-09 05:33:21.877 [INFO][4504] cni-plugin/k8s.go 418: Populated endpoint ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"319ebac8-412e-4d4a-90af-84d99ce90213", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"coredns-668d6bf9bc-6nz5j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali201afd8947d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:21.916475 containerd[1581]: 2025-09-09 05:33:21.877 [INFO][4504] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.70/32] ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" Sep 9 05:33:21.916475 containerd[1581]: 2025-09-09 05:33:21.877 [INFO][4504] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali201afd8947d ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" Sep 9 05:33:21.916475 containerd[1581]: 2025-09-09 05:33:21.888 [INFO][4504] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" Sep 9 05:33:21.916475 containerd[1581]: 2025-09-09 05:33:21.889 [INFO][4504] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"319ebac8-412e-4d4a-90af-84d99ce90213", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab", Pod:"coredns-668d6bf9bc-6nz5j", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali201afd8947d", MAC:"be:bb:11:db:08:4f", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:21.916475 containerd[1581]: 2025-09-09 05:33:21.906 [INFO][4504] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" Namespace="kube-system" Pod="coredns-668d6bf9bc-6nz5j" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--6nz5j-eth0" Sep 9 05:33:21.947113 containerd[1581]: time="2025-09-09T05:33:21.946962245Z" level=info msg="connecting to shim 68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab" address="unix:///run/containerd/s/ca9cd1b76efc4c2f2fd73d4c48dc6fe4ccb8f430ab63e66087e5b02ad1601787" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:21.975155 systemd[1]: Started cri-containerd-68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab.scope - libcontainer container 68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab. 
Sep 9 05:33:22.030202 containerd[1581]: time="2025-09-09T05:33:22.030110661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-6nz5j,Uid:319ebac8-412e-4d4a-90af-84d99ce90213,Namespace:kube-system,Attempt:0,} returns sandbox id \"68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab\"" Sep 9 05:33:22.035939 containerd[1581]: time="2025-09-09T05:33:22.035694473Z" level=info msg="CreateContainer within sandbox \"68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:33:22.055860 containerd[1581]: time="2025-09-09T05:33:22.055715434Z" level=info msg="Container 4461c758d8ee9c9f7dc9a56a6bc0aae930dda4c6af8124e176d90bfe856265f1: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:22.063992 containerd[1581]: time="2025-09-09T05:33:22.063912646Z" level=info msg="CreateContainer within sandbox \"68fdb99dd5d9328bec41cc10ea094e53506fb0623277aa5de5ee45263b2eb0ab\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4461c758d8ee9c9f7dc9a56a6bc0aae930dda4c6af8124e176d90bfe856265f1\"" Sep 9 05:33:22.065078 containerd[1581]: time="2025-09-09T05:33:22.064993249Z" level=info msg="StartContainer for \"4461c758d8ee9c9f7dc9a56a6bc0aae930dda4c6af8124e176d90bfe856265f1\"" Sep 9 05:33:22.066043 containerd[1581]: time="2025-09-09T05:33:22.066009844Z" level=info msg="connecting to shim 4461c758d8ee9c9f7dc9a56a6bc0aae930dda4c6af8124e176d90bfe856265f1" address="unix:///run/containerd/s/ca9cd1b76efc4c2f2fd73d4c48dc6fe4ccb8f430ab63e66087e5b02ad1601787" protocol=ttrpc version=3 Sep 9 05:33:22.100233 systemd[1]: Started cri-containerd-4461c758d8ee9c9f7dc9a56a6bc0aae930dda4c6af8124e176d90bfe856265f1.scope - libcontainer container 4461c758d8ee9c9f7dc9a56a6bc0aae930dda4c6af8124e176d90bfe856265f1. 
Sep 9 05:33:22.162591 containerd[1581]: time="2025-09-09T05:33:22.153807047Z" level=info msg="StartContainer for \"4461c758d8ee9c9f7dc9a56a6bc0aae930dda4c6af8124e176d90bfe856265f1\" returns successfully" Sep 9 05:33:22.176542 systemd-networkd[1472]: calie8410a9c0ca: Gained IPv6LL Sep 9 05:33:22.239777 systemd-networkd[1472]: cali143759898e4: Gained IPv6LL Sep 9 05:33:22.471415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3175540997.mount: Deactivated successfully. Sep 9 05:33:22.491563 containerd[1581]: time="2025-09-09T05:33:22.491510807Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:22.492362 containerd[1581]: time="2025-09-09T05:33:22.492335173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 9 05:33:22.494256 containerd[1581]: time="2025-09-09T05:33:22.493416206Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:22.495376 containerd[1581]: time="2025-09-09T05:33:22.495357212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:22.495945 containerd[1581]: time="2025-09-09T05:33:22.495927085Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.080957378s" Sep 9 05:33:22.496380 containerd[1581]: 
time="2025-09-09T05:33:22.496044703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 9 05:33:22.497864 containerd[1581]: time="2025-09-09T05:33:22.497736186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 05:33:22.498687 containerd[1581]: time="2025-09-09T05:33:22.498662832Z" level=info msg="CreateContainer within sandbox \"a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 05:33:22.508772 containerd[1581]: time="2025-09-09T05:33:22.505797714Z" level=info msg="Container 8a8ffe8c3114802222bdd96cca76fdc2beaccff1636fbef3756bccd3cd2d982a: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:22.517689 containerd[1581]: time="2025-09-09T05:33:22.517654134Z" level=info msg="CreateContainer within sandbox \"a1e7397df13522bcc927f90bce945a2234af5eecc2b314497cbf309e4ce14ce2\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"8a8ffe8c3114802222bdd96cca76fdc2beaccff1636fbef3756bccd3cd2d982a\"" Sep 9 05:33:22.518699 containerd[1581]: time="2025-09-09T05:33:22.518573416Z" level=info msg="StartContainer for \"8a8ffe8c3114802222bdd96cca76fdc2beaccff1636fbef3756bccd3cd2d982a\"" Sep 9 05:33:22.520159 containerd[1581]: time="2025-09-09T05:33:22.520083300Z" level=info msg="connecting to shim 8a8ffe8c3114802222bdd96cca76fdc2beaccff1636fbef3756bccd3cd2d982a" address="unix:///run/containerd/s/c56a5b9b03d2d8a1e1d0feedb88d4534a52c7454831b106c9c28b2c564936963" protocol=ttrpc version=3 Sep 9 05:33:22.576658 systemd[1]: Started cri-containerd-8a8ffe8c3114802222bdd96cca76fdc2beaccff1636fbef3756bccd3cd2d982a.scope - libcontainer container 8a8ffe8c3114802222bdd96cca76fdc2beaccff1636fbef3756bccd3cd2d982a. 
Sep 9 05:33:22.624976 containerd[1581]: time="2025-09-09T05:33:22.624896885Z" level=info msg="StartContainer for \"8a8ffe8c3114802222bdd96cca76fdc2beaccff1636fbef3756bccd3cd2d982a\" returns successfully" Sep 9 05:33:22.751421 systemd-networkd[1472]: califc1bbf96d7c: Gained IPv6LL Sep 9 05:33:22.767782 containerd[1581]: time="2025-09-09T05:33:22.766783312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-mbm69,Uid:3afed2c6-cb7f-44a3-a525-6523bbddd214,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:33:22.768309 containerd[1581]: time="2025-09-09T05:33:22.767406823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dfd4dff7c-shntr,Uid:a438f529-ea8a-4969-a2a9-1d9ed1fd6a79,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:33:22.768650 containerd[1581]: time="2025-09-09T05:33:22.768590719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbfs2,Uid:02708d09-983e-47b5-b5d8-8af08b1e4a26,Namespace:kube-system,Attempt:0,}" Sep 9 05:33:22.948738 systemd-networkd[1472]: cali0eca24e9198: Link UP Sep 9 05:33:22.950778 systemd-networkd[1472]: cali0eca24e9198: Gained carrier Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.828 [INFO][4652] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0 calico-apiserver-6b8c85d7cc- calico-apiserver 3afed2c6-cb7f-44a3-a525-6523bbddd214 834 0 2025-09-09 05:32:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b8c85d7cc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc calico-apiserver-6b8c85d7cc-mbm69 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0eca24e9198 [] [] }} 
ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.836 [INFO][4652] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.883 [INFO][4688] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.884 [INFO][4688] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5230), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452-0-0-n-de00512edc", "pod":"calico-apiserver-6b8c85d7cc-mbm69", "timestamp":"2025-09-09 05:33:22.883862831 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.884 [INFO][4688] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.884 [INFO][4688] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.884 [INFO][4688] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.893 [INFO][4688] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.902 [INFO][4688] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.913 [INFO][4688] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.916 [INFO][4688] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.919 [INFO][4688] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.919 [INFO][4688] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.921 [INFO][4688] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1 Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.932 [INFO][4688] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 
handle="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.939 [INFO][4688] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.71/26] block=192.168.21.64/26 handle="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.940 [INFO][4688] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.71/26] handle="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.940 [INFO][4688] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:22.967080 containerd[1581]: 2025-09-09 05:33:22.940 [INFO][4688] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.71/26] IPv6=[] ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:22.969972 containerd[1581]: 2025-09-09 05:33:22.946 [INFO][4652] cni-plugin/k8s.go 418: Populated endpoint ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0", GenerateName:"calico-apiserver-6b8c85d7cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"3afed2c6-cb7f-44a3-a525-6523bbddd214", ResourceVersion:"834", Generation:0, 
CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b8c85d7cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"calico-apiserver-6b8c85d7cc-mbm69", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0eca24e9198", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:22.969972 containerd[1581]: 2025-09-09 05:33:22.946 [INFO][4652] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.71/32] ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:22.969972 containerd[1581]: 2025-09-09 05:33:22.946 [INFO][4652] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0eca24e9198 ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:22.969972 containerd[1581]: 2025-09-09 05:33:22.948 
[INFO][4652] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:22.969972 containerd[1581]: 2025-09-09 05:33:22.949 [INFO][4652] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0", GenerateName:"calico-apiserver-6b8c85d7cc-", Namespace:"calico-apiserver", SelfLink:"", UID:"3afed2c6-cb7f-44a3-a525-6523bbddd214", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b8c85d7cc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1", Pod:"calico-apiserver-6b8c85d7cc-mbm69", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.21.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0eca24e9198", MAC:"2e:3f:bd:41:73:b7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:22.969972 containerd[1581]: 2025-09-09 05:33:22.958 [INFO][4652] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Namespace="calico-apiserver" Pod="calico-apiserver-6b8c85d7cc-mbm69" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:23.028179 containerd[1581]: time="2025-09-09T05:33:23.028138315Z" level=info msg="connecting to shim 64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" address="unix:///run/containerd/s/69432951ff23bae6312f0721f13883956ad9cc9d1485d9dc902a3d7bbf315395" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:23.071681 systemd-networkd[1472]: cali201afd8947d: Gained IPv6LL Sep 9 05:33:23.073314 systemd[1]: Started cri-containerd-64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1.scope - libcontainer container 64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1. 
Sep 9 05:33:23.076652 kubelet[2758]: I0909 05:33:23.075337 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-684cb6f94d-n5znq" podStartSLOduration=1.98344802 podStartE2EDuration="7.069150367s" podCreationTimestamp="2025-09-09 05:33:16 +0000 UTC" firstStartedPulling="2025-09-09 05:33:17.411343483 +0000 UTC m=+38.754530999" lastFinishedPulling="2025-09-09 05:33:22.497045829 +0000 UTC m=+43.840233346" observedRunningTime="2025-09-09 05:33:23.041247342 +0000 UTC m=+44.384434858" watchObservedRunningTime="2025-09-09 05:33:23.069150367 +0000 UTC m=+44.412337883" Sep 9 05:33:23.107509 systemd-networkd[1472]: cali45dd8ccb3ba: Link UP Sep 9 05:33:23.107753 systemd-networkd[1472]: cali45dd8ccb3ba: Gained carrier Sep 9 05:33:23.131273 kubelet[2758]: I0909 05:33:23.131207 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-6nz5j" podStartSLOduration=38.131188688 podStartE2EDuration="38.131188688s" podCreationTimestamp="2025-09-09 05:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:33:23.077777862 +0000 UTC m=+44.420965379" watchObservedRunningTime="2025-09-09 05:33:23.131188688 +0000 UTC m=+44.474376204" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.849 [INFO][4657] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0 calico-apiserver-dfd4dff7c- calico-apiserver a438f529-ea8a-4969-a2a9-1d9ed1fd6a79 836 0 2025-09-09 05:32:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:dfd4dff7c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc 
calico-apiserver-dfd4dff7c-shntr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali45dd8ccb3ba [] [] }} ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.849 [INFO][4657] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.927 [INFO][4699] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" HandleID="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.927 [INFO][4699] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" HandleID="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c6ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452-0-0-n-de00512edc", "pod":"calico-apiserver-dfd4dff7c-shntr", "timestamp":"2025-09-09 05:33:22.9275749 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.927 [INFO][4699] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.940 [INFO][4699] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.940 [INFO][4699] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:22.996 [INFO][4699] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.004 [INFO][4699] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.018 [INFO][4699] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.026 [INFO][4699] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.031 [INFO][4699] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.031 [INFO][4699] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.041 [INFO][4699] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c Sep 9 05:33:23.134677 containerd[1581]: 
2025-09-09 05:33:23.060 [INFO][4699] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.074 [INFO][4699] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.72/26] block=192.168.21.64/26 handle="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.074 [INFO][4699] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.72/26] handle="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.074 [INFO][4699] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:23.134677 containerd[1581]: 2025-09-09 05:33:23.074 [INFO][4699] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.72/26] IPv6=[] ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" HandleID="k8s-pod-network.5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" Sep 9 05:33:23.136330 containerd[1581]: 2025-09-09 05:33:23.099 [INFO][4657] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0", GenerateName:"calico-apiserver-dfd4dff7c-", 
Namespace:"calico-apiserver", SelfLink:"", UID:"a438f529-ea8a-4969-a2a9-1d9ed1fd6a79", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dfd4dff7c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"calico-apiserver-dfd4dff7c-shntr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali45dd8ccb3ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:23.136330 containerd[1581]: 2025-09-09 05:33:23.102 [INFO][4657] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.72/32] ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" Sep 9 05:33:23.136330 containerd[1581]: 2025-09-09 05:33:23.102 [INFO][4657] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali45dd8ccb3ba ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" 
WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" Sep 9 05:33:23.136330 containerd[1581]: 2025-09-09 05:33:23.105 [INFO][4657] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" Sep 9 05:33:23.136330 containerd[1581]: 2025-09-09 05:33:23.106 [INFO][4657] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0", GenerateName:"calico-apiserver-dfd4dff7c-", Namespace:"calico-apiserver", SelfLink:"", UID:"a438f529-ea8a-4969-a2a9-1d9ed1fd6a79", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dfd4dff7c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", 
ContainerID:"5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c", Pod:"calico-apiserver-dfd4dff7c-shntr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali45dd8ccb3ba", MAC:"d2:97:54:12:f9:31", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:23.136330 containerd[1581]: 2025-09-09 05:33:23.132 [INFO][4657] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-shntr" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--shntr-eth0" Sep 9 05:33:23.157807 containerd[1581]: time="2025-09-09T05:33:23.157743480Z" level=info msg="connecting to shim 5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c" address="unix:///run/containerd/s/52d5829471dae52a9efc3fef185dbfbc5e826c1c9d8b2850d16fd1f816d8b0c1" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:23.160967 systemd-networkd[1472]: cali89298f3788e: Link UP Sep 9 05:33:23.161785 systemd-networkd[1472]: cali89298f3788e: Gained carrier Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:22.871 [INFO][4660] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0 coredns-668d6bf9bc- kube-system 02708d09-983e-47b5-b5d8-8af08b1e4a26 837 0 2025-09-09 05:32:45 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc coredns-668d6bf9bc-kbfs2 eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] cali89298f3788e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:22.871 [INFO][4660] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:22.944 [INFO][4706] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" HandleID="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Workload="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:22.945 [INFO][4706] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" HandleID="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Workload="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5230), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4452-0-0-n-de00512edc", "pod":"coredns-668d6bf9bc-kbfs2", "timestamp":"2025-09-09 05:33:22.944695817 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 
05:33:22.945 [INFO][4706] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.078 [INFO][4706] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.078 [INFO][4706] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.094 [INFO][4706] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.111 [INFO][4706] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.129 [INFO][4706] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.134 [INFO][4706] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.137 [INFO][4706] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.137 [INFO][4706] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.140 [INFO][4706] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.143 [INFO][4706] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 
handle="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.151 [INFO][4706] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.21.73/26] block=192.168.21.64/26 handle="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.151 [INFO][4706] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.73/26] handle="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.151 [INFO][4706] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:23.196261 containerd[1581]: 2025-09-09 05:33:23.151 [INFO][4706] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.73/26] IPv6=[] ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" HandleID="k8s-pod-network.ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Workload="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" Sep 9 05:33:23.197441 containerd[1581]: 2025-09-09 05:33:23.155 [INFO][4660] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"02708d09-983e-47b5-b5d8-8af08b1e4a26", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 45, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"coredns-668d6bf9bc-kbfs2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.21.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali89298f3788e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:23.197441 containerd[1581]: 2025-09-09 05:33:23.155 [INFO][4660] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.73/32] ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" Sep 9 05:33:23.197441 containerd[1581]: 2025-09-09 05:33:23.155 [INFO][4660] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali89298f3788e ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" Sep 9 05:33:23.197441 containerd[1581]: 2025-09-09 05:33:23.170 [INFO][4660] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" Sep 9 05:33:23.197441 containerd[1581]: 2025-09-09 05:33:23.173 [INFO][4660] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"02708d09-983e-47b5-b5d8-8af08b1e4a26", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 32, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e", Pod:"coredns-668d6bf9bc-kbfs2", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.21.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali89298f3788e", MAC:"56:5e:8e:dc:37:92", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:23.197441 containerd[1581]: 2025-09-09 05:33:23.191 [INFO][4660] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" Namespace="kube-system" Pod="coredns-668d6bf9bc-kbfs2" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-coredns--668d6bf9bc--kbfs2-eth0" Sep 9 05:33:23.200500 systemd[1]: Started cri-containerd-5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c.scope - libcontainer container 5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c. 
Sep 9 05:33:23.230094 containerd[1581]: time="2025-09-09T05:33:23.230053781Z" level=info msg="connecting to shim ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e" address="unix:///run/containerd/s/c92b0ea1776cc87051a34deae4c1c57e8492455f8e969fc7ed2be95641a557ac" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:23.280225 containerd[1581]: time="2025-09-09T05:33:23.279985833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b8c85d7cc-mbm69,Uid:3afed2c6-cb7f-44a3-a525-6523bbddd214,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\"" Sep 9 05:33:23.287241 systemd[1]: Started cri-containerd-ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e.scope - libcontainer container ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e. Sep 9 05:33:23.303668 containerd[1581]: time="2025-09-09T05:33:23.303291871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dfd4dff7c-shntr,Uid:a438f529-ea8a-4969-a2a9-1d9ed1fd6a79,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c\"" Sep 9 05:33:23.345287 containerd[1581]: time="2025-09-09T05:33:23.345196445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-kbfs2,Uid:02708d09-983e-47b5-b5d8-8af08b1e4a26,Namespace:kube-system,Attempt:0,} returns sandbox id \"ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e\"" Sep 9 05:33:23.347735 containerd[1581]: time="2025-09-09T05:33:23.347687668Z" level=info msg="CreateContainer within sandbox \"ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 05:33:23.357059 containerd[1581]: time="2025-09-09T05:33:23.357006953Z" level=info msg="Container f5063db71097409f49315f1eb083373958f3589d7852afa1b82ce0d7e3b0bff2: CDI devices from CRI 
Config.CDIDevices: []" Sep 9 05:33:23.360889 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2093669937.mount: Deactivated successfully. Sep 9 05:33:23.362909 containerd[1581]: time="2025-09-09T05:33:23.362878753Z" level=info msg="CreateContainer within sandbox \"ebc56648261ef1af37336514a0577147d08e3c123252cfdb3b99ecef15c8d94e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f5063db71097409f49315f1eb083373958f3589d7852afa1b82ce0d7e3b0bff2\"" Sep 9 05:33:23.363497 containerd[1581]: time="2025-09-09T05:33:23.363475205Z" level=info msg="StartContainer for \"f5063db71097409f49315f1eb083373958f3589d7852afa1b82ce0d7e3b0bff2\"" Sep 9 05:33:23.364344 containerd[1581]: time="2025-09-09T05:33:23.364318026Z" level=info msg="connecting to shim f5063db71097409f49315f1eb083373958f3589d7852afa1b82ce0d7e3b0bff2" address="unix:///run/containerd/s/c92b0ea1776cc87051a34deae4c1c57e8492455f8e969fc7ed2be95641a557ac" protocol=ttrpc version=3 Sep 9 05:33:23.378127 systemd[1]: Started cri-containerd-f5063db71097409f49315f1eb083373958f3589d7852afa1b82ce0d7e3b0bff2.scope - libcontainer container f5063db71097409f49315f1eb083373958f3589d7852afa1b82ce0d7e3b0bff2. 
Sep 9 05:33:23.402657 containerd[1581]: time="2025-09-09T05:33:23.402208490Z" level=info msg="StartContainer for \"f5063db71097409f49315f1eb083373958f3589d7852afa1b82ce0d7e3b0bff2\" returns successfully" Sep 9 05:33:24.009826 kubelet[2758]: I0909 05:33:24.009761 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-kbfs2" podStartSLOduration=39.009740745 podStartE2EDuration="39.009740745s" podCreationTimestamp="2025-09-09 05:32:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:33:23.994135194 +0000 UTC m=+45.337322720" watchObservedRunningTime="2025-09-09 05:33:24.009740745 +0000 UTC m=+45.352928271" Sep 9 05:33:24.479440 systemd-networkd[1472]: cali89298f3788e: Gained IPv6LL Sep 9 05:33:24.543357 systemd-networkd[1472]: cali0eca24e9198: Gained IPv6LL Sep 9 05:33:24.580145 kubelet[2758]: I0909 05:33:24.580072 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:24.811668 containerd[1581]: time="2025-09-09T05:33:24.811459281Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\" id:\"4a24b52bc9856fc5c80b6f9c33a83f542a8e754ea4fa8d4fce4c459a22330779\" pid:4926 exited_at:{seconds:1757396004 nanos:789157549}" Sep 9 05:33:24.928394 systemd-networkd[1472]: cali45dd8ccb3ba: Gained IPv6LL Sep 9 05:33:24.931837 containerd[1581]: time="2025-09-09T05:33:24.931786913Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\" id:\"dbd3f02f47fbc25a84c1dd4c866790617e19b8d3c2a8bdcb37feb1055502af0b\" pid:4952 exited_at:{seconds:1757396004 nanos:930180920}" Sep 9 05:33:26.475858 containerd[1581]: time="2025-09-09T05:33:26.475787724Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:26.479506 containerd[1581]: time="2025-09-09T05:33:26.476911811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746" Sep 9 05:33:26.479506 containerd[1581]: time="2025-09-09T05:33:26.478522303Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:26.481668 containerd[1581]: time="2025-09-09T05:33:26.480895487Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:26.481668 containerd[1581]: time="2025-09-09T05:33:26.481502828Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 3.983741838s" Sep 9 05:33:26.481668 containerd[1581]: time="2025-09-09T05:33:26.481553804Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\"" Sep 9 05:33:26.483211 containerd[1581]: time="2025-09-09T05:33:26.483189344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 9 05:33:26.522624 containerd[1581]: time="2025-09-09T05:33:26.522577591Z" level=info msg="CreateContainer within sandbox \"7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 9 05:33:26.533072 containerd[1581]: 
time="2025-09-09T05:33:26.531106552Z" level=info msg="Container d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:26.542855 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1398205433.mount: Deactivated successfully. Sep 9 05:33:26.552133 containerd[1581]: time="2025-09-09T05:33:26.552098955Z" level=info msg="CreateContainer within sandbox \"7bbbc5181c1fa82bfd1545cd37bd0a9cc0f33a833ef58155bf81a12d07d43304\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\"" Sep 9 05:33:26.553304 containerd[1581]: time="2025-09-09T05:33:26.553278414Z" level=info msg="StartContainer for \"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\"" Sep 9 05:33:26.554983 containerd[1581]: time="2025-09-09T05:33:26.554943127Z" level=info msg="connecting to shim d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210" address="unix:///run/containerd/s/9b25593bdbc0f43a8a123ccd29e8116b14f55c294273f545d24fc987c86080a7" protocol=ttrpc version=3 Sep 9 05:33:26.586402 systemd[1]: Started cri-containerd-d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210.scope - libcontainer container d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210. 
Sep 9 05:33:26.656426 containerd[1581]: time="2025-09-09T05:33:26.656380690Z" level=info msg="StartContainer for \"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" returns successfully" Sep 9 05:33:27.055244 kubelet[2758]: I0909 05:33:27.053409 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7cf8d45fc7-g66ps" podStartSLOduration=22.619233443 podStartE2EDuration="29.053371519s" podCreationTimestamp="2025-09-09 05:32:58 +0000 UTC" firstStartedPulling="2025-09-09 05:33:20.048474662 +0000 UTC m=+41.391662178" lastFinishedPulling="2025-09-09 05:33:26.482612738 +0000 UTC m=+47.825800254" observedRunningTime="2025-09-09 05:33:27.051751759 +0000 UTC m=+48.394939385" watchObservedRunningTime="2025-09-09 05:33:27.053371519 +0000 UTC m=+48.396559076" Sep 9 05:33:27.082328 containerd[1581]: time="2025-09-09T05:33:27.082246355Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" id:\"5fed59d7d3241affcac7512e44a90b24db44bf5185b42d96ad6cfb21fc6516f3\" pid:5021 exited_at:{seconds:1757396007 nanos:81286156}" Sep 9 05:33:29.815690 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount843783918.mount: Deactivated successfully. 
Sep 9 05:33:30.476841 containerd[1581]: time="2025-09-09T05:33:30.476658012Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:30.479909 containerd[1581]: time="2025-09-09T05:33:30.479866918Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 9 05:33:30.480358 containerd[1581]: time="2025-09-09T05:33:30.480341462Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:30.482781 containerd[1581]: time="2025-09-09T05:33:30.482762849Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:30.483217 containerd[1581]: time="2025-09-09T05:33:30.483150683Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.999835535s" Sep 9 05:33:30.483217 containerd[1581]: time="2025-09-09T05:33:30.483179847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 9 05:33:30.484835 containerd[1581]: time="2025-09-09T05:33:30.484810549Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 9 05:33:30.488105 containerd[1581]: time="2025-09-09T05:33:30.488072894Z" level=info msg="CreateContainer within sandbox \"e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2\" for 
container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 9 05:33:30.506049 containerd[1581]: time="2025-09-09T05:33:30.502828096Z" level=info msg="Container 566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:30.530685 containerd[1581]: time="2025-09-09T05:33:30.530566243Z" level=info msg="CreateContainer within sandbox \"e52ac66b0ff2090d806d9007f015eb07e971330ce3d59e6e82a696eb7d6e00a2\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\"" Sep 9 05:33:30.532276 containerd[1581]: time="2025-09-09T05:33:30.532182608Z" level=info msg="StartContainer for \"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\"" Sep 9 05:33:30.534327 containerd[1581]: time="2025-09-09T05:33:30.534303774Z" level=info msg="connecting to shim 566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07" address="unix:///run/containerd/s/7062fd463fdb3dbe67d6ef96bc583d6d46f21b6016ff6f610b3e7868a6078e70" protocol=ttrpc version=3 Sep 9 05:33:30.569291 systemd[1]: Started cri-containerd-566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07.scope - libcontainer container 566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07. 
Sep 9 05:33:30.664993 containerd[1581]: time="2025-09-09T05:33:30.664929188Z" level=info msg="StartContainer for \"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" returns successfully" Sep 9 05:33:31.090930 kubelet[2758]: I0909 05:33:31.090407 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-gd4t2" podStartSLOduration=24.715567351 podStartE2EDuration="34.080628324s" podCreationTimestamp="2025-09-09 05:32:57 +0000 UTC" firstStartedPulling="2025-09-09 05:33:21.119382239 +0000 UTC m=+42.462569756" lastFinishedPulling="2025-09-09 05:33:30.484443213 +0000 UTC m=+51.827630729" observedRunningTime="2025-09-09 05:33:31.079090046 +0000 UTC m=+52.422277591" watchObservedRunningTime="2025-09-09 05:33:31.080628324 +0000 UTC m=+52.423815850" Sep 9 05:33:31.290200 containerd[1581]: time="2025-09-09T05:33:31.290139106Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" id:\"9996eeec89c28bbe926bf5329785dc02e5db856f6aaf4a09d5d8f94b8164a3ac\" pid:5101 exited_at:{seconds:1757396011 nanos:279488167}" Sep 9 05:33:32.477894 containerd[1581]: time="2025-09-09T05:33:32.476180526Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:32.478727 containerd[1581]: time="2025-09-09T05:33:32.478247823Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 9 05:33:32.478941 containerd[1581]: time="2025-09-09T05:33:32.478923594Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:32.481846 containerd[1581]: time="2025-09-09T05:33:32.481811782Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:32.484630 containerd[1581]: time="2025-09-09T05:33:32.484524564Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 1.999603167s" Sep 9 05:33:32.484630 containerd[1581]: time="2025-09-09T05:33:32.484550241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 9 05:33:32.508120 containerd[1581]: time="2025-09-09T05:33:32.508102291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:33:32.512585 containerd[1581]: time="2025-09-09T05:33:32.512567132Z" level=info msg="CreateContainer within sandbox \"e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 9 05:33:32.536830 containerd[1581]: time="2025-09-09T05:33:32.536673005Z" level=info msg="Container 0b661f5bd24561c37d1283e5126ebec4c9503a719130c6885435660a68cacd2e: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:32.581669 containerd[1581]: time="2025-09-09T05:33:32.581634929Z" level=info msg="CreateContainer within sandbox \"e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"0b661f5bd24561c37d1283e5126ebec4c9503a719130c6885435660a68cacd2e\"" Sep 9 05:33:32.582487 containerd[1581]: time="2025-09-09T05:33:32.582468294Z" level=info msg="StartContainer for \"0b661f5bd24561c37d1283e5126ebec4c9503a719130c6885435660a68cacd2e\"" Sep 9 
05:33:32.584124 containerd[1581]: time="2025-09-09T05:33:32.583929029Z" level=info msg="connecting to shim 0b661f5bd24561c37d1283e5126ebec4c9503a719130c6885435660a68cacd2e" address="unix:///run/containerd/s/4d8aab99556e505d14ed9f5a75a25e26a2009026794550561a2b36bb08d1e149" protocol=ttrpc version=3 Sep 9 05:33:32.610501 systemd[1]: Started cri-containerd-0b661f5bd24561c37d1283e5126ebec4c9503a719130c6885435660a68cacd2e.scope - libcontainer container 0b661f5bd24561c37d1283e5126ebec4c9503a719130c6885435660a68cacd2e. Sep 9 05:33:32.729056 containerd[1581]: time="2025-09-09T05:33:32.727991582Z" level=info msg="StartContainer for \"0b661f5bd24561c37d1283e5126ebec4c9503a719130c6885435660a68cacd2e\" returns successfully" Sep 9 05:33:36.110709 containerd[1581]: time="2025-09-09T05:33:36.110651804Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:36.111619 containerd[1581]: time="2025-09-09T05:33:36.111582791Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864" Sep 9 05:33:36.117770 containerd[1581]: time="2025-09-09T05:33:36.117732120Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:36.119400 containerd[1581]: time="2025-09-09T05:33:36.119359737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:36.120056 containerd[1581]: time="2025-09-09T05:33:36.119715591Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 3.61149562s" Sep 9 05:33:36.120056 containerd[1581]: time="2025-09-09T05:33:36.119740057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:33:36.120816 containerd[1581]: time="2025-09-09T05:33:36.120622725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:33:36.123073 containerd[1581]: time="2025-09-09T05:33:36.123046359Z" level=info msg="CreateContainer within sandbox \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:33:36.130053 containerd[1581]: time="2025-09-09T05:33:36.128578244Z" level=info msg="Container df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:36.156254 containerd[1581]: time="2025-09-09T05:33:36.156161207Z" level=info msg="CreateContainer within sandbox \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\"" Sep 9 05:33:36.157776 containerd[1581]: time="2025-09-09T05:33:36.157701802Z" level=info msg="StartContainer for \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\"" Sep 9 05:33:36.159670 containerd[1581]: time="2025-09-09T05:33:36.159622287Z" level=info msg="connecting to shim df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e" address="unix:///run/containerd/s/cb4492f114b86bc40033f9e871d0e045b2102b90cd1522612fe9b388ed59d479" protocol=ttrpc version=3 Sep 9 05:33:36.192206 systemd[1]: Started cri-containerd-df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e.scope - libcontainer 
container df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e. Sep 9 05:33:36.248120 containerd[1581]: time="2025-09-09T05:33:36.248083430Z" level=info msg="StartContainer for \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" returns successfully" Sep 9 05:33:36.623202 containerd[1581]: time="2025-09-09T05:33:36.622505105Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:36.623815 containerd[1581]: time="2025-09-09T05:33:36.623783501Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 05:33:36.653644 containerd[1581]: time="2025-09-09T05:33:36.653594012Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 532.936192ms" Sep 9 05:33:36.653644 containerd[1581]: time="2025-09-09T05:33:36.653640630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:33:36.656063 containerd[1581]: time="2025-09-09T05:33:36.655911818Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 9 05:33:36.663321 containerd[1581]: time="2025-09-09T05:33:36.663068045Z" level=info msg="CreateContainer within sandbox \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:33:36.684597 containerd[1581]: time="2025-09-09T05:33:36.684562234Z" level=info msg="Container 3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf: CDI devices from CRI 
Config.CDIDevices: []" Sep 9 05:33:36.711370 containerd[1581]: time="2025-09-09T05:33:36.711326941Z" level=info msg="CreateContainer within sandbox \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\"" Sep 9 05:33:36.712359 containerd[1581]: time="2025-09-09T05:33:36.712331075Z" level=info msg="StartContainer for \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\"" Sep 9 05:33:36.713688 containerd[1581]: time="2025-09-09T05:33:36.713666838Z" level=info msg="connecting to shim 3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf" address="unix:///run/containerd/s/69432951ff23bae6312f0721f13883956ad9cc9d1485d9dc902a3d7bbf315395" protocol=ttrpc version=3 Sep 9 05:33:36.786134 systemd[1]: Started cri-containerd-3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf.scope - libcontainer container 3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf. 
Sep 9 05:33:36.869577 containerd[1581]: time="2025-09-09T05:33:36.869518072Z" level=info msg="StartContainer for \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" returns successfully" Sep 9 05:33:37.140730 containerd[1581]: time="2025-09-09T05:33:37.140064443Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:37.142086 containerd[1581]: time="2025-09-09T05:33:37.142069004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 9 05:33:37.143501 containerd[1581]: time="2025-09-09T05:33:37.143480901Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 487.165769ms" Sep 9 05:33:37.143604 containerd[1581]: time="2025-09-09T05:33:37.143592188Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\"" Sep 9 05:33:37.147854 containerd[1581]: time="2025-09-09T05:33:37.147388864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 9 05:33:37.148077 containerd[1581]: time="2025-09-09T05:33:37.148059287Z" level=info msg="CreateContainer within sandbox \"5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:33:37.157253 containerd[1581]: time="2025-09-09T05:33:37.157226298Z" level=info msg="Container b55161fba1b31c4bb82d0ab9e9a8f4ff94f092528039581c6837e16c3f352b6c: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:37.160967 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1449108846.mount: Deactivated successfully. Sep 9 05:33:37.169505 kubelet[2758]: I0909 05:33:37.169394 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-csjg4" podStartSLOduration=28.395530949 podStartE2EDuration="43.1668915s" podCreationTimestamp="2025-09-09 05:32:54 +0000 UTC" firstStartedPulling="2025-09-09 05:33:21.349182376 +0000 UTC m=+42.692369891" lastFinishedPulling="2025-09-09 05:33:36.120542926 +0000 UTC m=+57.463730442" observedRunningTime="2025-09-09 05:33:37.139427013 +0000 UTC m=+58.482614528" watchObservedRunningTime="2025-09-09 05:33:37.1668915 +0000 UTC m=+58.510079016" Sep 9 05:33:37.170882 kubelet[2758]: I0909 05:33:37.170649 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b8c85d7cc-mbm69" podStartSLOduration=29.798708247 podStartE2EDuration="43.170634798s" podCreationTimestamp="2025-09-09 05:32:54 +0000 UTC" firstStartedPulling="2025-09-09 05:33:23.283475736 +0000 UTC m=+44.626663252" lastFinishedPulling="2025-09-09 05:33:36.655402287 +0000 UTC m=+57.998589803" observedRunningTime="2025-09-09 05:33:37.169676158 +0000 UTC m=+58.512863673" watchObservedRunningTime="2025-09-09 05:33:37.170634798 +0000 UTC m=+58.513822314" Sep 9 05:33:37.174082 containerd[1581]: time="2025-09-09T05:33:37.174049371Z" level=info msg="CreateContainer within sandbox \"5dafeabe27e9bfeb6dbab63ece47c12bb007d2a2e3952bbd357b4d1f9419c33c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b55161fba1b31c4bb82d0ab9e9a8f4ff94f092528039581c6837e16c3f352b6c\"" Sep 9 05:33:37.175811 containerd[1581]: time="2025-09-09T05:33:37.175669435Z" level=info msg="StartContainer for \"b55161fba1b31c4bb82d0ab9e9a8f4ff94f092528039581c6837e16c3f352b6c\"" Sep 9 05:33:37.178854 containerd[1581]: time="2025-09-09T05:33:37.178835336Z" level=info msg="connecting to shim 
b55161fba1b31c4bb82d0ab9e9a8f4ff94f092528039581c6837e16c3f352b6c" address="unix:///run/containerd/s/52d5829471dae52a9efc3fef185dbfbc5e826c1c9d8b2850d16fd1f816d8b0c1" protocol=ttrpc version=3 Sep 9 05:33:37.212672 systemd[1]: Started cri-containerd-b55161fba1b31c4bb82d0ab9e9a8f4ff94f092528039581c6837e16c3f352b6c.scope - libcontainer container b55161fba1b31c4bb82d0ab9e9a8f4ff94f092528039581c6837e16c3f352b6c. Sep 9 05:33:37.390032 containerd[1581]: time="2025-09-09T05:33:37.389239503Z" level=info msg="StartContainer for \"b55161fba1b31c4bb82d0ab9e9a8f4ff94f092528039581c6837e16c3f352b6c\" returns successfully" Sep 9 05:33:38.111625 kubelet[2758]: I0909 05:33:38.111545 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:39.109288 kubelet[2758]: I0909 05:33:39.108824 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:39.110553 kubelet[2758]: I0909 05:33:39.109914 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:39.551044 containerd[1581]: time="2025-09-09T05:33:39.550709118Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:39.553376 containerd[1581]: time="2025-09-09T05:33:39.553336865Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542" Sep 9 05:33:39.554511 containerd[1581]: time="2025-09-09T05:33:39.554472966Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 05:33:39.564816 containerd[1581]: time="2025-09-09T05:33:39.564719195Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 9 05:33:39.566940 containerd[1581]: time="2025-09-09T05:33:39.566905768Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 2.419490886s" Sep 9 05:33:39.567137 containerd[1581]: time="2025-09-09T05:33:39.567102495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\"" Sep 9 05:33:39.585878 containerd[1581]: time="2025-09-09T05:33:39.585772724Z" level=info msg="CreateContainer within sandbox \"e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 9 05:33:39.594969 containerd[1581]: time="2025-09-09T05:33:39.594559249Z" level=info msg="Container c770516f947076132880f0c8034cbb8e602a0c36fa97990f0641d2182cc08ab0: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:39.605293 containerd[1581]: time="2025-09-09T05:33:39.605261831Z" level=info msg="CreateContainer within sandbox \"e4d494180bd69107309fe04a170d35d1dbe35da97699ca7a91fd7acfd558d1f1\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c770516f947076132880f0c8034cbb8e602a0c36fa97990f0641d2182cc08ab0\"" Sep 9 05:33:39.606246 containerd[1581]: time="2025-09-09T05:33:39.606229607Z" level=info msg="StartContainer for \"c770516f947076132880f0c8034cbb8e602a0c36fa97990f0641d2182cc08ab0\"" Sep 9 05:33:39.607978 containerd[1581]: time="2025-09-09T05:33:39.607934030Z" level=info msg="connecting to shim c770516f947076132880f0c8034cbb8e602a0c36fa97990f0641d2182cc08ab0" 
address="unix:///run/containerd/s/4d8aab99556e505d14ed9f5a75a25e26a2009026794550561a2b36bb08d1e149" protocol=ttrpc version=3 Sep 9 05:33:39.629711 systemd[1]: Started cri-containerd-c770516f947076132880f0c8034cbb8e602a0c36fa97990f0641d2182cc08ab0.scope - libcontainer container c770516f947076132880f0c8034cbb8e602a0c36fa97990f0641d2182cc08ab0. Sep 9 05:33:39.811892 containerd[1581]: time="2025-09-09T05:33:39.810112336Z" level=info msg="StartContainer for \"c770516f947076132880f0c8034cbb8e602a0c36fa97990f0641d2182cc08ab0\" returns successfully" Sep 9 05:33:40.106998 kubelet[2758]: I0909 05:33:40.106561 2758 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 9 05:33:40.112816 kubelet[2758]: I0909 05:33:40.112224 2758 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 9 05:33:40.318378 kubelet[2758]: I0909 05:33:40.318304 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-dfd4dff7c-shntr" podStartSLOduration=31.482881208 podStartE2EDuration="45.318287608s" podCreationTimestamp="2025-09-09 05:32:55 +0000 UTC" firstStartedPulling="2025-09-09 05:33:23.308958679 +0000 UTC m=+44.652146196" lastFinishedPulling="2025-09-09 05:33:37.14436508 +0000 UTC m=+58.487552596" observedRunningTime="2025-09-09 05:33:38.125718519 +0000 UTC m=+59.468906036" watchObservedRunningTime="2025-09-09 05:33:40.318287608 +0000 UTC m=+61.661475124" Sep 9 05:33:40.336718 kubelet[2758]: I0909 05:33:40.336662 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-qrqmt" podStartSLOduration=25.013375725 podStartE2EDuration="43.336646138s" podCreationTimestamp="2025-09-09 05:32:57 +0000 UTC" firstStartedPulling="2025-09-09 05:33:21.259885525 +0000 UTC m=+42.603073041" 
lastFinishedPulling="2025-09-09 05:33:39.583155938 +0000 UTC m=+60.926343454" observedRunningTime="2025-09-09 05:33:40.319144729 +0000 UTC m=+61.662332245" watchObservedRunningTime="2025-09-09 05:33:40.336646138 +0000 UTC m=+61.679833654" Sep 9 05:33:49.651044 kubelet[2758]: I0909 05:33:49.649305 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:49.716566 kubelet[2758]: I0909 05:33:49.716512 2758 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 9 05:33:49.754429 containerd[1581]: time="2025-09-09T05:33:49.753010438Z" level=info msg="StopContainer for \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" with timeout 30 (s)" Sep 9 05:33:49.785080 containerd[1581]: time="2025-09-09T05:33:49.785037954Z" level=info msg="Stop container \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" with signal terminated" Sep 9 05:33:49.803680 containerd[1581]: time="2025-09-09T05:33:49.803649680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" id:\"58996a49a7e73f97c2038db5f145029598c1709823b9e6a7589fd5c11e6134c4\" pid:5337 exited_at:{seconds:1757396029 nanos:787194704}" Sep 9 05:33:49.846965 systemd[1]: cri-containerd-3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf.scope: Deactivated successfully. 
Sep 9 05:33:49.852944 containerd[1581]: time="2025-09-09T05:33:49.852719370Z" level=info msg="received exit event container_id:\"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" id:\"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" pid:5208 exit_status:1 exited_at:{seconds:1757396029 nanos:852183839}" Sep 9 05:33:49.852944 containerd[1581]: time="2025-09-09T05:33:49.852923791Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" id:\"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" pid:5208 exit_status:1 exited_at:{seconds:1757396029 nanos:852183839}" Sep 9 05:33:49.888374 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf-rootfs.mount: Deactivated successfully. Sep 9 05:33:49.950657 systemd[1]: Created slice kubepods-besteffort-podf0e8cdb2_d693_4353_858a_b0bebfbd56e0.slice - libcontainer container kubepods-besteffort-podf0e8cdb2_d693_4353_858a_b0bebfbd56e0.slice. 
Sep 9 05:33:49.952887 containerd[1581]: time="2025-09-09T05:33:49.949873087Z" level=info msg="StopContainer for \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" returns successfully" Sep 9 05:33:49.972050 containerd[1581]: time="2025-09-09T05:33:49.971344387Z" level=info msg="StopPodSandbox for \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\"" Sep 9 05:33:49.982354 kubelet[2758]: I0909 05:33:49.982329 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f0e8cdb2-d693-4353-858a-b0bebfbd56e0-calico-apiserver-certs\") pod \"calico-apiserver-dfd4dff7c-p5zbx\" (UID: \"f0e8cdb2-d693-4353-858a-b0bebfbd56e0\") " pod="calico-apiserver/calico-apiserver-dfd4dff7c-p5zbx" Sep 9 05:33:49.983623 kubelet[2758]: I0909 05:33:49.982505 2758 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdb6x\" (UniqueName: \"kubernetes.io/projected/f0e8cdb2-d693-4353-858a-b0bebfbd56e0-kube-api-access-hdb6x\") pod \"calico-apiserver-dfd4dff7c-p5zbx\" (UID: \"f0e8cdb2-d693-4353-858a-b0bebfbd56e0\") " pod="calico-apiserver/calico-apiserver-dfd4dff7c-p5zbx" Sep 9 05:33:49.990235 containerd[1581]: time="2025-09-09T05:33:49.990132584Z" level=info msg="Container to stop \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 9 05:33:50.002607 systemd[1]: cri-containerd-64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1.scope: Deactivated successfully. 
Sep 9 05:33:50.011666 containerd[1581]: time="2025-09-09T05:33:50.011625266Z" level=info msg="TaskExit event in podsandbox handler container_id:\"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" id:\"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" pid:4758 exit_status:137 exited_at:{seconds:1757396030 nanos:11332949}" Sep 9 05:33:50.047373 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1-rootfs.mount: Deactivated successfully. Sep 9 05:33:50.063126 containerd[1581]: time="2025-09-09T05:33:50.062982976Z" level=info msg="shim disconnected" id=64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1 namespace=k8s.io Sep 9 05:33:50.063126 containerd[1581]: time="2025-09-09T05:33:50.063064680Z" level=warning msg="cleaning up after shim disconnected" id=64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1 namespace=k8s.io Sep 9 05:33:50.066614 containerd[1581]: time="2025-09-09T05:33:50.063292896Z" level=error msg="failed sending message on channel" error="write unix /run/containerd/s/69432951ff23bae6312f0721f13883956ad9cc9d1485d9dc902a3d7bbf315395->@: write: broken pipe" runtime=io.containerd.runc.v2 Sep 9 05:33:50.077215 containerd[1581]: time="2025-09-09T05:33:50.063071953Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 9 05:33:50.223649 containerd[1581]: time="2025-09-09T05:33:50.223069968Z" level=info msg="received exit event sandbox_id:\"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" exit_status:137 exited_at:{seconds:1757396030 nanos:11332949}" Sep 9 05:33:50.225416 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1-shm.mount: Deactivated successfully. 
Sep 9 05:33:50.271122 containerd[1581]: time="2025-09-09T05:33:50.271056220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dfd4dff7c-p5zbx,Uid:f0e8cdb2-d693-4353-858a-b0bebfbd56e0,Namespace:calico-apiserver,Attempt:0,}" Sep 9 05:33:50.335148 kubelet[2758]: I0909 05:33:50.335123 2758 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Sep 9 05:33:50.491775 systemd-networkd[1472]: cali0eca24e9198: Link DOWN Sep 9 05:33:50.495368 systemd-networkd[1472]: cali0eca24e9198: Lost carrier Sep 9 05:33:50.736923 systemd-networkd[1472]: cali1f6a9153fb3: Link UP Sep 9 05:33:50.740219 systemd-networkd[1472]: cali1f6a9153fb3: Gained carrier Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.470 [INFO][5422] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0 calico-apiserver-dfd4dff7c- calico-apiserver f0e8cdb2-d693-4353-858a-b0bebfbd56e0 1146 0 2025-09-09 05:33:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:dfd4dff7c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4452-0-0-n-de00512edc calico-apiserver-dfd4dff7c-p5zbx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1f6a9153fb3 [] [] }} ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.472 [INFO][5422] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.670 [INFO][5437] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" HandleID="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.670 [INFO][5437] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" HandleID="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033e1e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4452-0-0-n-de00512edc", "pod":"calico-apiserver-dfd4dff7c-p5zbx", "timestamp":"2025-09-09 05:33:50.669223086 +0000 UTC"}, Hostname:"ci-4452-0-0-n-de00512edc", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.670 [INFO][5437] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.671 [INFO][5437] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.671 [INFO][5437] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4452-0-0-n-de00512edc' Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.687 [INFO][5437] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.695 [INFO][5437] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.699 [INFO][5437] ipam/ipam.go 511: Trying affinity for 192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.701 [INFO][5437] ipam/ipam.go 158: Attempting to load block cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.703 [INFO][5437] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.21.64/26 host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.704 [INFO][5437] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.21.64/26 handle="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.705 [INFO][5437] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113 Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.710 [INFO][5437] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.21.64/26 handle="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.717 [INFO][5437] ipam/ipam.go 1256: Successfully claimed 
IPs: [192.168.21.74/26] block=192.168.21.64/26 handle="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.717 [INFO][5437] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.21.74/26] handle="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" host="ci-4452-0-0-n-de00512edc" Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.717 [INFO][5437] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:50.759498 containerd[1581]: 2025-09-09 05:33:50.718 [INFO][5437] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.21.74/26] IPv6=[] ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" HandleID="k8s-pod-network.dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" Sep 9 05:33:50.764682 containerd[1581]: 2025-09-09 05:33:50.724 [INFO][5422] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0", GenerateName:"calico-apiserver-dfd4dff7c-", Namespace:"calico-apiserver", SelfLink:"", UID:"f0e8cdb2-d693-4353-858a-b0bebfbd56e0", ResourceVersion:"1146", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"dfd4dff7c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"", Pod:"calico-apiserver-dfd4dff7c-p5zbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1f6a9153fb3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:50.764682 containerd[1581]: 2025-09-09 05:33:50.724 [INFO][5422] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.21.74/32] ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" Sep 9 05:33:50.764682 containerd[1581]: 2025-09-09 05:33:50.724 [INFO][5422] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f6a9153fb3 ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" Sep 9 05:33:50.764682 containerd[1581]: 2025-09-09 05:33:50.739 [INFO][5422] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" 
WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" Sep 9 05:33:50.764682 containerd[1581]: 2025-09-09 05:33:50.740 [INFO][5422] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0", GenerateName:"calico-apiserver-dfd4dff7c-", Namespace:"calico-apiserver", SelfLink:"", UID:"f0e8cdb2-d693-4353-858a-b0bebfbd56e0", ResourceVersion:"1146", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 5, 33, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"dfd4dff7c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4452-0-0-n-de00512edc", ContainerID:"dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113", Pod:"calico-apiserver-dfd4dff7c-p5zbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.21.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1f6a9153fb3", MAC:"6a:fa:84:ec:de:10", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 05:33:50.764682 containerd[1581]: 2025-09-09 05:33:50.757 [INFO][5422] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" Namespace="calico-apiserver" Pod="calico-apiserver-dfd4dff7c-p5zbx" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--dfd4dff7c--p5zbx-eth0" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.479 [INFO][5417] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.483 [INFO][5417] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" iface="eth0" netns="/var/run/netns/cni-f6ee01f3-de05-ccf0-dab8-d4002133e9d8" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.484 [INFO][5417] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" iface="eth0" netns="/var/run/netns/cni-f6ee01f3-de05-ccf0-dab8-d4002133e9d8" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.496 [INFO][5417] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" after=12.824975ms iface="eth0" netns="/var/run/netns/cni-f6ee01f3-de05-ccf0-dab8-d4002133e9d8" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.497 [INFO][5417] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.497 [INFO][5417] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.669 [INFO][5442] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.670 [INFO][5442] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.718 [INFO][5442] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.808 [INFO][5442] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.808 [INFO][5442] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0" Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.811 [INFO][5442] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:50.826598 containerd[1581]: 2025-09-09 05:33:50.819 [INFO][5417] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Sep 9 05:33:50.831663 containerd[1581]: time="2025-09-09T05:33:50.831618371Z" level=info msg="TearDown network for sandbox \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" successfully" Sep 9 05:33:50.831663 containerd[1581]: time="2025-09-09T05:33:50.831646163Z" level=info msg="StopPodSandbox for \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" returns successfully" Sep 9 05:33:50.892144 systemd[1]: run-netns-cni\x2df6ee01f3\x2dde05\x2dccf0\x2ddab8\x2dd4002133e9d8.mount: Deactivated successfully. 
Sep 9 05:33:50.938880 kubelet[2758]: I0909 05:33:50.937889 2758 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqgcw\" (UniqueName: \"kubernetes.io/projected/3afed2c6-cb7f-44a3-a525-6523bbddd214-kube-api-access-vqgcw\") pod \"3afed2c6-cb7f-44a3-a525-6523bbddd214\" (UID: \"3afed2c6-cb7f-44a3-a525-6523bbddd214\") " Sep 9 05:33:50.940825 kubelet[2758]: I0909 05:33:50.940730 2758 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3afed2c6-cb7f-44a3-a525-6523bbddd214-calico-apiserver-certs\") pod \"3afed2c6-cb7f-44a3-a525-6523bbddd214\" (UID: \"3afed2c6-cb7f-44a3-a525-6523bbddd214\") " Sep 9 05:33:50.976521 systemd[1]: var-lib-kubelet-pods-3afed2c6\x2dcb7f\x2d44a3\x2da525\x2d6523bbddd214-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvqgcw.mount: Deactivated successfully. Sep 9 05:33:50.981443 containerd[1581]: time="2025-09-09T05:33:50.981408962Z" level=info msg="connecting to shim dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113" address="unix:///run/containerd/s/02d49ddfbeafeeefd3043d6fbccb622f222db626f38aeb95365bc9b77c6349cf" namespace=k8s.io protocol=ttrpc version=3 Sep 9 05:33:50.984296 systemd[1]: var-lib-kubelet-pods-3afed2c6\x2dcb7f\x2d44a3\x2da525\x2d6523bbddd214-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 9 05:33:50.998104 kubelet[2758]: I0909 05:33:50.993469 2758 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afed2c6-cb7f-44a3-a525-6523bbddd214-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "3afed2c6-cb7f-44a3-a525-6523bbddd214" (UID: "3afed2c6-cb7f-44a3-a525-6523bbddd214"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 05:33:50.999078 kubelet[2758]: I0909 05:33:50.993512 2758 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afed2c6-cb7f-44a3-a525-6523bbddd214-kube-api-access-vqgcw" (OuterVolumeSpecName: "kube-api-access-vqgcw") pod "3afed2c6-cb7f-44a3-a525-6523bbddd214" (UID: "3afed2c6-cb7f-44a3-a525-6523bbddd214"). InnerVolumeSpecName "kube-api-access-vqgcw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 05:33:51.040967 systemd[1]: Started cri-containerd-dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113.scope - libcontainer container dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113. Sep 9 05:33:51.042662 kubelet[2758]: I0909 05:33:51.041714 2758 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3afed2c6-cb7f-44a3-a525-6523bbddd214-calico-apiserver-certs\") on node \"ci-4452-0-0-n-de00512edc\" DevicePath \"\"" Sep 9 05:33:51.042662 kubelet[2758]: I0909 05:33:51.041731 2758 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vqgcw\" (UniqueName: \"kubernetes.io/projected/3afed2c6-cb7f-44a3-a525-6523bbddd214-kube-api-access-vqgcw\") on node \"ci-4452-0-0-n-de00512edc\" DevicePath \"\"" Sep 9 05:33:51.344123 containerd[1581]: time="2025-09-09T05:33:51.343937643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-dfd4dff7c-p5zbx,Uid:f0e8cdb2-d693-4353-858a-b0bebfbd56e0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113\"" Sep 9 05:33:51.365482 systemd[1]: Removed slice kubepods-besteffort-pod3afed2c6_cb7f_44a3_a525_6523bbddd214.slice - libcontainer container kubepods-besteffort-pod3afed2c6_cb7f_44a3_a525_6523bbddd214.slice. 
Sep 9 05:33:51.375747 containerd[1581]: time="2025-09-09T05:33:51.375704610Z" level=info msg="CreateContainer within sandbox \"dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 9 05:33:51.405047 containerd[1581]: time="2025-09-09T05:33:51.404614066Z" level=info msg="Container 14edb3f0b9f50e8f39da5720105036002c472e9d1d3c648b6f749e08912d78d8: CDI devices from CRI Config.CDIDevices: []" Sep 9 05:33:51.437945 containerd[1581]: time="2025-09-09T05:33:51.437891375Z" level=info msg="CreateContainer within sandbox \"dc71b2e5b7be9dfb194cba3508c97186fe072308688fd69fc6607da3c230a113\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"14edb3f0b9f50e8f39da5720105036002c472e9d1d3c648b6f749e08912d78d8\"" Sep 9 05:33:51.449470 containerd[1581]: time="2025-09-09T05:33:51.449420140Z" level=info msg="StartContainer for \"14edb3f0b9f50e8f39da5720105036002c472e9d1d3c648b6f749e08912d78d8\"" Sep 9 05:33:51.480339 containerd[1581]: time="2025-09-09T05:33:51.480286233Z" level=info msg="connecting to shim 14edb3f0b9f50e8f39da5720105036002c472e9d1d3c648b6f749e08912d78d8" address="unix:///run/containerd/s/02d49ddfbeafeeefd3043d6fbccb622f222db626f38aeb95365bc9b77c6349cf" protocol=ttrpc version=3 Sep 9 05:33:51.557326 systemd[1]: Started cri-containerd-14edb3f0b9f50e8f39da5720105036002c472e9d1d3c648b6f749e08912d78d8.scope - libcontainer container 14edb3f0b9f50e8f39da5720105036002c472e9d1d3c648b6f749e08912d78d8. 
Sep 9 05:33:51.658097 containerd[1581]: time="2025-09-09T05:33:51.657869251Z" level=info msg="StartContainer for \"14edb3f0b9f50e8f39da5720105036002c472e9d1d3c648b6f749e08912d78d8\" returns successfully" Sep 9 05:33:51.871283 systemd-networkd[1472]: cali1f6a9153fb3: Gained IPv6LL Sep 9 05:33:52.386539 kubelet[2758]: I0909 05:33:52.384247 2758 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-dfd4dff7c-p5zbx" podStartSLOduration=3.382349246 podStartE2EDuration="3.382349246s" podCreationTimestamp="2025-09-09 05:33:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 05:33:52.377488371 +0000 UTC m=+73.720675887" watchObservedRunningTime="2025-09-09 05:33:52.382349246 +0000 UTC m=+73.725536762" Sep 9 05:33:52.807121 kubelet[2758]: I0909 05:33:52.807074 2758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afed2c6-cb7f-44a3-a525-6523bbddd214" path="/var/lib/kubelet/pods/3afed2c6-cb7f-44a3-a525-6523bbddd214/volumes" Sep 9 05:33:53.669861 containerd[1581]: time="2025-09-09T05:33:53.669809955Z" level=info msg="StopContainer for \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" with timeout 30 (s)" Sep 9 05:33:53.675101 containerd[1581]: time="2025-09-09T05:33:53.675073913Z" level=info msg="Stop container \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" with signal terminated" Sep 9 05:33:53.731561 systemd[1]: cri-containerd-df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e.scope: Deactivated successfully. Sep 9 05:33:53.732015 systemd[1]: cri-containerd-df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e.scope: Consumed 1.409s CPU time, 62M memory peak, 1.5M read from disk. 
Sep 9 05:33:53.752313 containerd[1581]: time="2025-09-09T05:33:53.751846554Z" level=info msg="TaskExit event in podsandbox handler container_id:\"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" id:\"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" pid:5173 exit_status:1 exited_at:{seconds:1757396033 nanos:734829269}" Sep 9 05:33:53.752750 containerd[1581]: time="2025-09-09T05:33:53.752285516Z" level=info msg="received exit event container_id:\"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" id:\"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" pid:5173 exit_status:1 exited_at:{seconds:1757396033 nanos:734829269}" Sep 9 05:33:53.826777 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e-rootfs.mount: Deactivated successfully. Sep 9 05:33:53.848658 containerd[1581]: time="2025-09-09T05:33:53.848540279Z" level=info msg="StopContainer for \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" returns successfully" Sep 9 05:33:53.849492 containerd[1581]: time="2025-09-09T05:33:53.849152392Z" level=info msg="StopPodSandbox for \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\"" Sep 9 05:33:53.849492 containerd[1581]: time="2025-09-09T05:33:53.849203138Z" level=info msg="Container to stop \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 9 05:33:53.858204 systemd[1]: cri-containerd-bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f.scope: Deactivated successfully. 
Sep 9 05:33:53.868223 containerd[1581]: time="2025-09-09T05:33:53.868187801Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" id:\"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" pid:4490 exit_status:137 exited_at:{seconds:1757396033 nanos:867538527}" Sep 9 05:33:53.894696 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f-rootfs.mount: Deactivated successfully. Sep 9 05:33:53.900004 containerd[1581]: time="2025-09-09T05:33:53.899896315Z" level=info msg="shim disconnected" id=bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f namespace=k8s.io Sep 9 05:33:53.900004 containerd[1581]: time="2025-09-09T05:33:53.899937172Z" level=warning msg="cleaning up after shim disconnected" id=bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f namespace=k8s.io Sep 9 05:33:53.905527 containerd[1581]: time="2025-09-09T05:33:53.899944105Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 9 05:33:53.949010 containerd[1581]: time="2025-09-09T05:33:53.945593094Z" level=info msg="received exit event sandbox_id:\"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" exit_status:137 exited_at:{seconds:1757396033 nanos:867538527}" Sep 9 05:33:53.954054 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f-shm.mount: Deactivated successfully. 
Sep 9 05:33:54.040983 systemd-networkd[1472]: califc1bbf96d7c: Link DOWN Sep 9 05:33:54.040994 systemd-networkd[1472]: califc1bbf96d7c: Lost carrier Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.036 [INFO][5623] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.037 [INFO][5623] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" iface="eth0" netns="/var/run/netns/cni-03abba74-abfa-2963-8bb3-69e23284ff9e" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.038 [INFO][5623] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" iface="eth0" netns="/var/run/netns/cni-03abba74-abfa-2963-8bb3-69e23284ff9e" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.046 [INFO][5623] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" after=8.652091ms iface="eth0" netns="/var/run/netns/cni-03abba74-abfa-2963-8bb3-69e23284ff9e" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.046 [INFO][5623] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.046 [INFO][5623] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.090 [INFO][5631] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.090 [INFO][5631] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.090 [INFO][5631] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.142 [INFO][5631] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.143 [INFO][5631] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0" Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.144 [INFO][5631] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 05:33:54.151864 containerd[1581]: 2025-09-09 05:33:54.148 [INFO][5623] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Sep 9 05:33:54.156889 containerd[1581]: time="2025-09-09T05:33:54.153130606Z" level=info msg="TearDown network for sandbox \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" successfully" Sep 9 05:33:54.156889 containerd[1581]: time="2025-09-09T05:33:54.153176221Z" level=info msg="StopPodSandbox for \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" returns successfully" Sep 9 05:33:54.156619 systemd[1]: run-netns-cni\x2d03abba74\x2dabfa\x2d2963\x2d8bb3\x2d69e23284ff9e.mount: Deactivated successfully. 
Sep 9 05:33:54.314969 kubelet[2758]: I0909 05:33:54.314925 2758 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/96451131-9615-4466-8637-9110e935815d-calico-apiserver-certs\") pod \"96451131-9615-4466-8637-9110e935815d\" (UID: \"96451131-9615-4466-8637-9110e935815d\") " Sep 9 05:33:54.315631 kubelet[2758]: I0909 05:33:54.315038 2758 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dflvn\" (UniqueName: \"kubernetes.io/projected/96451131-9615-4466-8637-9110e935815d-kube-api-access-dflvn\") pod \"96451131-9615-4466-8637-9110e935815d\" (UID: \"96451131-9615-4466-8637-9110e935815d\") " Sep 9 05:33:54.329050 kubelet[2758]: I0909 05:33:54.328971 2758 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96451131-9615-4466-8637-9110e935815d-kube-api-access-dflvn" (OuterVolumeSpecName: "kube-api-access-dflvn") pod "96451131-9615-4466-8637-9110e935815d" (UID: "96451131-9615-4466-8637-9110e935815d"). InnerVolumeSpecName "kube-api-access-dflvn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 05:33:54.329874 kubelet[2758]: I0909 05:33:54.329848 2758 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96451131-9615-4466-8637-9110e935815d-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "96451131-9615-4466-8637-9110e935815d" (UID: "96451131-9615-4466-8637-9110e935815d"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 05:33:54.330119 systemd[1]: var-lib-kubelet-pods-96451131\x2d9615\x2d4466\x2d8637\x2d9110e935815d-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddflvn.mount: Deactivated successfully. 
Sep 9 05:33:54.400408 kubelet[2758]: I0909 05:33:54.400355 2758 scope.go:117] "RemoveContainer" containerID="df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e" Sep 9 05:33:54.403688 systemd[1]: Removed slice kubepods-besteffort-pod96451131_9615_4466_8637_9110e935815d.slice - libcontainer container kubepods-besteffort-pod96451131_9615_4466_8637_9110e935815d.slice. Sep 9 05:33:54.404402 systemd[1]: kubepods-besteffort-pod96451131_9615_4466_8637_9110e935815d.slice: Consumed 1.437s CPU time, 62.3M memory peak, 1.5M read from disk. Sep 9 05:33:54.416327 kubelet[2758]: I0909 05:33:54.416291 2758 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dflvn\" (UniqueName: \"kubernetes.io/projected/96451131-9615-4466-8637-9110e935815d-kube-api-access-dflvn\") on node \"ci-4452-0-0-n-de00512edc\" DevicePath \"\"" Sep 9 05:33:54.417108 kubelet[2758]: I0909 05:33:54.416340 2758 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/96451131-9615-4466-8637-9110e935815d-calico-apiserver-certs\") on node \"ci-4452-0-0-n-de00512edc\" DevicePath \"\"" Sep 9 05:33:54.428051 containerd[1581]: time="2025-09-09T05:33:54.427752304Z" level=info msg="RemoveContainer for \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\"" Sep 9 05:33:54.436325 containerd[1581]: time="2025-09-09T05:33:54.436296525Z" level=info msg="RemoveContainer for \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" returns successfully" Sep 9 05:33:54.440973 kubelet[2758]: I0909 05:33:54.440946 2758 scope.go:117] "RemoveContainer" containerID="df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e" Sep 9 05:33:54.441271 containerd[1581]: time="2025-09-09T05:33:54.441243921Z" level=error msg="ContainerStatus for \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container 
\"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\": not found" Sep 9 05:33:54.456970 kubelet[2758]: E0909 05:33:54.456897 2758 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\": not found" containerID="df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e" Sep 9 05:33:54.478417 kubelet[2758]: I0909 05:33:54.456968 2758 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e"} err="failed to get container status \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\": rpc error: code = NotFound desc = an error occurred when try to find container \"df90945b4efd0669fb7416e995c5e0befa0d56e033142134eb99c05eda103c6e\": not found" Sep 9 05:33:54.769556 kubelet[2758]: I0909 05:33:54.769451 2758 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96451131-9615-4466-8637-9110e935815d" path="/var/lib/kubelet/pods/96451131-9615-4466-8637-9110e935815d/volumes" Sep 9 05:33:54.824879 systemd[1]: var-lib-kubelet-pods-96451131\x2d9615\x2d4466\x2d8637\x2d9110e935815d-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Sep 9 05:33:54.950446 containerd[1581]: time="2025-09-09T05:33:54.950369763Z" level=info msg="TaskExit event in podsandbox handler exit_status:137 exited_at:{seconds:1757396033 nanos:867538527}" Sep 9 05:33:55.180276 containerd[1581]: time="2025-09-09T05:33:55.180238682Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\" id:\"266bcc822b952b17fecde106196c1da6bfc1d9dec06c10451874c623402a0722\" pid:5658 exited_at:{seconds:1757396035 nanos:177953509}" Sep 9 05:33:56.624484 containerd[1581]: time="2025-09-09T05:33:56.624327756Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" id:\"0905469eb0ea21eb4c62ccfff46caccc69db62f83497f24806e3d3f7d416e268\" pid:5682 exited_at:{seconds:1757396036 nanos:623504708}" Sep 9 05:33:57.069090 containerd[1581]: time="2025-09-09T05:33:57.068779083Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" id:\"503a443d271dfa89e6662f7bbb988a89632c5492ccdc0492c6089094ad271a5c\" pid:5704 exited_at:{seconds:1757396037 nanos:68072612}" Sep 9 05:34:01.300835 containerd[1581]: time="2025-09-09T05:34:01.300758855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" id:\"ffd0d023d00ce9bb2235d6c7a2e2a02d9382fe12e084f0e831d01a862e21e96a\" pid:5735 exited_at:{seconds:1757396041 nanos:300314924}" Sep 9 05:34:15.895613 systemd[1]: Started sshd@7-65.109.237.121:22-147.75.109.163:39930.service - OpenSSH per-connection server daemon (147.75.109.163:39930). 
Sep 9 05:34:16.932922 sshd[5757]: Accepted publickey for core from 147.75.109.163 port 39930 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:34:16.935851 sshd-session[5757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:16.943083 systemd-logind[1561]: New session 8 of user core. Sep 9 05:34:16.948230 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 9 05:34:18.107167 sshd[5762]: Connection closed by 147.75.109.163 port 39930 Sep 9 05:34:18.108658 sshd-session[5757]: pam_unix(sshd:session): session closed for user core Sep 9 05:34:18.120719 systemd[1]: sshd@7-65.109.237.121:22-147.75.109.163:39930.service: Deactivated successfully. Sep 9 05:34:18.125852 systemd[1]: session-8.scope: Deactivated successfully. Sep 9 05:34:18.130718 systemd-logind[1561]: Session 8 logged out. Waiting for processes to exit. Sep 9 05:34:18.133553 systemd-logind[1561]: Removed session 8. Sep 9 05:34:23.312034 systemd[1]: Started sshd@8-65.109.237.121:22-147.75.109.163:46026.service - OpenSSH per-connection server daemon (147.75.109.163:46026). Sep 9 05:34:24.420636 sshd[5777]: Accepted publickey for core from 147.75.109.163 port 46026 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k Sep 9 05:34:24.422556 sshd-session[5777]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 05:34:24.429146 systemd-logind[1561]: New session 9 of user core. Sep 9 05:34:24.434172 systemd[1]: Started session-9.scope - Session 9 of User core. 
Sep 9 05:34:25.070207 containerd[1581]: time="2025-09-09T05:34:25.070159263Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\" id:\"fb8d453fbf4e8d1b8048be1a3515dd494540d66e433264e100a365710a839eab\" pid:5794 exited_at:{seconds:1757396065 nanos:69408308}"
Sep 9 05:34:25.379054 sshd[5780]: Connection closed by 147.75.109.163 port 46026
Sep 9 05:34:25.381140 sshd-session[5777]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:25.388989 systemd[1]: sshd@8-65.109.237.121:22-147.75.109.163:46026.service: Deactivated successfully.
Sep 9 05:34:25.393803 systemd[1]: session-9.scope: Deactivated successfully.
Sep 9 05:34:25.397837 systemd-logind[1561]: Session 9 logged out. Waiting for processes to exit.
Sep 9 05:34:25.399721 systemd-logind[1561]: Removed session 9.
Sep 9 05:34:25.567192 systemd[1]: Started sshd@9-65.109.237.121:22-147.75.109.163:46034.service - OpenSSH per-connection server daemon (147.75.109.163:46034).
Sep 9 05:34:26.662895 sshd[5818]: Accepted publickey for core from 147.75.109.163 port 46034 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:26.664724 sshd-session[5818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:26.670666 systemd-logind[1561]: New session 10 of user core.
Sep 9 05:34:26.678162 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 05:34:27.059740 containerd[1581]: time="2025-09-09T05:34:27.053413215Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" id:\"32de53a454175d275303cb3999de76bda3fe85ddd8b7b9bff3c66488b2b6d375\" pid:5836 exited_at:{seconds:1757396067 nanos:53167446}"
Sep 9 05:34:27.528633 sshd[5821]: Connection closed by 147.75.109.163 port 46034
Sep 9 05:34:27.531176 sshd-session[5818]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:27.535851 systemd-logind[1561]: Session 10 logged out. Waiting for processes to exit.
Sep 9 05:34:27.535942 systemd[1]: sshd@9-65.109.237.121:22-147.75.109.163:46034.service: Deactivated successfully.
Sep 9 05:34:27.537917 systemd[1]: session-10.scope: Deactivated successfully.
Sep 9 05:34:27.540180 systemd-logind[1561]: Removed session 10.
Sep 9 05:34:27.681581 systemd[1]: Started sshd@10-65.109.237.121:22-147.75.109.163:46046.service - OpenSSH per-connection server daemon (147.75.109.163:46046).
Sep 9 05:34:28.678092 sshd[5853]: Accepted publickey for core from 147.75.109.163 port 46046 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:28.679624 sshd-session[5853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:28.684306 systemd-logind[1561]: New session 11 of user core.
Sep 9 05:34:28.688150 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 05:34:29.479189 sshd[5856]: Connection closed by 147.75.109.163 port 46046
Sep 9 05:34:29.479780 sshd-session[5853]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:29.486227 systemd[1]: sshd@10-65.109.237.121:22-147.75.109.163:46046.service: Deactivated successfully.
Sep 9 05:34:29.488952 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 05:34:29.490440 systemd-logind[1561]: Session 11 logged out. Waiting for processes to exit.
Sep 9 05:34:29.491925 systemd-logind[1561]: Removed session 11.
Sep 9 05:34:31.246503 containerd[1581]: time="2025-09-09T05:34:31.246419722Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" id:\"3b168f45c0422872e1ec68bcff1988fe79e934a2dfac3b5b9da5c7f38c6121cc\" pid:5883 exited_at:{seconds:1757396071 nanos:245856968}"
Sep 9 05:34:34.648721 systemd[1]: Started sshd@11-65.109.237.121:22-147.75.109.163:58118.service - OpenSSH per-connection server daemon (147.75.109.163:58118).
Sep 9 05:34:35.678290 sshd[5896]: Accepted publickey for core from 147.75.109.163 port 58118 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:35.681624 sshd-session[5896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:35.691395 systemd-logind[1561]: New session 12 of user core.
Sep 9 05:34:35.696268 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 05:34:36.499389 sshd[5899]: Connection closed by 147.75.109.163 port 58118
Sep 9 05:34:36.500281 sshd-session[5896]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:36.505662 systemd-logind[1561]: Session 12 logged out. Waiting for processes to exit.
Sep 9 05:34:36.506868 systemd[1]: sshd@11-65.109.237.121:22-147.75.109.163:58118.service: Deactivated successfully.
Sep 9 05:34:36.510420 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 05:34:36.514205 systemd-logind[1561]: Removed session 12.
Sep 9 05:34:39.085006 kubelet[2758]: I0909 05:34:39.084941 2758 scope.go:117] "RemoveContainer" containerID="3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf"
Sep 9 05:34:39.087463 containerd[1581]: time="2025-09-09T05:34:39.087421613Z" level=info msg="RemoveContainer for \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\""
Sep 9 05:34:39.099872 containerd[1581]: time="2025-09-09T05:34:39.099827074Z" level=info msg="RemoveContainer for \"3bf6eff7ed7821c25c9a4adf30928b56fcd37b11af8cf74865208df9705ee1cf\" returns successfully"
Sep 9 05:34:39.105008 containerd[1581]: time="2025-09-09T05:34:39.104950584Z" level=info msg="StopPodSandbox for \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\""
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.327 [WARNING][5925] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.329 [INFO][5925] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.329 [INFO][5925] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" iface="eth0" netns=""
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.329 [INFO][5925] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.329 [INFO][5925] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.499 [INFO][5932] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.501 [INFO][5932] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.502 [INFO][5932] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.514 [WARNING][5932] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.514 [INFO][5932] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.516 [INFO][5932] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 05:34:39.521680 containerd[1581]: 2025-09-09 05:34:39.518 [INFO][5925] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.528005 containerd[1581]: time="2025-09-09T05:34:39.527932988Z" level=info msg="TearDown network for sandbox \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" successfully"
Sep 9 05:34:39.528005 containerd[1581]: time="2025-09-09T05:34:39.527981758Z" level=info msg="StopPodSandbox for \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" returns successfully"
Sep 9 05:34:39.687590 containerd[1581]: time="2025-09-09T05:34:39.687532249Z" level=info msg="RemovePodSandbox for \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\""
Sep 9 05:34:39.724363 containerd[1581]: time="2025-09-09T05:34:39.724162683Z" level=info msg="Forcibly stopping sandbox \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\""
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.775 [WARNING][5947] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.775 [INFO][5947] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.775 [INFO][5947] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" iface="eth0" netns=""
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.775 [INFO][5947] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.775 [INFO][5947] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.800 [INFO][5954] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.801 [INFO][5954] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.801 [INFO][5954] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.807 [WARNING][5954] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.807 [INFO][5954] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" HandleID="k8s-pod-network.bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--csjg4-eth0"
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.808 [INFO][5954] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 05:34:39.813966 containerd[1581]: 2025-09-09 05:34:39.811 [INFO][5947] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f"
Sep 9 05:34:39.814956 containerd[1581]: time="2025-09-09T05:34:39.814035924Z" level=info msg="TearDown network for sandbox \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" successfully"
Sep 9 05:34:39.835793 containerd[1581]: time="2025-09-09T05:34:39.835745780Z" level=info msg="Ensure that sandbox bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f in task-service has been cleanup successfully"
Sep 9 05:34:39.855103 containerd[1581]: time="2025-09-09T05:34:39.854859486Z" level=info msg="RemovePodSandbox \"bd6bc2a6137fd494834723b660acf920e8de03f9f7ba197e8b4569995ac3ef0f\" returns successfully"
Sep 9 05:34:39.861963 containerd[1581]: time="2025-09-09T05:34:39.861925011Z" level=info msg="StopPodSandbox for \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\""
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.908 [WARNING][5968] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.908 [INFO][5968] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.908 [INFO][5968] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" iface="eth0" netns=""
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.908 [INFO][5968] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.908 [INFO][5968] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.938 [INFO][5975] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.938 [INFO][5975] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.938 [INFO][5975] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.947 [WARNING][5975] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.947 [INFO][5975] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.954 [INFO][5975] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 05:34:39.962246 containerd[1581]: 2025-09-09 05:34:39.958 [INFO][5968] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:39.964489 containerd[1581]: time="2025-09-09T05:34:39.962544787Z" level=info msg="TearDown network for sandbox \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" successfully"
Sep 9 05:34:39.964489 containerd[1581]: time="2025-09-09T05:34:39.962576978Z" level=info msg="StopPodSandbox for \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" returns successfully"
Sep 9 05:34:39.964489 containerd[1581]: time="2025-09-09T05:34:39.963151874Z" level=info msg="RemovePodSandbox for \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\""
Sep 9 05:34:39.964489 containerd[1581]: time="2025-09-09T05:34:39.963184905Z" level=info msg="Forcibly stopping sandbox \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\""
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.012 [WARNING][5989] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" WorkloadEndpoint="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.012 [INFO][5989] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.012 [INFO][5989] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" iface="eth0" netns=""
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.012 [INFO][5989] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.012 [INFO][5989] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.040 [INFO][5997] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.040 [INFO][5997] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.040 [INFO][5997] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.051 [WARNING][5997] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.051 [INFO][5997] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" HandleID="k8s-pod-network.64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1" Workload="ci--4452--0--0--n--de00512edc-k8s-calico--apiserver--6b8c85d7cc--mbm69-eth0"
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.053 [INFO][5997] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 05:34:40.060082 containerd[1581]: 2025-09-09 05:34:40.056 [INFO][5989] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1"
Sep 9 05:34:40.060082 containerd[1581]: time="2025-09-09T05:34:40.059595736Z" level=info msg="TearDown network for sandbox \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" successfully"
Sep 9 05:34:40.062684 containerd[1581]: time="2025-09-09T05:34:40.062626078Z" level=info msg="Ensure that sandbox 64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1 in task-service has been cleanup successfully"
Sep 9 05:34:40.066783 containerd[1581]: time="2025-09-09T05:34:40.065875581Z" level=info msg="RemovePodSandbox \"64c5cf42f0d1a3bbab85cb03536f7d9ef7092026b71f13c66078f12ad20303e1\" returns successfully"
Sep 9 05:34:40.707240 update_engine[1566]: I20250909 05:34:40.707132 1566 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 9 05:34:40.707240 update_engine[1566]: I20250909 05:34:40.707218 1566 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 9 05:34:40.710124 update_engine[1566]: I20250909 05:34:40.710006 1566 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 9 05:34:40.712095 update_engine[1566]: I20250909 05:34:40.711836 1566 omaha_request_params.cc:62] Current group set to developer
Sep 9 05:34:40.712095 update_engine[1566]: I20250909 05:34:40.711988 1566 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 9 05:34:40.712095 update_engine[1566]: I20250909 05:34:40.711997 1566 update_attempter.cc:643] Scheduling an action processor start.
Sep 9 05:34:40.712543 update_engine[1566]: I20250909 05:34:40.712049 1566 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 9 05:34:40.712543 update_engine[1566]: I20250909 05:34:40.712320 1566 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Sep 9 05:34:40.712543 update_engine[1566]: I20250909 05:34:40.712371 1566 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 9 05:34:40.712543 update_engine[1566]: I20250909 05:34:40.712378 1566 omaha_request_action.cc:272] Request:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]:
Sep 9 05:34:40.712543 update_engine[1566]: I20250909 05:34:40.712383 1566 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 9 05:34:40.737671 locksmithd[1602]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Sep 9 05:34:40.744785 update_engine[1566]: I20250909 05:34:40.744695 1566 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 9 05:34:40.747561 update_engine[1566]: I20250909 05:34:40.746290 1566 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 9 05:34:40.748286 update_engine[1566]: E20250909 05:34:40.747751 1566 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 9 05:34:40.748286 update_engine[1566]: I20250909 05:34:40.747855 1566 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Sep 9 05:34:41.669590 systemd[1]: Started sshd@12-65.109.237.121:22-147.75.109.163:59192.service - OpenSSH per-connection server daemon (147.75.109.163:59192).
Sep 9 05:34:42.761267 sshd[6005]: Accepted publickey for core from 147.75.109.163 port 59192 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:42.763963 sshd-session[6005]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:42.770249 systemd-logind[1561]: New session 13 of user core.
Sep 9 05:34:42.778229 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 9 05:34:43.929591 sshd[6008]: Connection closed by 147.75.109.163 port 59192
Sep 9 05:34:43.930306 sshd-session[6005]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:43.934972 systemd[1]: sshd@12-65.109.237.121:22-147.75.109.163:59192.service: Deactivated successfully.
Sep 9 05:34:43.937618 systemd[1]: session-13.scope: Deactivated successfully.
Sep 9 05:34:43.939130 systemd-logind[1561]: Session 13 logged out. Waiting for processes to exit.
Sep 9 05:34:43.940997 systemd-logind[1561]: Removed session 13.
Sep 9 05:34:49.100361 systemd[1]: Started sshd@13-65.109.237.121:22-147.75.109.163:59208.service - OpenSSH per-connection server daemon (147.75.109.163:59208).
Sep 9 05:34:49.841620 containerd[1581]: time="2025-09-09T05:34:49.841540064Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" id:\"37deccf5b439e6b7c4c21835979a5a0a8b66bdae7c0cea5234e237beaa25c0be\" pid:6043 exited_at:{seconds:1757396089 nanos:829101399}"
Sep 9 05:34:50.106194 sshd[6022]: Accepted publickey for core from 147.75.109.163 port 59208 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:50.108791 sshd-session[6022]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:50.114432 systemd-logind[1561]: New session 14 of user core.
Sep 9 05:34:50.120281 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 9 05:34:50.642665 update_engine[1566]: I20250909 05:34:50.642603 1566 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 9 05:34:50.642993 update_engine[1566]: I20250909 05:34:50.642750 1566 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 9 05:34:50.644683 update_engine[1566]: I20250909 05:34:50.644637 1566 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 9 05:34:50.644744 update_engine[1566]: E20250909 05:34:50.644717 1566 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 9 05:34:50.645327 update_engine[1566]: I20250909 05:34:50.644793 1566 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Sep 9 05:34:51.227949 sshd[6052]: Connection closed by 147.75.109.163 port 59208
Sep 9 05:34:51.229005 sshd-session[6022]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:51.233899 systemd[1]: sshd@13-65.109.237.121:22-147.75.109.163:59208.service: Deactivated successfully.
Sep 9 05:34:51.236579 systemd[1]: session-14.scope: Deactivated successfully.
Sep 9 05:34:51.238079 systemd-logind[1561]: Session 14 logged out. Waiting for processes to exit.
Sep 9 05:34:51.239532 systemd-logind[1561]: Removed session 14.
Sep 9 05:34:51.402306 systemd[1]: Started sshd@14-65.109.237.121:22-147.75.109.163:41076.service - OpenSSH per-connection server daemon (147.75.109.163:41076).
Sep 9 05:34:52.424656 sshd[6078]: Accepted publickey for core from 147.75.109.163 port 41076 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:52.425197 sshd-session[6078]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:52.430418 systemd-logind[1561]: New session 15 of user core.
Sep 9 05:34:52.435253 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 9 05:34:53.370785 sshd[6081]: Connection closed by 147.75.109.163 port 41076
Sep 9 05:34:53.373309 sshd-session[6078]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:53.381913 systemd[1]: sshd@14-65.109.237.121:22-147.75.109.163:41076.service: Deactivated successfully.
Sep 9 05:34:53.382648 systemd-logind[1561]: Session 15 logged out. Waiting for processes to exit.
Sep 9 05:34:53.384429 systemd[1]: session-15.scope: Deactivated successfully.
Sep 9 05:34:53.386342 systemd-logind[1561]: Removed session 15.
Sep 9 05:34:53.545548 systemd[1]: Started sshd@15-65.109.237.121:22-147.75.109.163:41090.service - OpenSSH per-connection server daemon (147.75.109.163:41090).
Sep 9 05:34:54.574073 sshd[6091]: Accepted publickey for core from 147.75.109.163 port 41090 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:54.577371 sshd-session[6091]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:54.586002 systemd-logind[1561]: New session 16 of user core.
Sep 9 05:34:54.593267 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 9 05:34:55.163609 containerd[1581]: time="2025-09-09T05:34:55.163569249Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\" id:\"dbdd0cc85f28f0fd25c4378615b370e2181c82915d65a943fe2cc7537f7a4db2\" pid:6107 exited_at:{seconds:1757396095 nanos:163223682}"
Sep 9 05:34:55.853138 sshd[6094]: Connection closed by 147.75.109.163 port 41090
Sep 9 05:34:55.855598 sshd-session[6091]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:55.860757 systemd[1]: sshd@15-65.109.237.121:22-147.75.109.163:41090.service: Deactivated successfully.
Sep 9 05:34:55.863555 systemd[1]: session-16.scope: Deactivated successfully.
Sep 9 05:34:55.866961 systemd-logind[1561]: Session 16 logged out. Waiting for processes to exit.
Sep 9 05:34:55.868942 systemd-logind[1561]: Removed session 16.
Sep 9 05:34:56.022991 systemd[1]: Started sshd@16-65.109.237.121:22-147.75.109.163:41092.service - OpenSSH per-connection server daemon (147.75.109.163:41092).
Sep 9 05:34:56.718379 containerd[1581]: time="2025-09-09T05:34:56.718299610Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" id:\"a2c4f7718de9c32bac836d38cbb6fe03739fb237630095a7103b328483fc2f5f\" pid:6148 exited_at:{seconds:1757396096 nanos:717937743}"
Sep 9 05:34:57.054558 sshd[6133]: Accepted publickey for core from 147.75.109.163 port 41092 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:57.065096 sshd-session[6133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:57.081086 containerd[1581]: time="2025-09-09T05:34:57.081009985Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" id:\"a568393ebebd64ede68f9ba15c801a3a9aab9eb7f0f8f4358b1cb97e2715c642\" pid:6172 exited_at:{seconds:1757396097 nanos:80532400}"
Sep 9 05:34:57.086488 systemd-logind[1561]: New session 17 of user core.
Sep 9 05:34:57.093202 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 9 05:34:58.732612 sshd[6182]: Connection closed by 147.75.109.163 port 41092
Sep 9 05:34:58.733353 sshd-session[6133]: pam_unix(sshd:session): session closed for user core
Sep 9 05:34:58.737900 systemd[1]: sshd@16-65.109.237.121:22-147.75.109.163:41092.service: Deactivated successfully.
Sep 9 05:34:58.741418 systemd[1]: session-17.scope: Deactivated successfully.
Sep 9 05:34:58.745328 systemd-logind[1561]: Session 17 logged out. Waiting for processes to exit.
Sep 9 05:34:58.748090 systemd-logind[1561]: Removed session 17.
Sep 9 05:34:58.904105 systemd[1]: Started sshd@17-65.109.237.121:22-147.75.109.163:41104.service - OpenSSH per-connection server daemon (147.75.109.163:41104).
Sep 9 05:34:59.906148 sshd[6191]: Accepted publickey for core from 147.75.109.163 port 41104 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:34:59.910381 sshd-session[6191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:34:59.923795 systemd-logind[1561]: New session 18 of user core.
Sep 9 05:34:59.929250 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 9 05:35:00.642609 update_engine[1566]: I20250909 05:35:00.642525 1566 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 9 05:35:00.644416 update_engine[1566]: I20250909 05:35:00.642628 1566 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 9 05:35:00.644416 update_engine[1566]: I20250909 05:35:00.642998 1566 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 9 05:35:00.644416 update_engine[1566]: E20250909 05:35:00.643399 1566 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 9 05:35:00.644416 update_engine[1566]: I20250909 05:35:00.643464 1566 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Sep 9 05:35:00.831556 sshd[6194]: Connection closed by 147.75.109.163 port 41104
Sep 9 05:35:00.832162 sshd-session[6191]: pam_unix(sshd:session): session closed for user core
Sep 9 05:35:00.837559 systemd[1]: sshd@17-65.109.237.121:22-147.75.109.163:41104.service: Deactivated successfully.
Sep 9 05:35:00.838053 systemd-logind[1561]: Session 18 logged out. Waiting for processes to exit.
Sep 9 05:35:00.840097 systemd[1]: session-18.scope: Deactivated successfully.
Sep 9 05:35:00.842623 systemd-logind[1561]: Removed session 18.
Sep 9 05:35:01.219464 containerd[1581]: time="2025-09-09T05:35:01.219396183Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" id:\"d3fdf44930c70459bab4071d1be8b93dc26e2e7417423f1b5f8d85d90c81c484\" pid:6220 exited_at:{seconds:1757396101 nanos:217102849}"
Sep 9 05:35:06.033984 systemd[1]: Started sshd@18-65.109.237.121:22-147.75.109.163:36684.service - OpenSSH per-connection server daemon (147.75.109.163:36684).
Sep 9 05:35:07.171093 sshd[6231]: Accepted publickey for core from 147.75.109.163 port 36684 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:35:07.173965 sshd-session[6231]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:35:07.180337 systemd-logind[1561]: New session 19 of user core.
Sep 9 05:35:07.187145 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 9 05:35:08.287467 sshd[6234]: Connection closed by 147.75.109.163 port 36684
Sep 9 05:35:08.288053 sshd-session[6231]: pam_unix(sshd:session): session closed for user core
Sep 9 05:35:08.292098 systemd-logind[1561]: Session 19 logged out. Waiting for processes to exit.
Sep 9 05:35:08.292670 systemd[1]: sshd@18-65.109.237.121:22-147.75.109.163:36684.service: Deactivated successfully.
Sep 9 05:35:08.295000 systemd[1]: session-19.scope: Deactivated successfully.
Sep 9 05:35:08.296675 systemd-logind[1561]: Removed session 19.
Sep 9 05:35:10.643814 update_engine[1566]: I20250909 05:35:10.643729  1566 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 9 05:35:10.644215 update_engine[1566]: I20250909 05:35:10.643827  1566 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 9 05:35:10.644302 update_engine[1566]: I20250909 05:35:10.644227  1566 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 9 05:35:10.644634 update_engine[1566]: E20250909 05:35:10.644589  1566 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 9 05:35:10.644685 update_engine[1566]: I20250909 05:35:10.644655  1566 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Sep 9 05:35:10.647493 update_engine[1566]: I20250909 05:35:10.647447  1566 omaha_request_action.cc:617] Omaha request response:
Sep 9 05:35:10.647987 update_engine[1566]: E20250909 05:35:10.647595  1566 omaha_request_action.cc:636] Omaha request network transfer failed.
Sep 9 05:35:10.674958 update_engine[1566]: I20250909 05:35:10.674300  1566 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Sep 9 05:35:10.674958 update_engine[1566]: I20250909 05:35:10.674341  1566 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 9 05:35:10.674958 update_engine[1566]: I20250909 05:35:10.674347  1566 update_attempter.cc:306] Processing Done.
Sep 9 05:35:10.675577 update_engine[1566]: E20250909 05:35:10.674371  1566 update_attempter.cc:619] Update failed.
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675189  1566 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675199  1566 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675204  1566 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675302  1566 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675335  1566 omaha_request_action.cc:271] Posting an Omaha request to disabled
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675340  1566 omaha_request_action.cc:272] Request:
Sep 9 05:35:10.675577 update_engine[1566]:
Sep 9 05:35:10.675577 update_engine[1566]:
Sep 9 05:35:10.675577 update_engine[1566]:
Sep 9 05:35:10.675577 update_engine[1566]:
Sep 9 05:35:10.675577 update_engine[1566]:
Sep 9 05:35:10.675577 update_engine[1566]:
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675345  1566 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Sep 9 05:35:10.675577 update_engine[1566]: I20250909 05:35:10.675381  1566 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Sep 9 05:35:10.676309 update_engine[1566]: I20250909 05:35:10.676195  1566 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Sep 9 05:35:10.676717 update_engine[1566]: E20250909 05:35:10.676672  1566 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Sep 9 05:35:10.676808 update_engine[1566]: I20250909 05:35:10.676772  1566 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Sep 9 05:35:10.676808 update_engine[1566]: I20250909 05:35:10.676793  1566 omaha_request_action.cc:617] Omaha request response:
Sep 9 05:35:10.676808 update_engine[1566]: I20250909 05:35:10.676805  1566 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 9 05:35:10.676930 update_engine[1566]: I20250909 05:35:10.676812  1566 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Sep 9 05:35:10.676930 update_engine[1566]: I20250909 05:35:10.676819  1566 update_attempter.cc:306] Processing Done.
Sep 9 05:35:10.676930 update_engine[1566]: I20250909 05:35:10.676827  1566 update_attempter.cc:310] Error event sent.
Sep 9 05:35:10.676930 update_engine[1566]: I20250909 05:35:10.676839  1566 update_check_scheduler.cc:74] Next update check in 43m47s
Sep 9 05:35:10.695575 locksmithd[1602]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Sep 9 05:35:10.695575 locksmithd[1602]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Sep 9 05:35:13.473778 systemd[1]: Started sshd@19-65.109.237.121:22-147.75.109.163:39848.service - OpenSSH per-connection server daemon (147.75.109.163:39848).
Sep 9 05:35:14.589112 sshd[6246]: Accepted publickey for core from 147.75.109.163 port 39848 ssh2: RSA SHA256:gIdaQphdIUmL/S12h2lteT3SgsxGhJQUaECTBDv479k
Sep 9 05:35:14.594872 sshd-session[6246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 05:35:14.601467 systemd-logind[1561]: New session 20 of user core.
Sep 9 05:35:14.614231 systemd[1]: Started session-20.scope - Session 20 of User core.
Sep 9 05:35:15.513135 sshd[6251]: Connection closed by 147.75.109.163 port 39848
Sep 9 05:35:15.514215 sshd-session[6246]: pam_unix(sshd:session): session closed for user core
Sep 9 05:35:15.521956 systemd[1]: sshd@19-65.109.237.121:22-147.75.109.163:39848.service: Deactivated successfully.
Sep 9 05:35:15.524901 systemd[1]: session-20.scope: Deactivated successfully.
Sep 9 05:35:15.526098 systemd-logind[1561]: Session 20 logged out. Waiting for processes to exit.
Sep 9 05:35:15.528972 systemd-logind[1561]: Removed session 20.
Sep 9 05:35:25.148041 containerd[1581]: time="2025-09-09T05:35:25.117955859Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ca00fe63bc3e095a74429850e5ca064876aa8f1b40f54083e292513ff55a98d\" id:\"8d22a05aa029c42e40374512cb5255292ef154b4731c65d9d4a7eaf6102d1a8e\" pid:6276 exited_at:{seconds:1757396125 nanos:53269655}"
Sep 9 05:35:27.056351 containerd[1581]: time="2025-09-09T05:35:27.056259365Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d1e3944db8f08f00e2b51210e431ebd864b9c134e77e71da4ed6543ed4893210\" id:\"e664a3348e25c0f3a01fca00226d63e1846aa4ac7f0857793a8197815ab6896d\" pid:6301 exited_at:{seconds:1757396127 nanos:55637280}"
Sep 9 05:35:31.165825 systemd[1]: cri-containerd-585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8.scope: Deactivated successfully.
Sep 9 05:35:31.166299 systemd[1]: cri-containerd-585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8.scope: Consumed 14.365s CPU time, 106.8M memory peak, 80.7M read from disk.
Sep 9 05:35:31.223884 containerd[1581]: time="2025-09-09T05:35:31.223656820Z" level=info msg="TaskExit event in podsandbox handler container_id:\"566cca73a2b2266c901de69380e75a4cf4be1064d6904ff83867b8564ba39a07\" id:\"89e43d27294b4bcd948d531f749391ec22b43a99cc45f4f5b19c9bd899e95b66\" pid:6323 exited_at:{seconds:1757396131 nanos:221252074}"
Sep 9 05:35:31.328611 containerd[1581]: time="2025-09-09T05:35:31.328516788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8\" id:\"585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8\" pid:3100 exit_status:1 exited_at:{seconds:1757396131 nanos:262973339}"
Sep 9 05:35:31.338724 containerd[1581]: time="2025-09-09T05:35:31.338633049Z" level=info msg="received exit event container_id:\"585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8\" id:\"585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8\" pid:3100 exit_status:1 exited_at:{seconds:1757396131 nanos:262973339}"
Sep 9 05:35:31.478240 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8-rootfs.mount: Deactivated successfully.
Sep 9 05:35:31.646543 kubelet[2758]: E0909 05:35:31.632427    2758 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:60516->10.0.0.2:2379: read: connection timed out"
Sep 9 05:35:32.014480 kubelet[2758]: I0909 05:35:32.014427    2758 scope.go:117] "RemoveContainer" containerID="585e5b3288664dc96345a2ee4fc94adda09030709edd25be59c7e9eb6919f2c8"
Sep 9 05:35:32.085469 containerd[1581]: time="2025-09-09T05:35:32.085387553Z" level=info msg="CreateContainer within sandbox \"f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 9 05:35:32.209085 containerd[1581]: time="2025-09-09T05:35:32.207007334Z" level=info msg="Container 6fa2975cd565235d80faf2f0519c12279982e01d059ccc1727a54b862a7e3416: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:35:32.211155 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1349011897.mount: Deactivated successfully.
Sep 9 05:35:32.225558 containerd[1581]: time="2025-09-09T05:35:32.225521198Z" level=info msg="CreateContainer within sandbox \"f48115aad17413e2635af8e47e2d02d04c9475a55095d71f85603ca174d473b0\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"6fa2975cd565235d80faf2f0519c12279982e01d059ccc1727a54b862a7e3416\""
Sep 9 05:35:32.228503 containerd[1581]: time="2025-09-09T05:35:32.228474487Z" level=info msg="StartContainer for \"6fa2975cd565235d80faf2f0519c12279982e01d059ccc1727a54b862a7e3416\""
Sep 9 05:35:32.243338 containerd[1581]: time="2025-09-09T05:35:32.243301827Z" level=info msg="connecting to shim 6fa2975cd565235d80faf2f0519c12279982e01d059ccc1727a54b862a7e3416" address="unix:///run/containerd/s/ce089228e9cea4860f72bac773f0eb35f49c9e7870814f4c9d8ca14675dddc96" protocol=ttrpc version=3
Sep 9 05:35:32.254139 systemd[1]: cri-containerd-b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0.scope: Deactivated successfully.
Sep 9 05:35:32.255086 systemd[1]: cri-containerd-b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0.scope: Consumed 2.899s CPU time, 86.1M memory peak, 105.4M read from disk.
Sep 9 05:35:32.262487 containerd[1581]: time="2025-09-09T05:35:32.262251300Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0\" id:\"b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0\" pid:2611 exit_status:1 exited_at:{seconds:1757396132 nanos:261630148}"
Sep 9 05:35:32.263046 containerd[1581]: time="2025-09-09T05:35:32.262852515Z" level=info msg="received exit event container_id:\"b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0\" id:\"b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0\" pid:2611 exit_status:1 exited_at:{seconds:1757396132 nanos:261630148}"
Sep 9 05:35:32.289597 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0-rootfs.mount: Deactivated successfully.
Sep 9 05:35:32.304167 systemd[1]: Started cri-containerd-6fa2975cd565235d80faf2f0519c12279982e01d059ccc1727a54b862a7e3416.scope - libcontainer container 6fa2975cd565235d80faf2f0519c12279982e01d059ccc1727a54b862a7e3416.
Sep 9 05:35:32.346959 containerd[1581]: time="2025-09-09T05:35:32.346644576Z" level=info msg="StartContainer for \"6fa2975cd565235d80faf2f0519c12279982e01d059ccc1727a54b862a7e3416\" returns successfully"
Sep 9 05:35:32.990411 kubelet[2758]: I0909 05:35:32.990388    2758 scope.go:117] "RemoveContainer" containerID="b12e25a2ca327422a4a107c07a8a366ba86c1ba9490ae5dce4055db42f516aa0"
Sep 9 05:35:32.992380 containerd[1581]: time="2025-09-09T05:35:32.992326159Z" level=info msg="CreateContainer within sandbox \"507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 9 05:35:33.009049 containerd[1581]: time="2025-09-09T05:35:33.008856918Z" level=info msg="Container 7994d50d5b3b5f5a4c81598b762337e140dc4fd942a52b5147bd3e635771382b: CDI devices from CRI Config.CDIDevices: []"
Sep 9 05:35:33.010917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3534694574.mount: Deactivated successfully.
Sep 9 05:35:33.017309 containerd[1581]: time="2025-09-09T05:35:33.017262065Z" level=info msg="CreateContainer within sandbox \"507bd45eb1be74e903681327e950b35ded267a163908115371fd8e936b20c0d6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"7994d50d5b3b5f5a4c81598b762337e140dc4fd942a52b5147bd3e635771382b\""
Sep 9 05:35:33.018098 containerd[1581]: time="2025-09-09T05:35:33.018058670Z" level=info msg="StartContainer for \"7994d50d5b3b5f5a4c81598b762337e140dc4fd942a52b5147bd3e635771382b\""
Sep 9 05:35:33.019072 containerd[1581]: time="2025-09-09T05:35:33.019009751Z" level=info msg="connecting to shim 7994d50d5b3b5f5a4c81598b762337e140dc4fd942a52b5147bd3e635771382b" address="unix:///run/containerd/s/64d49d464caec475b2c106644615aac6eea6c2985e0a6e1c3dd655a7dd2b1524" protocol=ttrpc version=3
Sep 9 05:35:33.040186 systemd[1]: Started cri-containerd-7994d50d5b3b5f5a4c81598b762337e140dc4fd942a52b5147bd3e635771382b.scope - libcontainer container 7994d50d5b3b5f5a4c81598b762337e140dc4fd942a52b5147bd3e635771382b.
Sep 9 05:35:33.104613 containerd[1581]: time="2025-09-09T05:35:33.104478035Z" level=info msg="StartContainer for \"7994d50d5b3b5f5a4c81598b762337e140dc4fd942a52b5147bd3e635771382b\" returns successfully"
Sep 9 05:35:35.821767 kubelet[2758]: E0909 05:35:35.812672    2758 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:60374->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4452-0-0-n-de00512edc.18638677029c93fb  kube-system    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4452-0-0-n-de00512edc,UID:ea2e3e2e3f0d27b731b4b6cda3d70855,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4452-0-0-n-de00512edc,},FirstTimestamp:2025-09-09 05:35:25.307671547 +0000 UTC m=+166.650859133,LastTimestamp:2025-09-09 05:35:25.307671547 +0000 UTC m=+166.650859133,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4452-0-0-n-de00512edc,}"