Sep 16 04:52:59.809056 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Sep 16 03:05:42 -00 2025
Sep 16 04:52:59.809079 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06
Sep 16 04:52:59.809089 kernel: BIOS-provided physical RAM map:
Sep 16 04:52:59.809095 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 16 04:52:59.809101 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 16 04:52:59.809107 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 16 04:52:59.809115 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Sep 16 04:52:59.809121 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Sep 16 04:52:59.809127 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Sep 16 04:52:59.809133 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Sep 16 04:52:59.809139 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 16 04:52:59.809145 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 16 04:52:59.809151 kernel: NX (Execute Disable) protection: active
Sep 16 04:52:59.809157 kernel: APIC: Static calls initialized
Sep 16 04:52:59.809165 kernel: SMBIOS 2.8 present.
Sep 16 04:52:59.809172 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Sep 16 04:52:59.809178 kernel: DMI: Memory slots populated: 1/1
Sep 16 04:52:59.809185 kernel: Hypervisor detected: KVM
Sep 16 04:52:59.809191 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 16 04:52:59.809197 kernel: kvm-clock: using sched offset of 4603343649 cycles
Sep 16 04:52:59.809204 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 16 04:52:59.809211 kernel: tsc: Detected 2445.406 MHz processor
Sep 16 04:52:59.809219 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 16 04:52:59.809226 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 16 04:52:59.809233 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Sep 16 04:52:59.809239 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 16 04:52:59.809246 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Sep 16 04:52:59.809253 kernel: Using GB pages for direct mapping
Sep 16 04:52:59.809259 kernel: ACPI: Early table checksum verification disabled
Sep 16 04:52:59.809266 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Sep 16 04:52:59.809272 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:52:59.809280 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:52:59.809287 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:52:59.809293 kernel: ACPI: FACS 0x000000007CFE0000 000040
Sep 16 04:52:59.809300 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:52:59.809318 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:52:59.809325 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:52:59.809331 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:52:59.809338 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Sep 16 04:52:59.809346 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Sep 16 04:52:59.809355 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Sep 16 04:52:59.809362 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Sep 16 04:52:59.809369 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Sep 16 04:52:59.809376 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Sep 16 04:52:59.809383 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Sep 16 04:52:59.809391 kernel: No NUMA configuration found
Sep 16 04:52:59.809398 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Sep 16 04:52:59.809405 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Sep 16 04:52:59.809412 kernel: Zone ranges:
Sep 16 04:52:59.809419 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Sep 16 04:52:59.809425 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Sep 16 04:52:59.809432 kernel: Normal empty
Sep 16 04:52:59.809439 kernel: Device empty
Sep 16 04:52:59.809445 kernel: Movable zone start for each node
Sep 16 04:52:59.809452 kernel: Early memory node ranges
Sep 16 04:52:59.809460 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Sep 16 04:52:59.809467 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Sep 16 04:52:59.809474 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Sep 16 04:52:59.809481 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 16 04:52:59.809487 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 16 04:52:59.809494 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Sep 16 04:52:59.809501 kernel: ACPI: PM-Timer IO Port: 0x608
Sep 16 04:52:59.810463 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 16 04:52:59.810472 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 16 04:52:59.810482 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 16 04:52:59.810489 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 16 04:52:59.810496 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 16 04:52:59.810503 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 16 04:52:59.810547 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 16 04:52:59.810554 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 16 04:52:59.810561 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Sep 16 04:52:59.810568 kernel: CPU topo: Max. logical packages: 1
Sep 16 04:52:59.810574 kernel: CPU topo: Max. logical dies: 1
Sep 16 04:52:59.810583 kernel: CPU topo: Max. dies per package: 1
Sep 16 04:52:59.810591 kernel: CPU topo: Max. threads per core: 1
Sep 16 04:52:59.810597 kernel: CPU topo: Num. cores per package: 2
Sep 16 04:52:59.810604 kernel: CPU topo: Num. threads per package: 2
Sep 16 04:52:59.810611 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Sep 16 04:52:59.810618 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 16 04:52:59.810625 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Sep 16 04:52:59.810631 kernel: Booting paravirtualized kernel on KVM
Sep 16 04:52:59.810638 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 16 04:52:59.810647 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Sep 16 04:52:59.810654 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Sep 16 04:52:59.810661 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Sep 16 04:52:59.810667 kernel: pcpu-alloc: [0] 0 1
Sep 16 04:52:59.810674 kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 16 04:52:59.810682 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06
Sep 16 04:52:59.810690 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 16 04:52:59.810697 kernel: random: crng init done
Sep 16 04:52:59.810704 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 16 04:52:59.810713 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 16 04:52:59.810719 kernel: Fallback order for Node 0: 0
Sep 16 04:52:59.810726 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Sep 16 04:52:59.810733 kernel: Policy zone: DMA32
Sep 16 04:52:59.810740 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 16 04:52:59.810747 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 16 04:52:59.810753 kernel: ftrace: allocating 40125 entries in 157 pages
Sep 16 04:52:59.810760 kernel: ftrace: allocated 157 pages with 5 groups
Sep 16 04:52:59.810767 kernel: Dynamic Preempt: voluntary
Sep 16 04:52:59.810775 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 16 04:52:59.810783 kernel: rcu: RCU event tracing is enabled.
Sep 16 04:52:59.810790 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 16 04:52:59.810797 kernel: Trampoline variant of Tasks RCU enabled.
Sep 16 04:52:59.810804 kernel: Rude variant of Tasks RCU enabled.
Sep 16 04:52:59.810811 kernel: Tracing variant of Tasks RCU enabled.
Sep 16 04:52:59.810818 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 16 04:52:59.810825 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 16 04:52:59.810832 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 16 04:52:59.810840 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 16 04:52:59.810847 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 16 04:52:59.810854 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Sep 16 04:52:59.810861 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 16 04:52:59.810867 kernel: Console: colour VGA+ 80x25
Sep 16 04:52:59.810874 kernel: printk: legacy console [tty0] enabled
Sep 16 04:52:59.810881 kernel: printk: legacy console [ttyS0] enabled
Sep 16 04:52:59.810888 kernel: ACPI: Core revision 20240827
Sep 16 04:52:59.810895 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Sep 16 04:52:59.810908 kernel: APIC: Switch to symmetric I/O mode setup
Sep 16 04:52:59.810915 kernel: x2apic enabled
Sep 16 04:52:59.810923 kernel: APIC: Switched APIC routing to: physical x2apic
Sep 16 04:52:59.810931 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Sep 16 04:52:59.810939 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Sep 16 04:52:59.810946 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Sep 16 04:52:59.810953 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 16 04:52:59.810960 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 16 04:52:59.810968 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 16 04:52:59.810976 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 16 04:52:59.810984 kernel: Spectre V2 : Mitigation: Retpolines
Sep 16 04:52:59.810991 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 16 04:52:59.810998 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 16 04:52:59.811005 kernel: active return thunk: retbleed_return_thunk
Sep 16 04:52:59.811012 kernel: RETBleed: Mitigation: untrained return thunk
Sep 16 04:52:59.811020 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 16 04:52:59.811028 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 16 04:52:59.811036 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 16 04:52:59.811043 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 16 04:52:59.811050 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 16 04:52:59.811057 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Sep 16 04:52:59.811064 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 16 04:52:59.811072 kernel: Freeing SMP alternatives memory: 32K
Sep 16 04:52:59.811079 kernel: pid_max: default: 32768 minimum: 301
Sep 16 04:52:59.811086 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 16 04:52:59.811094 kernel: landlock: Up and running.
Sep 16 04:52:59.811102 kernel: SELinux: Initializing.
Sep 16 04:52:59.811109 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 16 04:52:59.811116 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Sep 16 04:52:59.811124 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 16 04:52:59.811131 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 16 04:52:59.811138 kernel: ... version: 0
Sep 16 04:52:59.811145 kernel: ... bit width: 48
Sep 16 04:52:59.811152 kernel: ... generic registers: 6
Sep 16 04:52:59.811160 kernel: ... value mask: 0000ffffffffffff
Sep 16 04:52:59.811167 kernel: ... max period: 00007fffffffffff
Sep 16 04:52:59.811174 kernel: ... fixed-purpose events: 0
Sep 16 04:52:59.811182 kernel: ... event mask: 000000000000003f
Sep 16 04:52:59.811189 kernel: signal: max sigframe size: 1776
Sep 16 04:52:59.811196 kernel: rcu: Hierarchical SRCU implementation.
Sep 16 04:52:59.811203 kernel: rcu: Max phase no-delay instances is 400.
Sep 16 04:52:59.811211 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 16 04:52:59.811218 kernel: smp: Bringing up secondary CPUs ...
Sep 16 04:52:59.811226 kernel: smpboot: x86: Booting SMP configuration:
Sep 16 04:52:59.811233 kernel: .... node #0, CPUs: #1
Sep 16 04:52:59.811241 kernel: smp: Brought up 1 node, 2 CPUs
Sep 16 04:52:59.811248 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Sep 16 04:52:59.811255 kernel: Memory: 1917788K/2047464K available (14336K kernel code, 2432K rwdata, 9992K rodata, 54096K init, 2868K bss, 125140K reserved, 0K cma-reserved)
Sep 16 04:52:59.811263 kernel: devtmpfs: initialized
Sep 16 04:52:59.811270 kernel: x86/mm: Memory block size: 128MB
Sep 16 04:52:59.811278 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 16 04:52:59.811285 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 16 04:52:59.811294 kernel: pinctrl core: initialized pinctrl subsystem
Sep 16 04:52:59.811301 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 16 04:52:59.811316 kernel: audit: initializing netlink subsys (disabled)
Sep 16 04:52:59.811324 kernel: audit: type=2000 audit(1757998377.359:1): state=initialized audit_enabled=0 res=1
Sep 16 04:52:59.811331 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 16 04:52:59.811338 kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 16 04:52:59.811345 kernel: cpuidle: using governor menu
Sep 16 04:52:59.811352 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 16 04:52:59.811359 kernel: dca service started, version 1.12.1
Sep 16 04:52:59.811368 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Sep 16 04:52:59.811375 kernel: PCI: Using configuration type 1 for base access
Sep 16 04:52:59.811383 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 16 04:52:59.811390 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 16 04:52:59.811397 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 16 04:52:59.811404 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 16 04:52:59.811411 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 16 04:52:59.811418 kernel: ACPI: Added _OSI(Module Device)
Sep 16 04:52:59.811425 kernel: ACPI: Added _OSI(Processor Device)
Sep 16 04:52:59.811434 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 16 04:52:59.811442 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 16 04:52:59.811449 kernel: ACPI: Interpreter enabled
Sep 16 04:52:59.811456 kernel: ACPI: PM: (supports S0 S5)
Sep 16 04:52:59.811463 kernel: ACPI: Using IOAPIC for interrupt routing
Sep 16 04:52:59.811470 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 16 04:52:59.811477 kernel: PCI: Using E820 reservations for host bridge windows
Sep 16 04:52:59.811484 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Sep 16 04:52:59.811492 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 16 04:52:59.811662 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 16 04:52:59.811747 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Sep 16 04:52:59.811820 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Sep 16 04:52:59.811830 kernel: PCI host bridge to bus 0000:00
Sep 16 04:52:59.811907 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Sep 16 04:52:59.811974 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Sep 16 04:52:59.812037 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 16 04:52:59.812103 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Sep 16 04:52:59.812165 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 16 04:52:59.812226 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Sep 16 04:52:59.812287 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 16 04:52:59.812392 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Sep 16 04:52:59.812480 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 16 04:52:59.813345 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Sep 16 04:52:59.813432 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Sep 16 04:52:59.813538 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Sep 16 04:52:59.813633 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Sep 16 04:52:59.813710 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 16 04:52:59.813800 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.813873 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Sep 16 04:52:59.813954 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 16 04:52:59.814027 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 16 04:52:59.814097 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 16 04:52:59.814176 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.814433 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Sep 16 04:52:59.814775 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 16 04:52:59.814854 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 16 04:52:59.814933 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 16 04:52:59.815012 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.815086 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Sep 16 04:52:59.815207 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 16 04:52:59.815297 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 16 04:52:59.815392 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 16 04:52:59.815471 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.816210 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Sep 16 04:52:59.816295 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 16 04:52:59.816395 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 16 04:52:59.816469 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 16 04:52:59.817041 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.817133 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Sep 16 04:52:59.817210 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 16 04:52:59.817291 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 16 04:52:59.817379 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 16 04:52:59.817462 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.820232 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Sep 16 04:52:59.820333 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 16 04:52:59.820410 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 16 04:52:59.820485 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 16 04:52:59.820610 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.820688 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Sep 16 04:52:59.820761 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 16 04:52:59.820835 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 16 04:52:59.820951 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 16 04:52:59.821127 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.821220 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Sep 16 04:52:59.821300 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 16 04:52:59.821403 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 16 04:52:59.821478 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 16 04:52:59.821594 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:52:59.821673 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Sep 16 04:52:59.821748 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 16 04:52:59.821828 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 16 04:52:59.821901 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 16 04:52:59.821982 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Sep 16 04:52:59.822056 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Sep 16 04:52:59.822136 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Sep 16 04:52:59.822210 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Sep 16 04:52:59.822282 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Sep 16 04:52:59.822388 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Sep 16 04:52:59.822465 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Sep 16 04:52:59.823642 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 16 04:52:59.823733 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Sep 16 04:52:59.823811 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Sep 16 04:52:59.823886 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Sep 16 04:52:59.823959 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 16 04:52:59.824048 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 16 04:52:59.824124 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Sep 16 04:52:59.824197 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 16 04:52:59.824280 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 16 04:52:59.824376 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Sep 16 04:52:59.824452 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Sep 16 04:52:59.825593 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 16 04:52:59.825701 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 16 04:52:59.825784 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Sep 16 04:52:59.825858 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 16 04:52:59.825942 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 16 04:52:59.826020 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Sep 16 04:52:59.826095 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 16 04:52:59.826184 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 16 04:52:59.826260 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Sep 16 04:52:59.826353 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Sep 16 04:52:59.826426 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 16 04:52:59.826437 kernel: acpiphp: Slot [0] registered
Sep 16 04:52:59.828570 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 16 04:52:59.828672 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Sep 16 04:52:59.828759 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Sep 16 04:52:59.828836 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Sep 16 04:52:59.828911 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 16 04:52:59.828923 kernel: acpiphp: Slot [0-2] registered
Sep 16 04:52:59.828995 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 16 04:52:59.829006 kernel: acpiphp: Slot [0-3] registered
Sep 16 04:52:59.829076 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 16 04:52:59.829090 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 16 04:52:59.829098 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 16 04:52:59.829106 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 16 04:52:59.829113 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 16 04:52:59.829120 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Sep 16 04:52:59.829127 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Sep 16 04:52:59.829134 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Sep 16 04:52:59.829142 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Sep 16 04:52:59.829149 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Sep 16 04:52:59.829158 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Sep 16 04:52:59.829165 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Sep 16 04:52:59.829172 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Sep 16 04:52:59.829179 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Sep 16 04:52:59.829187 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Sep 16 04:52:59.829194 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Sep 16 04:52:59.829202 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Sep 16 04:52:59.829209 kernel: iommu: Default domain type: Translated
Sep 16 04:52:59.829216 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 16 04:52:59.829225 kernel: PCI: Using ACPI for IRQ routing
Sep 16 04:52:59.829232 kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 16 04:52:59.829239 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 16 04:52:59.829247 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Sep 16 04:52:59.829337 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Sep 16 04:52:59.829414 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Sep 16 04:52:59.831610 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 16 04:52:59.831626 kernel: vgaarb: loaded
Sep 16 04:52:59.831635 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Sep 16 04:52:59.831647 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Sep 16 04:52:59.831654 kernel: clocksource: Switched to clocksource kvm-clock
Sep 16 04:52:59.831662 kernel: VFS: Disk quotas dquot_6.6.0
Sep 16 04:52:59.831669 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 16 04:52:59.831677 kernel: pnp: PnP ACPI init
Sep 16 04:52:59.831764 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Sep 16 04:52:59.831777 kernel: pnp: PnP ACPI: found 5 devices
Sep 16 04:52:59.831785 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 16 04:52:59.831795 kernel: NET: Registered PF_INET protocol family
Sep 16 04:52:59.831802 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 16 04:52:59.831810 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Sep 16 04:52:59.831818 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 16 04:52:59.831825 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 16 04:52:59.831832 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Sep 16 04:52:59.831840 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Sep 16 04:52:59.831847 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 16 04:52:59.831854 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Sep 16 04:52:59.831864 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 16 04:52:59.831871 kernel: NET: Registered PF_XDP protocol family
Sep 16 04:52:59.831948 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 16 04:52:59.832024 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 16 04:52:59.832099 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 16 04:52:59.832173 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Sep 16 04:52:59.832246 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Sep 16 04:52:59.832337 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Sep 16 04:52:59.832417 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 16 04:52:59.832558 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Sep 16 04:52:59.832657 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 16 04:52:59.832733 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 16 04:52:59.832808 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Sep 16 04:52:59.832880 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 16 04:52:59.832955 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 16 04:52:59.833026 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Sep 16 04:52:59.833097 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 16 04:52:59.833168 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 16 04:52:59.833244 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Sep 16 04:52:59.833329 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 16 04:52:59.833407 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 16 04:52:59.833478 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Sep 16 04:52:59.833571 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 16 04:52:59.833633 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 16 04:52:59.833695 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Sep 16 04:52:59.833754 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 16 04:52:59.833813 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 16 04:52:59.833870 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Sep 16 04:52:59.833930 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Sep 16 04:52:59.833992 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 16 04:52:59.834050 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 16 04:52:59.834107 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Sep 16 04:52:59.834165 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Sep 16 04:52:59.834222 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 16 04:52:59.834280 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 16 04:52:59.834350 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Sep 16 04:52:59.834409 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Sep 16 04:52:59.834468 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 16 04:52:59.837600 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Sep 16 04:52:59.837681 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Sep 16 04:52:59.837748 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 16 04:52:59.837808 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Sep 16 04:52:59.837866 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Sep 16 04:52:59.837922 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Sep 16 04:52:59.837990 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Sep 16 04:52:59.838049 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Sep 16 04:52:59.838153 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Sep 16 04:52:59.838264 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Sep 16 04:52:59.838388 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Sep 16 04:52:59.838459 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Sep 16 04:52:59.838551 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Sep 16 04:52:59.838613 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Sep 16 04:52:59.838685 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Sep 16 04:52:59.838773 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Sep 16 04:52:59.838842 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Sep 16 04:52:59.838899 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Sep 16 04:52:59.838962 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Sep 16 04:52:59.839061 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Sep 16 04:52:59.839151 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Sep 16 04:52:59.839225 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Sep 16 04:52:59.839339 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Sep 16 04:52:59.839440 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Sep 16 04:52:59.839540 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Sep 16 04:52:59.839619 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Sep 16 04:52:59.839717 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Sep 16 04:52:59.839739 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Sep 16 04:52:59.839751 kernel: PCI: CLS 
0 bytes, default 64 Sep 16 04:52:59.839763 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns Sep 16 04:52:59.839774 kernel: Initialise system trusted keyrings Sep 16 04:52:59.839784 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Sep 16 04:52:59.839791 kernel: Key type asymmetric registered Sep 16 04:52:59.839797 kernel: Asymmetric key parser 'x509' registered Sep 16 04:52:59.839804 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Sep 16 04:52:59.839810 kernel: io scheduler mq-deadline registered Sep 16 04:52:59.839818 kernel: io scheduler kyber registered Sep 16 04:52:59.839824 kernel: io scheduler bfq registered Sep 16 04:52:59.839937 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Sep 16 04:52:59.840034 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Sep 16 04:52:59.840105 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Sep 16 04:52:59.840170 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Sep 16 04:52:59.840234 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Sep 16 04:52:59.840300 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Sep 16 04:52:59.840410 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Sep 16 04:52:59.840552 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Sep 16 04:52:59.840651 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Sep 16 04:52:59.840748 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Sep 16 04:52:59.840861 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Sep 16 04:52:59.840970 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Sep 16 04:52:59.841086 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Sep 16 04:52:59.841176 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Sep 16 04:52:59.841278 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Sep 16 04:52:59.841359 kernel: pcieport 0000:00:02.7: AER: enabled with 
IRQ 31 Sep 16 04:52:59.841372 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Sep 16 04:52:59.841459 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Sep 16 04:52:59.841617 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Sep 16 04:52:59.841634 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Sep 16 04:52:59.841645 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Sep 16 04:52:59.841652 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 16 04:52:59.841658 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Sep 16 04:52:59.841665 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Sep 16 04:52:59.841671 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Sep 16 04:52:59.841677 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Sep 16 04:52:59.841684 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Sep 16 04:52:59.841761 kernel: rtc_cmos 00:03: RTC can wake from S4 Sep 16 04:52:59.841828 kernel: rtc_cmos 00:03: registered as rtc0 Sep 16 04:52:59.841885 kernel: rtc_cmos 00:03: setting system clock to 2025-09-16T04:52:59 UTC (1757998379) Sep 16 04:52:59.841946 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Sep 16 04:52:59.841956 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Sep 16 04:52:59.841963 kernel: NET: Registered PF_INET6 protocol family Sep 16 04:52:59.841969 kernel: Segment Routing with IPv6 Sep 16 04:52:59.841975 kernel: In-situ OAM (IOAM) with IPv6 Sep 16 04:52:59.841981 kernel: NET: Registered PF_PACKET protocol family Sep 16 04:52:59.841988 kernel: Key type dns_resolver registered Sep 16 04:52:59.841996 kernel: IPI shorthand broadcast: enabled Sep 16 04:52:59.842002 kernel: sched_clock: Marking stable (2850209946, 142684754)->(3010283748, -17389048) Sep 16 04:52:59.842009 kernel: registered taskstats version 1 Sep 16 04:52:59.842015 kernel: Loading compiled-in X.509 
certificates Sep 16 04:52:59.842022 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: d1d5b0d56b9b23dabf19e645632ff93bf659b3bf' Sep 16 04:52:59.842028 kernel: Demotion targets for Node 0: null Sep 16 04:52:59.842034 kernel: Key type .fscrypt registered Sep 16 04:52:59.842040 kernel: Key type fscrypt-provisioning registered Sep 16 04:52:59.842046 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 16 04:52:59.842054 kernel: ima: Allocated hash algorithm: sha1 Sep 16 04:52:59.842060 kernel: ima: No architecture policies found Sep 16 04:52:59.842066 kernel: clk: Disabling unused clocks Sep 16 04:52:59.842073 kernel: Warning: unable to open an initial console. Sep 16 04:52:59.842079 kernel: Freeing unused kernel image (initmem) memory: 54096K Sep 16 04:52:59.842086 kernel: Write protecting the kernel read-only data: 24576k Sep 16 04:52:59.842092 kernel: Freeing unused kernel image (rodata/data gap) memory: 248K Sep 16 04:52:59.842098 kernel: Run /init as init process Sep 16 04:52:59.842106 kernel: with arguments: Sep 16 04:52:59.842113 kernel: /init Sep 16 04:52:59.842123 kernel: with environment: Sep 16 04:52:59.842134 kernel: HOME=/ Sep 16 04:52:59.842145 kernel: TERM=linux Sep 16 04:52:59.842156 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 16 04:52:59.842168 systemd[1]: Successfully made /usr/ read-only. Sep 16 04:52:59.842181 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 04:52:59.842195 systemd[1]: Detected virtualization kvm. Sep 16 04:52:59.842207 systemd[1]: Detected architecture x86-64. Sep 16 04:52:59.842218 systemd[1]: Running in initrd. 
Sep 16 04:52:59.842230 systemd[1]: No hostname configured, using default hostname. Sep 16 04:52:59.842242 systemd[1]: Hostname set to . Sep 16 04:52:59.842253 systemd[1]: Initializing machine ID from VM UUID. Sep 16 04:52:59.842259 systemd[1]: Queued start job for default target initrd.target. Sep 16 04:52:59.842271 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:52:59.842285 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:52:59.842299 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Sep 16 04:52:59.842321 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 04:52:59.842333 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Sep 16 04:52:59.842346 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Sep 16 04:52:59.842358 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Sep 16 04:52:59.842366 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Sep 16 04:52:59.842380 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:52:59.842392 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:52:59.842404 systemd[1]: Reached target paths.target - Path Units. Sep 16 04:52:59.842415 systemd[1]: Reached target slices.target - Slice Units. Sep 16 04:52:59.842427 systemd[1]: Reached target swap.target - Swaps. Sep 16 04:52:59.842438 systemd[1]: Reached target timers.target - Timer Units. Sep 16 04:52:59.842450 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Sep 16 04:52:59.842462 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 04:52:59.842474 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Sep 16 04:52:59.842483 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Sep 16 04:52:59.842490 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:52:59.842497 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Sep 16 04:52:59.842522 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:52:59.842529 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 04:52:59.842536 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Sep 16 04:52:59.842555 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 04:52:59.842562 systemd[1]: Finished network-cleanup.service - Network Cleanup. Sep 16 04:52:59.842571 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Sep 16 04:52:59.842578 systemd[1]: Starting systemd-fsck-usr.service... Sep 16 04:52:59.842585 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 04:52:59.842593 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 04:52:59.842600 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:52:59.842606 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Sep 16 04:52:59.842639 systemd-journald[216]: Collecting audit messages is disabled. Sep 16 04:52:59.842659 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:52:59.842668 systemd[1]: Finished systemd-fsck-usr.service. 
Sep 16 04:52:59.842675 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 16 04:52:59.842682 systemd-journald[216]: Journal started Sep 16 04:52:59.842699 systemd-journald[216]: Runtime Journal (/run/log/journal/a9bb6e67bca64e5d8374efd4ced85b45) is 4.8M, max 38.6M, 33.7M free. Sep 16 04:52:59.822618 systemd-modules-load[217]: Inserted module 'overlay' Sep 16 04:52:59.879334 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 04:52:59.879376 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Sep 16 04:52:59.879389 kernel: Bridge firewalling registered Sep 16 04:52:59.850216 systemd-modules-load[217]: Inserted module 'br_netfilter' Sep 16 04:52:59.880403 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 04:52:59.881095 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:52:59.882130 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 04:52:59.884708 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Sep 16 04:52:59.886281 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 04:52:59.889594 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 04:52:59.900980 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 04:52:59.909432 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:52:59.913651 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:52:59.915465 systemd-tmpfiles[234]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
Sep 16 04:52:59.915554 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 04:52:59.918599 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Sep 16 04:52:59.919765 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:52:59.928312 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 04:52:59.938051 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=0b876f86a632750e9937176808a48c2452d5168964273bcfc3c72f2a26140c06 Sep 16 04:52:59.963156 systemd-resolved[256]: Positive Trust Anchors: Sep 16 04:52:59.963174 systemd-resolved[256]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 04:52:59.963208 systemd-resolved[256]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 04:52:59.968265 systemd-resolved[256]: Defaulting to hostname 'linux'. Sep 16 04:52:59.969072 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 04:52:59.969754 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Sep 16 04:52:59.993547 kernel: SCSI subsystem initialized Sep 16 04:53:00.001537 kernel: Loading iSCSI transport class v2.0-870. Sep 16 04:53:00.009549 kernel: iscsi: registered transport (tcp) Sep 16 04:53:00.025763 kernel: iscsi: registered transport (qla4xxx) Sep 16 04:53:00.025808 kernel: QLogic iSCSI HBA Driver Sep 16 04:53:00.042034 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 04:53:00.057348 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:53:00.059906 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 04:53:00.096445 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Sep 16 04:53:00.099026 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Sep 16 04:53:00.146545 kernel: raid6: avx2x4 gen() 34297 MB/s Sep 16 04:53:00.163536 kernel: raid6: avx2x2 gen() 32607 MB/s Sep 16 04:53:00.180652 kernel: raid6: avx2x1 gen() 22256 MB/s Sep 16 04:53:00.180695 kernel: raid6: using algorithm avx2x4 gen() 34297 MB/s Sep 16 04:53:00.198734 kernel: raid6: .... xor() 4648 MB/s, rmw enabled Sep 16 04:53:00.198784 kernel: raid6: using avx2x2 recovery algorithm Sep 16 04:53:00.215541 kernel: xor: automatically using best checksumming function avx Sep 16 04:53:00.340557 kernel: Btrfs loaded, zoned=no, fsverity=no Sep 16 04:53:00.346669 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Sep 16 04:53:00.348742 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:53:00.373126 systemd-udevd[465]: Using default interface naming scheme 'v255'. Sep 16 04:53:00.377739 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:53:00.379756 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Sep 16 04:53:00.404024 dracut-pre-trigger[471]: rd.md=0: removing MD RAID activation Sep 16 04:53:00.430439 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 04:53:00.433691 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Sep 16 04:53:00.503404 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:53:00.506759 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Sep 16 04:53:00.559531 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Sep 16 04:53:00.576538 kernel: ACPI: bus type USB registered Sep 16 04:53:00.576584 kernel: usbcore: registered new interface driver usbfs Sep 16 04:53:00.578669 kernel: scsi host0: Virtio SCSI HBA Sep 16 04:53:00.578816 kernel: usbcore: registered new interface driver hub Sep 16 04:53:00.581066 kernel: usbcore: registered new device driver usb Sep 16 04:53:00.598541 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 16 04:53:00.599559 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Sep 16 04:53:00.599598 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Sep 16 04:53:00.606616 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 16 04:53:00.609692 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 16 04:53:00.609824 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Sep 16 04:53:00.612381 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Sep 16 04:53:00.614541 kernel: cryptd: max_cpu_qlen set to 1000 Sep 16 04:53:00.614556 kernel: hub 1-0:1.0: USB hub found Sep 16 04:53:00.614675 kernel: hub 1-0:1.0: 4 ports detected Sep 16 04:53:00.617764 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Sep 16 04:53:00.622841 kernel: hub 2-0:1.0: USB hub found Sep 16 04:53:00.622951 kernel: hub 2-0:1.0: 4 ports detected Sep 16 04:53:00.637535 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Sep 16 04:53:00.639524 kernel: AES CTR mode by8 optimization enabled Sep 16 04:53:00.651667 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:53:00.651776 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:53:00.673764 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:53:00.679715 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:53:00.685117 kernel: sd 0:0:0:0: Power-on or device reset occurred Sep 16 04:53:00.688530 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Sep 16 04:53:00.688648 kernel: libata version 3.00 loaded. Sep 16 04:53:00.690690 kernel: sd 0:0:0:0: [sda] Write Protect is off Sep 16 04:53:00.690809 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Sep 16 04:53:00.693559 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 16 04:53:00.703897 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 16 04:53:00.703951 kernel: GPT:17805311 != 80003071 Sep 16 04:53:00.703960 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 16 04:53:00.705480 kernel: GPT:17805311 != 80003071 Sep 16 04:53:00.705529 kernel: GPT: Use GNU Parted to correct GPT errors. 
Sep 16 04:53:00.705538 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:53:00.708552 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Sep 16 04:53:00.720543 kernel: ahci 0000:00:1f.2: version 3.0 Sep 16 04:53:00.720788 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Sep 16 04:53:00.721634 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Sep 16 04:53:00.721778 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Sep 16 04:53:00.721870 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Sep 16 04:53:00.725565 kernel: scsi host1: ahci Sep 16 04:53:00.726669 kernel: scsi host2: ahci Sep 16 04:53:00.726791 kernel: scsi host3: ahci Sep 16 04:53:00.726876 kernel: scsi host4: ahci Sep 16 04:53:00.727534 kernel: scsi host5: ahci Sep 16 04:53:00.727637 kernel: scsi host6: ahci Sep 16 04:53:00.728533 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 49 lpm-pol 1 Sep 16 04:53:00.728555 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 49 lpm-pol 1 Sep 16 04:53:00.728570 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 49 lpm-pol 1 Sep 16 04:53:00.728578 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 49 lpm-pol 1 Sep 16 04:53:00.728585 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 49 lpm-pol 1 Sep 16 04:53:00.728593 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 49 lpm-pol 1 Sep 16 04:53:00.771672 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:53:00.793850 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Sep 16 04:53:00.800161 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Sep 16 04:53:00.801427 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. 
Sep 16 04:53:00.809256 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Sep 16 04:53:00.816752 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 16 04:53:00.817941 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 16 04:53:00.846224 disk-uuid[632]: Primary Header is updated. Sep 16 04:53:00.846224 disk-uuid[632]: Secondary Entries is updated. Sep 16 04:53:00.846224 disk-uuid[632]: Secondary Header is updated. Sep 16 04:53:00.861529 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 16 04:53:00.995547 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 16 04:53:01.042523 kernel: ata4: SATA link down (SStatus 0 SControl 300) Sep 16 04:53:01.042584 kernel: ata2: SATA link down (SStatus 0 SControl 300) Sep 16 04:53:01.042594 kernel: ata5: SATA link down (SStatus 0 SControl 300) Sep 16 04:53:01.045180 kernel: ata6: SATA link down (SStatus 0 SControl 300) Sep 16 04:53:01.045220 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Sep 16 04:53:01.045514 kernel: ata3: SATA link down (SStatus 0 SControl 300) Sep 16 04:53:01.047544 kernel: ata1.00: LPM support broken, forcing max_power Sep 16 04:53:01.048763 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Sep 16 04:53:01.048789 kernel: ata1.00: applying bridge limits Sep 16 04:53:01.050734 kernel: ata1.00: LPM support broken, forcing max_power Sep 16 04:53:01.050759 kernel: ata1.00: configured for UDMA/100 Sep 16 04:53:01.052535 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Sep 16 04:53:01.073364 kernel: usbcore: registered new interface driver usbhid Sep 16 04:53:01.073424 kernel: usbhid: USB HID core driver Sep 16 04:53:01.081354 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Sep 16 04:53:01.081422 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse 
[QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Sep 16 04:53:01.089870 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Sep 16 04:53:01.090054 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 16 04:53:01.109536 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Sep 16 04:53:01.383030 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 16 04:53:01.384830 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:53:01.385555 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:53:01.386945 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:53:01.389027 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Sep 16 04:53:01.403110 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 16 04:53:01.895591 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:53:01.896053 disk-uuid[634]: The operation has completed successfully. Sep 16 04:53:01.948676 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 16 04:53:01.948767 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 16 04:53:01.981928 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 16 04:53:02.000137 sh[660]: Success Sep 16 04:53:02.016935 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 16 04:53:02.016982 kernel: device-mapper: uevent: version 1.0.3 Sep 16 04:53:02.016994 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 16 04:53:02.027527 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Sep 16 04:53:02.078148 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 16 04:53:02.081580 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Sep 16 04:53:02.092519 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 16 04:53:02.106532 kernel: BTRFS: device fsid f1b91845-3914-4d21-a370-6d760ee45b2e devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (672) Sep 16 04:53:02.111673 kernel: BTRFS info (device dm-0): first mount of filesystem f1b91845-3914-4d21-a370-6d760ee45b2e Sep 16 04:53:02.111710 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Sep 16 04:53:02.124485 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 16 04:53:02.124557 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 16 04:53:02.124570 kernel: BTRFS info (device dm-0): enabling free space tree Sep 16 04:53:02.128058 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 16 04:53:02.129094 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:53:02.129762 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 16 04:53:02.130441 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 16 04:53:02.133606 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 16 04:53:02.187537 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (707) Sep 16 04:53:02.191727 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 04:53:02.191768 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Sep 16 04:53:02.199932 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 04:53:02.199984 kernel: BTRFS info (device sda6): turning on async discard Sep 16 04:53:02.199995 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 04:53:02.205602 kernel: BTRFS info (device sda6): last unmount of filesystem 8b047ef5-4757-404a-b211-2a505a425364 Sep 16 04:53:02.206352 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 16 04:53:02.208151 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 16 04:53:02.219059 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:53:02.221647 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:53:02.257066 systemd-networkd[841]: lo: Link UP Sep 16 04:53:02.257074 systemd-networkd[841]: lo: Gained carrier Sep 16 04:53:02.258711 systemd-networkd[841]: Enumeration completed Sep 16 04:53:02.258855 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 04:53:02.259564 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:53:02.259567 systemd-networkd[841]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:53:02.260152 systemd-networkd[841]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:53:02.260155 systemd-networkd[841]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Sep 16 04:53:02.260813 systemd[1]: Reached target network.target - Network.
Sep 16 04:53:02.261357 systemd-networkd[841]: eth0: Link UP
Sep 16 04:53:02.261475 systemd-networkd[841]: eth1: Link UP
Sep 16 04:53:02.261620 systemd-networkd[841]: eth0: Gained carrier
Sep 16 04:53:02.261628 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 16 04:53:02.265975 systemd-networkd[841]: eth1: Gained carrier
Sep 16 04:53:02.265983 systemd-networkd[841]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 16 04:53:02.289570 systemd-networkd[841]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 16 04:53:02.312990 systemd-networkd[841]: eth0: DHCPv4 address 157.180.68.84/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 16 04:53:02.324888 ignition[832]: Ignition 2.22.0
Sep 16 04:53:02.324901 ignition[832]: Stage: fetch-offline
Sep 16 04:53:02.327035 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 16 04:53:02.324928 ignition[832]: no configs at "/usr/lib/ignition/base.d"
Sep 16 04:53:02.328362 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 16 04:53:02.324934 ignition[832]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 16 04:53:02.325001 ignition[832]: parsed url from cmdline: ""
Sep 16 04:53:02.325003 ignition[832]: no config URL provided
Sep 16 04:53:02.325007 ignition[832]: reading system config file "/usr/lib/ignition/user.ign"
Sep 16 04:53:02.325012 ignition[832]: no config at "/usr/lib/ignition/user.ign"
Sep 16 04:53:02.325018 ignition[832]: failed to fetch config: resource requires networking
Sep 16 04:53:02.325205 ignition[832]: Ignition finished successfully
Sep 16 04:53:02.350431 ignition[851]: Ignition 2.22.0
Sep 16 04:53:02.350440 ignition[851]: Stage: fetch
Sep 16 04:53:02.352626 ignition[851]: no configs at "/usr/lib/ignition/base.d"
Sep 16 04:53:02.352637 ignition[851]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 16 04:53:02.352700 ignition[851]: parsed url from cmdline: ""
Sep 16 04:53:02.352702 ignition[851]: no config URL provided
Sep 16 04:53:02.352706 ignition[851]: reading system config file "/usr/lib/ignition/user.ign"
Sep 16 04:53:02.352711 ignition[851]: no config at "/usr/lib/ignition/user.ign"
Sep 16 04:53:02.352729 ignition[851]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Sep 16 04:53:02.358468 ignition[851]: GET result: OK
Sep 16 04:53:02.358562 ignition[851]: parsing config with SHA512: d7153a4acf84c1c306ffeddd0b0bff7b639b9e3d48c1cdc5f6c52aef0913f5262e2cba5f1f1ff113a9c968284aee8b68aa665b2e1a821d3b8a1aecd3b6e03d8c
Sep 16 04:53:02.362285 unknown[851]: fetched base config from "system"
Sep 16 04:53:02.362297 unknown[851]: fetched base config from "system"
Sep 16 04:53:02.362634 ignition[851]: fetch: fetch complete
Sep 16 04:53:02.362326 unknown[851]: fetched user config from "hetzner"
Sep 16 04:53:02.362639 ignition[851]: fetch: fetch passed
Sep 16 04:53:02.364410 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 16 04:53:02.362676 ignition[851]: Ignition finished successfully
Sep 16 04:53:02.365833 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 16 04:53:02.394931 ignition[857]: Ignition 2.22.0
Sep 16 04:53:02.395573 ignition[857]: Stage: kargs
Sep 16 04:53:02.395712 ignition[857]: no configs at "/usr/lib/ignition/base.d"
Sep 16 04:53:02.395725 ignition[857]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 16 04:53:02.396369 ignition[857]: kargs: kargs passed
Sep 16 04:53:02.396406 ignition[857]: Ignition finished successfully
Sep 16 04:53:02.399076 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 16 04:53:02.400616 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 16 04:53:02.426660 ignition[864]: Ignition 2.22.0
Sep 16 04:53:02.426674 ignition[864]: Stage: disks
Sep 16 04:53:02.426802 ignition[864]: no configs at "/usr/lib/ignition/base.d"
Sep 16 04:53:02.426811 ignition[864]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 16 04:53:02.428534 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 16 04:53:02.427449 ignition[864]: disks: disks passed
Sep 16 04:53:02.430116 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 16 04:53:02.427486 ignition[864]: Ignition finished successfully
Sep 16 04:53:02.430919 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 16 04:53:02.431984 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 16 04:53:02.433216 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 16 04:53:02.434305 systemd[1]: Reached target basic.target - Basic System.
Sep 16 04:53:02.436377 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 16 04:53:02.468949 systemd-fsck[872]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 16 04:53:02.473701 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 16 04:53:02.476130 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 16 04:53:02.597537 kernel: EXT4-fs (sda9): mounted filesystem fb1cb44f-955b-4cd0-8849-33ce3640d547 r/w with ordered data mode. Quota mode: none.
Sep 16 04:53:02.598656 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 16 04:53:02.600159 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 16 04:53:02.603336 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 16 04:53:02.606576 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 16 04:53:02.615606 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 16 04:53:02.617713 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 16 04:53:02.617742 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 16 04:53:02.620050 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 16 04:53:02.628015 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 16 04:53:02.644566 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (880)
Sep 16 04:53:02.649573 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364
Sep 16 04:53:02.652527 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 16 04:53:02.669412 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 16 04:53:02.669454 kernel: BTRFS info (device sda6): turning on async discard
Sep 16 04:53:02.669465 kernel: BTRFS info (device sda6): enabling free space tree
Sep 16 04:53:02.679063 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 16 04:53:02.690569 coreos-metadata[882]: Sep 16 04:53:02.690 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 16 04:53:02.693525 coreos-metadata[882]: Sep 16 04:53:02.693 INFO Fetch successful
Sep 16 04:53:02.695297 coreos-metadata[882]: Sep 16 04:53:02.694 INFO wrote hostname ci-4459-0-0-n-06f2563e85 to /sysroot/etc/hostname
Sep 16 04:53:02.697879 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 16 04:53:02.699980 initrd-setup-root[908]: cut: /sysroot/etc/passwd: No such file or directory
Sep 16 04:53:02.704905 initrd-setup-root[915]: cut: /sysroot/etc/group: No such file or directory
Sep 16 04:53:02.708853 initrd-setup-root[922]: cut: /sysroot/etc/shadow: No such file or directory
Sep 16 04:53:02.712093 initrd-setup-root[929]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 16 04:53:02.794173 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 16 04:53:02.796120 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 16 04:53:02.797893 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 16 04:53:02.812530 kernel: BTRFS info (device sda6): last unmount of filesystem 8b047ef5-4757-404a-b211-2a505a425364
Sep 16 04:53:02.829666 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 16 04:53:02.839449 ignition[996]: INFO : Ignition 2.22.0
Sep 16 04:53:02.839449 ignition[996]: INFO : Stage: mount
Sep 16 04:53:02.841341 ignition[996]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 16 04:53:02.841341 ignition[996]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 16 04:53:02.841341 ignition[996]: INFO : mount: mount passed
Sep 16 04:53:02.841341 ignition[996]: INFO : Ignition finished successfully
Sep 16 04:53:02.841785 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 16 04:53:02.843396 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 16 04:53:03.104777 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 16 04:53:03.106200 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 16 04:53:03.139539 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1008)
Sep 16 04:53:03.143005 kernel: BTRFS info (device sda6): first mount of filesystem 8b047ef5-4757-404a-b211-2a505a425364
Sep 16 04:53:03.143032 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Sep 16 04:53:03.150725 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 16 04:53:03.150750 kernel: BTRFS info (device sda6): turning on async discard
Sep 16 04:53:03.153282 kernel: BTRFS info (device sda6): enabling free space tree
Sep 16 04:53:03.155312 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 16 04:53:03.183539 ignition[1024]: INFO : Ignition 2.22.0
Sep 16 04:53:03.183539 ignition[1024]: INFO : Stage: files
Sep 16 04:53:03.185472 ignition[1024]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 16 04:53:03.185472 ignition[1024]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 16 04:53:03.188376 ignition[1024]: DEBUG : files: compiled without relabeling support, skipping
Sep 16 04:53:03.189589 ignition[1024]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 16 04:53:03.189589 ignition[1024]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 16 04:53:03.194549 ignition[1024]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 16 04:53:03.196058 ignition[1024]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 16 04:53:03.197714 unknown[1024]: wrote ssh authorized keys file for user: core
Sep 16 04:53:03.198984 ignition[1024]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 16 04:53:03.200839 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 16 04:53:03.202571 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Sep 16 04:53:03.385248 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 16 04:53:03.440678 systemd-networkd[841]: eth0: Gained IPv6LL
Sep 16 04:53:03.768751 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Sep 16 04:53:03.768751 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 16 04:53:03.771229 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 16 04:53:03.771229 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 16 04:53:03.771229 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 16 04:53:03.771229 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 16 04:53:03.771229 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 16 04:53:03.771229 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 16 04:53:03.771229 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 16 04:53:03.777458 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 16 04:53:03.777458 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 16 04:53:03.777458 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 16 04:53:03.779915 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 16 04:53:03.779915 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 16 04:53:03.779915 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Sep 16 04:53:04.206692 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 16 04:53:04.337028 systemd-networkd[841]: eth1: Gained IPv6LL
Sep 16 04:53:05.973261 ignition[1024]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Sep 16 04:53:05.973261 ignition[1024]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 16 04:53:05.975584 ignition[1024]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 16 04:53:05.979716 ignition[1024]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 16 04:53:05.979716 ignition[1024]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 16 04:53:05.979716 ignition[1024]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 16 04:53:05.982257 ignition[1024]: INFO : files: files passed
Sep 16 04:53:05.982257 ignition[1024]: INFO : Ignition finished successfully
Sep 16 04:53:05.983599 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 16 04:53:05.988618 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 16 04:53:05.992638 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 16 04:53:06.004981 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 16 04:53:06.005758 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 16 04:53:06.009711 initrd-setup-root-after-ignition[1055]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 16 04:53:06.009711 initrd-setup-root-after-ignition[1055]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 16 04:53:06.012128 initrd-setup-root-after-ignition[1059]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 16 04:53:06.011816 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 16 04:53:06.013008 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 16 04:53:06.014746 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 16 04:53:06.054178 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 16 04:53:06.054286 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 16 04:53:06.055622 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 16 04:53:06.056744 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 16 04:53:06.057949 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 16 04:53:06.059610 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 16 04:53:06.094428 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 16 04:53:06.097039 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 16 04:53:06.122357 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 16 04:53:06.123864 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 16 04:53:06.125283 systemd[1]: Stopped target timers.target - Timer Units.
Sep 16 04:53:06.125938 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 16 04:53:06.126044 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 16 04:53:06.127471 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 16 04:53:06.128295 systemd[1]: Stopped target basic.target - Basic System.
Sep 16 04:53:06.129527 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 16 04:53:06.130607 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 16 04:53:06.131774 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 16 04:53:06.133042 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 16 04:53:06.134373 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 16 04:53:06.135623 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 16 04:53:06.137000 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 16 04:53:06.138210 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 16 04:53:06.139552 systemd[1]: Stopped target swap.target - Swaps.
Sep 16 04:53:06.140656 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 16 04:53:06.140767 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 16 04:53:06.142133 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 16 04:53:06.142972 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 16 04:53:06.144079 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 16 04:53:06.144565 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 16 04:53:06.145358 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 16 04:53:06.145501 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 16 04:53:06.147160 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 16 04:53:06.147323 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 16 04:53:06.148570 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 16 04:53:06.148668 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 16 04:53:06.149918 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 16 04:53:06.150051 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 16 04:53:06.152592 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 16 04:53:06.162328 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 16 04:53:06.162905 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 16 04:53:06.163022 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 16 04:53:06.165741 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 16 04:53:06.165879 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 16 04:53:06.171254 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 16 04:53:06.177879 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 16 04:53:06.185927 ignition[1079]: INFO : Ignition 2.22.0
Sep 16 04:53:06.185927 ignition[1079]: INFO : Stage: umount
Sep 16 04:53:06.189360 ignition[1079]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 16 04:53:06.189360 ignition[1079]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 16 04:53:06.189360 ignition[1079]: INFO : umount: umount passed
Sep 16 04:53:06.189360 ignition[1079]: INFO : Ignition finished successfully
Sep 16 04:53:06.188122 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 16 04:53:06.188211 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 16 04:53:06.189258 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 16 04:53:06.189324 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 16 04:53:06.190079 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 16 04:53:06.190120 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 16 04:53:06.192438 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 16 04:53:06.192476 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 16 04:53:06.194757 systemd[1]: Stopped target network.target - Network.
Sep 16 04:53:06.195919 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 16 04:53:06.195963 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 16 04:53:06.196566 systemd[1]: Stopped target paths.target - Path Units.
Sep 16 04:53:06.199059 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 16 04:53:06.202552 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 16 04:53:06.203734 systemd[1]: Stopped target slices.target - Slice Units.
Sep 16 04:53:06.204692 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 16 04:53:06.205758 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 16 04:53:06.205794 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 16 04:53:06.206897 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 16 04:53:06.206926 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 16 04:53:06.207900 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 16 04:53:06.207948 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 16 04:53:06.208986 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 16 04:53:06.209023 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 16 04:53:06.210049 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 16 04:53:06.211034 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 16 04:53:06.213175 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 16 04:53:06.213782 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 16 04:53:06.213869 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 16 04:53:06.215041 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 16 04:53:06.215128 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 16 04:53:06.218196 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 16 04:53:06.218818 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 16 04:53:06.218882 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 16 04:53:06.219842 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 16 04:53:06.219885 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 16 04:53:06.221910 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 16 04:53:06.224944 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 16 04:53:06.225024 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 16 04:53:06.227403 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 16 04:53:06.227689 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 16 04:53:06.228723 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 16 04:53:06.228751 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 16 04:53:06.230582 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 16 04:53:06.232756 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 16 04:53:06.232796 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 16 04:53:06.233793 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 16 04:53:06.233825 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 16 04:53:06.235916 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 16 04:53:06.235956 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 16 04:53:06.236854 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 16 04:53:06.239693 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 16 04:53:06.245718 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 16 04:53:06.245841 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 16 04:53:06.247113 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 16 04:53:06.247151 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 16 04:53:06.248285 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 16 04:53:06.248309 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 16 04:53:06.249424 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 16 04:53:06.249462 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 16 04:53:06.251122 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 16 04:53:06.251154 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 16 04:53:06.252606 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 16 04:53:06.252644 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 16 04:53:06.255630 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 16 04:53:06.256400 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 16 04:53:06.256449 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 16 04:53:06.259687 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 16 04:53:06.259734 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 16 04:53:06.261581 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 16 04:53:06.261616 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 16 04:53:06.262578 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 16 04:53:06.262617 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 16 04:53:06.263931 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 16 04:53:06.263964 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 16 04:53:06.265009 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 16 04:53:06.265078 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 16 04:53:06.268955 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 16 04:53:06.269017 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 16 04:53:06.271027 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 16 04:53:06.274607 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 16 04:53:06.293164 systemd[1]: Switching root.
Sep 16 04:53:06.327366 systemd-journald[216]: Journal stopped
Sep 16 04:53:07.233295 systemd-journald[216]: Received SIGTERM from PID 1 (systemd).
Sep 16 04:53:07.233343 kernel: SELinux: policy capability network_peer_controls=1
Sep 16 04:53:07.233354 kernel: SELinux: policy capability open_perms=1
Sep 16 04:53:07.233365 kernel: SELinux: policy capability extended_socket_class=1
Sep 16 04:53:07.233373 kernel: SELinux: policy capability always_check_network=0
Sep 16 04:53:07.233380 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 16 04:53:07.233388 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 16 04:53:07.233397 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 16 04:53:07.233405 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 16 04:53:07.233412 kernel: SELinux: policy capability userspace_initial_context=0
Sep 16 04:53:07.233420 kernel: audit: type=1403 audit(1757998386.516:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 16 04:53:07.233437 systemd[1]: Successfully loaded SELinux policy in 61.543ms.
Sep 16 04:53:07.233457 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.424ms.
Sep 16 04:53:07.233466 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 16 04:53:07.233474 systemd[1]: Detected virtualization kvm.
Sep 16 04:53:07.233483 systemd[1]: Detected architecture x86-64.
Sep 16 04:53:07.233492 systemd[1]: Detected first boot.
Sep 16 04:53:07.233501 systemd[1]: Hostname set to .
Sep 16 04:53:07.233537 systemd[1]: Initializing machine ID from VM UUID.
Sep 16 04:53:07.233547 zram_generator::config[1122]: No configuration found.
Sep 16 04:53:07.233557 kernel: Guest personality initialized and is inactive
Sep 16 04:53:07.233565 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Sep 16 04:53:07.233573 kernel: Initialized host personality
Sep 16 04:53:07.233584 kernel: NET: Registered PF_VSOCK protocol family
Sep 16 04:53:07.233594 systemd[1]: Populated /etc with preset unit settings.
Sep 16 04:53:07.233606 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 16 04:53:07.233614 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 16 04:53:07.233622 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 16 04:53:07.233630 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 16 04:53:07.233639 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 16 04:53:07.233647 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 16 04:53:07.233656 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 16 04:53:07.233664 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 16 04:53:07.233674 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 16 04:53:07.233682 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 16 04:53:07.233691 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 16 04:53:07.233701 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 16 04:53:07.233711 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 16 04:53:07.233720 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 16 04:53:07.233730 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 16 04:53:07.233739 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 16 04:53:07.233747 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 16 04:53:07.233756 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 16 04:53:07.233764 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Sep 16 04:53:07.233774 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 16 04:53:07.233782 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 16 04:53:07.233790 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 16 04:53:07.233798 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 16 04:53:07.233806 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 16 04:53:07.233814 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 16 04:53:07.233823 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 16 04:53:07.233831 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 16 04:53:07.233839 systemd[1]: Reached target slices.target - Slice Units.
Sep 16 04:53:07.233848 systemd[1]: Reached target swap.target - Swaps.
Sep 16 04:53:07.233858 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 16 04:53:07.233866 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 16 04:53:07.233874 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 16 04:53:07.233882 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 16 04:53:07.233891 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 16 04:53:07.233899 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 16 04:53:07.233907 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 16 04:53:07.233916 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 16 04:53:07.233924 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 16 04:53:07.233934 systemd[1]: Mounting media.mount - External Media Directory...
Sep 16 04:53:07.233942 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:07.233951 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 16 04:53:07.233959 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 16 04:53:07.233967 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 16 04:53:07.233976 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 16 04:53:07.233984 systemd[1]: Reached target machines.target - Containers.
Sep 16 04:53:07.233993 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 16 04:53:07.234002 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 16 04:53:07.234010 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 16 04:53:07.234019 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 16 04:53:07.234027 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 16 04:53:07.234036 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 16 04:53:07.234044 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 16 04:53:07.234053 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 16 04:53:07.234061 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 16 04:53:07.234070 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 16 04:53:07.234079 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 16 04:53:07.234087 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 16 04:53:07.234096 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 16 04:53:07.234104 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 16 04:53:07.234113 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 16 04:53:07.234121 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 16 04:53:07.234129 kernel: fuse: init (API version 7.41)
Sep 16 04:53:07.234138 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 16 04:53:07.234148 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 16 04:53:07.234156 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 16 04:53:07.234165 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 16 04:53:07.234173 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 16 04:53:07.234183 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 16 04:53:07.234192 systemd[1]: Stopped verity-setup.service.
Sep 16 04:53:07.234200 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:07.234222 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 16 04:53:07.234232 kernel: loop: module loaded
Sep 16 04:53:07.234240 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 16 04:53:07.234248 systemd[1]: Mounted media.mount - External Media Directory.
Sep 16 04:53:07.234258 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 16 04:53:07.234268 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 16 04:53:07.234276 kernel: ACPI: bus type drm_connector registered
Sep 16 04:53:07.234284 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 16 04:53:07.234292 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 16 04:53:07.234300 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 16 04:53:07.234309 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 16 04:53:07.234318 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 16 04:53:07.234327 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 16 04:53:07.234335 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 16 04:53:07.234343 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 16 04:53:07.234352 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 16 04:53:07.234375 systemd-journald[1206]: Collecting audit messages is disabled.
Sep 16 04:53:07.234396 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 16 04:53:07.234406 systemd-journald[1206]: Journal started
Sep 16 04:53:07.234424 systemd-journald[1206]: Runtime Journal (/run/log/journal/a9bb6e67bca64e5d8374efd4ced85b45) is 4.8M, max 38.6M, 33.7M free.
Sep 16 04:53:06.958974 systemd[1]: Queued start job for default target multi-user.target.
Sep 16 04:53:06.967135 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 16 04:53:06.967488 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 16 04:53:07.236714 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 16 04:53:07.237600 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 16 04:53:07.239187 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 16 04:53:07.239322 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 16 04:53:07.240068 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 16 04:53:07.240180 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 16 04:53:07.240938 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 16 04:53:07.241627 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 16 04:53:07.242328 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 16 04:53:07.243019 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 16 04:53:07.248943 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 16 04:53:07.251590 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 16 04:53:07.255874 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 16 04:53:07.256353 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 16 04:53:07.256376 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 16 04:53:07.258255 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 16 04:53:07.261592 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 16 04:53:07.262206 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 16 04:53:07.264317 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 16 04:53:07.271187 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 16 04:53:07.271769 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 16 04:53:07.272720 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 16 04:53:07.273585 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 16 04:53:07.278595 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 16 04:53:07.283696 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 16 04:53:07.294653 systemd-journald[1206]: Time spent on flushing to /var/log/journal/a9bb6e67bca64e5d8374efd4ced85b45 is 26.270ms for 1161 entries.
Sep 16 04:53:07.294653 systemd-journald[1206]: System Journal (/var/log/journal/a9bb6e67bca64e5d8374efd4ced85b45) is 8M, max 584.8M, 576.8M free.
Sep 16 04:53:07.331399 systemd-journald[1206]: Received client request to flush runtime journal.
Sep 16 04:53:07.331496 kernel: loop0: detected capacity change from 0 to 128016
Sep 16 04:53:07.291696 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 16 04:53:07.296780 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 16 04:53:07.299458 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 16 04:53:07.301191 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 16 04:53:07.302748 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 16 04:53:07.308055 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 16 04:53:07.311080 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 16 04:53:07.322591 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 16 04:53:07.332788 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 16 04:53:07.346825 systemd-tmpfiles[1248]: ACLs are not supported, ignoring.
Sep 16 04:53:07.346839 systemd-tmpfiles[1248]: ACLs are not supported, ignoring.
Sep 16 04:53:07.350453 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 16 04:53:07.362803 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 16 04:53:07.361878 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 16 04:53:07.373527 kernel: loop1: detected capacity change from 0 to 8
Sep 16 04:53:07.386337 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 16 04:53:07.391581 kernel: loop2: detected capacity change from 0 to 110984
Sep 16 04:53:07.405628 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 16 04:53:07.408818 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 16 04:53:07.427590 kernel: loop3: detected capacity change from 0 to 221472
Sep 16 04:53:07.430776 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Sep 16 04:53:07.430790 systemd-tmpfiles[1271]: ACLs are not supported, ignoring.
Sep 16 04:53:07.433524 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 16 04:53:07.467604 kernel: loop4: detected capacity change from 0 to 128016
Sep 16 04:53:07.483581 kernel: loop5: detected capacity change from 0 to 8
Sep 16 04:53:07.487537 kernel: loop6: detected capacity change from 0 to 110984
Sep 16 04:53:07.510547 kernel: loop7: detected capacity change from 0 to 221472
Sep 16 04:53:07.536404 (sd-merge)[1275]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 16 04:53:07.537988 (sd-merge)[1275]: Merged extensions into '/usr'.
Sep 16 04:53:07.544300 systemd[1]: Reload requested from client PID 1247 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 16 04:53:07.544651 systemd[1]: Reloading...
Sep 16 04:53:07.596533 zram_generator::config[1301]: No configuration found.
Sep 16 04:53:07.759391 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 16 04:53:07.759645 systemd[1]: Reloading finished in 214 ms.
Sep 16 04:53:07.779969 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 16 04:53:07.786255 systemd[1]: Starting ensure-sysext.service...
Sep 16 04:53:07.787613 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 16 04:53:07.805917 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 16 04:53:07.806176 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 16 04:53:07.806449 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 16 04:53:07.806748 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 16 04:53:07.807331 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 16 04:53:07.807634 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
Sep 16 04:53:07.807675 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
Sep 16 04:53:07.809744 systemd-tmpfiles[1344]: Detected autofs mount point /boot during canonicalization of boot.
Sep 16 04:53:07.809808 systemd-tmpfiles[1344]: Skipping /boot
Sep 16 04:53:07.811803 systemd[1]: Reload requested from client PID 1343 ('systemctl') (unit ensure-sysext.service)...
Sep 16 04:53:07.811819 systemd[1]: Reloading...
Sep 16 04:53:07.814909 systemd-tmpfiles[1344]: Detected autofs mount point /boot during canonicalization of boot.
Sep 16 04:53:07.814968 systemd-tmpfiles[1344]: Skipping /boot
Sep 16 04:53:07.830869 ldconfig[1242]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 16 04:53:07.862567 zram_generator::config[1372]: No configuration found.
Sep 16 04:53:08.005724 systemd[1]: Reloading finished in 193 ms.
Sep 16 04:53:08.029597 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 16 04:53:08.030321 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 16 04:53:08.034308 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 16 04:53:08.040611 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 16 04:53:08.042572 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 16 04:53:08.048239 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 16 04:53:08.053630 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 16 04:53:08.055656 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 16 04:53:08.059660 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 16 04:53:08.072152 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 16 04:53:08.076227 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:08.076384 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 16 04:53:08.078616 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 16 04:53:08.081753 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 16 04:53:08.086883 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 16 04:53:08.087579 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 16 04:53:08.087678 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 16 04:53:08.087768 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:08.089764 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 16 04:53:08.098644 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 16 04:53:08.103775 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 16 04:53:08.107353 systemd-udevd[1422]: Using default interface naming scheme 'v255'.
Sep 16 04:53:08.108318 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:08.108461 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 16 04:53:08.109104 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 16 04:53:08.109177 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 16 04:53:08.109263 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:08.113105 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:08.113667 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 16 04:53:08.119379 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 16 04:53:08.120034 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 16 04:53:08.120112 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 16 04:53:08.120228 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Sep 16 04:53:08.125279 systemd[1]: Finished ensure-sysext.service.
Sep 16 04:53:08.128781 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 16 04:53:08.131988 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 16 04:53:08.135711 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 16 04:53:08.136360 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 16 04:53:08.140752 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 16 04:53:08.141463 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 16 04:53:08.141615 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 16 04:53:08.153541 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 16 04:53:08.153755 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 16 04:53:08.154705 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 16 04:53:08.154827 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 16 04:53:08.155721 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 16 04:53:08.155773 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 16 04:53:08.164486 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 16 04:53:08.168334 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 16 04:53:08.170771 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 16 04:53:08.176987 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 16 04:53:08.177728 augenrules[1467]: No rules
Sep 16 04:53:08.179001 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 16 04:53:08.179531 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 16 04:53:08.254629 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Sep 16 04:53:08.289740 systemd-resolved[1421]: Positive Trust Anchors:
Sep 16 04:53:08.289752 systemd-resolved[1421]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 16 04:53:08.289775 systemd-resolved[1421]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 16 04:53:08.293368 systemd-resolved[1421]: Using system hostname 'ci-4459-0-0-n-06f2563e85'.
Sep 16 04:53:08.294839 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 16 04:53:08.295623 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 16 04:53:08.316437 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 16 04:53:08.317011 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 16 04:53:08.317488 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 16 04:53:08.318275 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 16 04:53:08.319150 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Sep 16 04:53:08.319628 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 16 04:53:08.320517 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 16 04:53:08.320547 systemd[1]: Reached target paths.target - Path Units.
Sep 16 04:53:08.321939 systemd[1]: Reached target time-set.target - System Time Set.
Sep 16 04:53:08.322783 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 16 04:53:08.323619 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 16 04:53:08.324260 systemd[1]: Reached target timers.target - Timer Units.
Sep 16 04:53:08.325909 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 16 04:53:08.328236 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 16 04:53:08.330410 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 16 04:53:08.330425 systemd-networkd[1468]: lo: Link UP
Sep 16 04:53:08.330428 systemd-networkd[1468]: lo: Gained carrier
Sep 16 04:53:08.332129 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 16 04:53:08.333462 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 16 04:53:08.334597 systemd-networkd[1468]: Enumeration completed
Sep 16 04:53:08.335020 systemd-timesyncd[1446]: No network connectivity, watching for changes.
Sep 16 04:53:08.335463 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 16 04:53:08.335675 systemd-networkd[1468]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 16 04:53:08.335679 systemd-networkd[1468]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 16 04:53:08.336109 systemd-networkd[1468]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 16 04:53:08.336113 systemd-networkd[1468]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 16 04:53:08.336605 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 16 04:53:08.336764 systemd-networkd[1468]: eth0: Link UP
Sep 16 04:53:08.336867 systemd-networkd[1468]: eth0: Gained carrier
Sep 16 04:53:08.336878 systemd-networkd[1468]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 16 04:53:08.338221 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 16 04:53:08.339930 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 16 04:53:08.341182 systemd[1]: Reached target network.target - Network.
Sep 16 04:53:08.341887 systemd-networkd[1468]: eth1: Link UP
Sep 16 04:53:08.342348 systemd-networkd[1468]: eth1: Gained carrier
Sep 16 04:53:08.342367 systemd-networkd[1468]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 16 04:53:08.342468 systemd[1]: Reached target sockets.target - Socket Units.
Sep 16 04:53:08.343299 systemd[1]: Reached target basic.target - Basic System.
Sep 16 04:53:08.344240 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 16 04:53:08.344261 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 16 04:53:08.345683 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 16 04:53:08.348763 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 16 04:53:08.352665 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 16 04:53:08.355895 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 16 04:53:08.363633 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 16 04:53:08.365370 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 16 04:53:08.366609 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 16 04:53:08.369651 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Sep 16 04:53:08.371553 jq[1523]: false
Sep 16 04:53:08.372008 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 16 04:53:08.374635 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 16 04:53:08.378616 systemd-networkd[1468]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 16 04:53:08.379706 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 16 04:53:08.390145 google_oslogin_nss_cache[1525]: oslogin_cache_refresh[1525]: Refreshing passwd entry cache
Sep 16 04:53:08.387103 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 16 04:53:08.386760 oslogin_cache_refresh[1525]: Refreshing passwd entry cache
Sep 16 04:53:08.388189 systemd-timesyncd[1446]: Network configuration changed, trying to establish connection.
Sep 16 04:53:08.391053 google_oslogin_nss_cache[1525]: oslogin_cache_refresh[1525]: Failure getting users, quitting
Sep 16 04:53:08.391050 oslogin_cache_refresh[1525]: Failure getting users, quitting
Sep 16 04:53:08.391112 google_oslogin_nss_cache[1525]: oslogin_cache_refresh[1525]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 16 04:53:08.391112 google_oslogin_nss_cache[1525]: oslogin_cache_refresh[1525]: Refreshing group entry cache
Sep 16 04:53:08.391062 oslogin_cache_refresh[1525]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Sep 16 04:53:08.391089 oslogin_cache_refresh[1525]: Refreshing group entry cache
Sep 16 04:53:08.391303 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 16 04:53:08.391409 oslogin_cache_refresh[1525]: Failure getting groups, quitting
Sep 16 04:53:08.391842 google_oslogin_nss_cache[1525]: oslogin_cache_refresh[1525]: Failure getting groups, quitting
Sep 16 04:53:08.391842 google_oslogin_nss_cache[1525]: oslogin_cache_refresh[1525]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 16 04:53:08.391415 oslogin_cache_refresh[1525]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Sep 16 04:53:08.393462 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 16 04:53:08.394575 systemd-networkd[1468]: eth0: DHCPv4 address 157.180.68.84/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 16 04:53:08.397608 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 16 04:53:08.400147 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 16 04:53:08.400555 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 16 04:53:08.404905 systemd[1]: Starting update-engine.service - Update Engine... Sep 16 04:53:08.411904 extend-filesystems[1524]: Found /dev/sda6 Sep 16 04:53:08.414560 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 16 04:53:08.418601 extend-filesystems[1524]: Found /dev/sda9 Sep 16 04:53:08.419095 coreos-metadata[1518]: Sep 16 04:53:08.417 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 16 04:53:08.419337 extend-filesystems[1524]: Checking size of /dev/sda9 Sep 16 04:53:08.421428 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 16 04:53:08.426858 coreos-metadata[1518]: Sep 16 04:53:08.419 INFO Fetch successful Sep 16 04:53:08.426858 coreos-metadata[1518]: Sep 16 04:53:08.419 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 16 04:53:08.426858 coreos-metadata[1518]: Sep 16 04:53:08.419 INFO Fetch successful Sep 16 04:53:08.435535 extend-filesystems[1524]: Resized partition /dev/sda9 Sep 16 04:53:08.431800 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 16 04:53:08.437366 extend-filesystems[1550]: resize2fs 1.47.3 (8-Jul-2025) Sep 16 04:53:08.446577 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 16 04:53:08.431966 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 16 04:53:08.432174 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Sep 16 04:53:08.433567 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Sep 16 04:53:08.438098 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
Sep 16 04:53:08.440303 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 16 04:53:08.461571 update_engine[1534]: I20250916 04:53:08.455173 1534 main.cc:92] Flatcar Update Engine starting Sep 16 04:53:08.474331 jq[1540]: true Sep 16 04:53:08.500235 tar[1551]: linux-amd64/helm Sep 16 04:53:08.511655 systemd[1]: motdgen.service: Deactivated successfully. Sep 16 04:53:08.511908 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 16 04:53:08.518551 (ntainerd)[1564]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 16 04:53:08.556610 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Sep 16 04:53:08.554166 dbus-daemon[1519]: [system] SELinux support is enabled Sep 16 04:53:08.520066 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 16 04:53:08.554336 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 16 04:53:08.557022 jq[1568]: true Sep 16 04:53:08.559761 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 16 04:53:08.559788 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 16 04:53:08.560419 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 16 04:53:08.560434 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 16 04:53:08.577764 systemd[1]: Started update-engine.service - Update Engine. 
Sep 16 04:53:08.579615 update_engine[1534]: I20250916 04:53:08.578527 1534 update_check_scheduler.cc:74] Next update check in 3m44s Sep 16 04:53:08.583280 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 16 04:53:08.595758 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 16 04:53:08.596422 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 16 04:53:08.615246 kernel: mousedev: PS/2 mouse device common for all mice Sep 16 04:53:08.628567 kernel: ACPI: button: Power Button [PWRF] Sep 16 04:53:09.302132 systemd-resolved[1421]: Clock change detected. Flushing caches. Sep 16 04:53:09.304002 systemd-timesyncd[1446]: Contacted time server 141.144.230.32:123 (0.flatcar.pool.ntp.org). Sep 16 04:53:09.304048 systemd-timesyncd[1446]: Initial clock synchronization to Tue 2025-09-16 04:53:09.302070 UTC. Sep 16 04:53:09.305331 bash[1594]: Updated "/home/core/.ssh/authorized_keys" Sep 16 04:53:09.310931 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 16 04:53:09.306968 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 16 04:53:09.313917 systemd[1]: Starting sshkeys.service... Sep 16 04:53:09.326519 systemd-logind[1530]: New seat seat0. Sep 16 04:53:09.328573 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 16 04:53:09.329485 extend-filesystems[1550]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 16 04:53:09.329485 extend-filesystems[1550]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 16 04:53:09.329485 extend-filesystems[1550]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 16 04:53:09.335562 extend-filesystems[1524]: Resized filesystem in /dev/sda9 Sep 16 04:53:09.331698 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Sep 16 04:53:09.336773 sshd_keygen[1569]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 16 04:53:09.334013 systemd[1]: Started systemd-logind.service - User Login Management. Sep 16 04:53:09.336196 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 16 04:53:09.336341 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 16 04:53:09.358123 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 16 04:53:09.367970 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 16 04:53:09.402116 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 16 04:53:09.412982 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 16 04:53:09.416146 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 16 04:53:09.421105 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Sep 16 04:53:09.421147 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Sep 16 04:53:09.423397 kernel: Console: switching to colour dummy device 80x25 Sep 16 04:53:09.425432 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 16 04:53:09.425470 kernel: [drm] features: -context_init Sep 16 04:53:09.444649 kernel: [drm] number of scanouts: 1 Sep 16 04:53:09.444697 kernel: [drm] number of cap sets: 0 Sep 16 04:53:09.444708 containerd[1564]: time="2025-09-16T04:53:09Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 16 04:53:09.444708 containerd[1564]: time="2025-09-16T04:53:09.444394228Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 16 04:53:09.449137 coreos-metadata[1610]: Sep 16 04:53:09.448 INFO Fetching 
http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 16 04:53:09.449841 coreos-metadata[1610]: Sep 16 04:53:09.449 INFO Fetch successful Sep 16 04:53:09.451700 unknown[1610]: wrote ssh authorized keys file for user: core Sep 16 04:53:09.453921 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Sep 16 04:53:09.459939 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device Sep 16 04:53:09.459992 kernel: Console: switching to colour frame buffer device 160x50 Sep 16 04:53:09.463712 locksmithd[1579]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 16 04:53:09.466942 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 16 04:53:09.465142 systemd[1]: issuegen.service: Deactivated successfully. Sep 16 04:53:09.465331 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 16 04:53:09.470112 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 16 04:53:09.476008 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 16 04:53:09.481953 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Sep 16 04:53:09.496240 containerd[1564]: time="2025-09-16T04:53:09.496065116Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.158µs" Sep 16 04:53:09.496240 containerd[1564]: time="2025-09-16T04:53:09.496093920Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 16 04:53:09.496240 containerd[1564]: time="2025-09-16T04:53:09.496112204Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 16 04:53:09.497391 containerd[1564]: time="2025-09-16T04:53:09.497314850Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 16 04:53:09.497391 containerd[1564]: time="2025-09-16T04:53:09.497341800Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 16 04:53:09.497391 containerd[1564]: time="2025-09-16T04:53:09.497366256Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:53:09.497726 containerd[1564]: time="2025-09-16T04:53:09.497616305Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:53:09.497726 containerd[1564]: time="2025-09-16T04:53:09.497631904Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:53:09.498326 containerd[1564]: time="2025-09-16T04:53:09.498307271Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:53:09.499055 containerd[1564]: time="2025-09-16T04:53:09.498937552Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:53:09.499055 containerd[1564]: time="2025-09-16T04:53:09.498959814Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:53:09.499055 containerd[1564]: time="2025-09-16T04:53:09.498967789Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 16 04:53:09.499276 containerd[1564]: time="2025-09-16T04:53:09.499261490Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 16 04:53:09.500080 containerd[1564]: time="2025-09-16T04:53:09.499479990Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:53:09.500080 containerd[1564]: time="2025-09-16T04:53:09.499508983Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:53:09.500080 containerd[1564]: time="2025-09-16T04:53:09.499518652Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 16 04:53:09.500370 containerd[1564]: time="2025-09-16T04:53:09.500355191Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 16 04:53:09.503593 update-ssh-keys[1627]: Updated "/home/core/.ssh/authorized_keys" Sep 16 04:53:09.502165 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 16 04:53:09.504339 systemd[1]: Finished sshkeys.service. 
Sep 16 04:53:09.506623 containerd[1564]: time="2025-09-16T04:53:09.504156819Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 16 04:53:09.506623 containerd[1564]: time="2025-09-16T04:53:09.506104110Z" level=info msg="metadata content store policy set" policy=shared Sep 16 04:53:09.513949 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 16 04:53:09.520290 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 16 04:53:09.525117 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Sep 16 04:53:09.527011 systemd[1]: Reached target getty.target - Login Prompts. Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532268752Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532330718Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532344193Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532354292Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532367487Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532376173Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532385521Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: 
time="2025-09-16T04:53:09.532396061Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532405578Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532413202Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532420246Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532429924Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532533538Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 16 04:53:09.533181 containerd[1564]: time="2025-09-16T04:53:09.532551522Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532562723Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532572261Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532597258Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532606104Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532614520Z" level=info 
msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532622946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532632403Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532644917Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532652842Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532709178Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532721441Z" level=info msg="Start snapshots syncer" Sep 16 04:53:09.533449 containerd[1564]: time="2025-09-16T04:53:09.532739064Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 16 04:53:09.533632 containerd[1564]: time="2025-09-16T04:53:09.533043103Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 16 04:53:09.533632 containerd[1564]: time="2025-09-16T04:53:09.533101563Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 16 04:53:09.533724 containerd[1564]: time="2025-09-16T04:53:09.533157839Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 16 04:53:09.533908 containerd[1564]: time="2025-09-16T04:53:09.533846921Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 16 04:53:09.533908 containerd[1564]: time="2025-09-16T04:53:09.533872950Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 16 04:53:09.533908 containerd[1564]: time="2025-09-16T04:53:09.533884171Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 16 04:53:09.534484 containerd[1564]: time="2025-09-16T04:53:09.534007172Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 16 04:53:09.534484 containerd[1564]: time="2025-09-16T04:53:09.534028752Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 16 04:53:09.534484 containerd[1564]: time="2025-09-16T04:53:09.534038350Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 16 04:53:09.534484 containerd[1564]: time="2025-09-16T04:53:09.534047266Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 16 04:53:09.534484 containerd[1564]: time="2025-09-16T04:53:09.534068967Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 16 04:53:09.534484 containerd[1564]: time="2025-09-16T04:53:09.534426167Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 16 04:53:09.534484 containerd[1564]: time="2025-09-16T04:53:09.534439733Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 16 04:53:09.534801 containerd[1564]: time="2025-09-16T04:53:09.534640609Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:53:09.534801 containerd[1564]: time="2025-09-16T04:53:09.534660486Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:53:09.534801 containerd[1564]: time="2025-09-16T04:53:09.534669022Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:53:09.534801 containerd[1564]: time="2025-09-16T04:53:09.534676727Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:53:09.535417 containerd[1564]: time="2025-09-16T04:53:09.534682688Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 16 04:53:09.535417 containerd[1564]: time="2025-09-16T04:53:09.535319482Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 16 04:53:09.535417 containerd[1564]: time="2025-09-16T04:53:09.535347544Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 16 04:53:09.535417 containerd[1564]: time="2025-09-16T04:53:09.535362072Z" level=info msg="runtime interface created" Sep 16 04:53:09.535417 containerd[1564]: time="2025-09-16T04:53:09.535366150Z" level=info msg="created NRI interface" Sep 16 04:53:09.535417 containerd[1564]: time="2025-09-16T04:53:09.535373233Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 16 04:53:09.535417 containerd[1564]: time="2025-09-16T04:53:09.535382270Z" level=info msg="Connect containerd service" Sep 16 04:53:09.535991 containerd[1564]: time="2025-09-16T04:53:09.535682944Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 16 04:53:09.541784 
containerd[1564]: time="2025-09-16T04:53:09.541763936Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 04:53:09.571288 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Sep 16 04:53:09.571516 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Sep 16 04:53:09.579983 kernel: EDAC MC: Ver: 3.0.0 Sep 16 04:53:09.652623 systemd-logind[1530]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653133065Z" level=info msg="Start subscribing containerd event" Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653172369Z" level=info msg="Start recovering state" Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653240035Z" level=info msg="Start event monitor" Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653251426Z" level=info msg="Start cni network conf syncer for default" Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653257298Z" level=info msg="Start streaming server" Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653264891Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653270733Z" level=info msg="runtime interface starting up..." Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653275001Z" level=info msg="starting plugins..." Sep 16 04:53:09.653448 containerd[1564]: time="2025-09-16T04:53:09.653285300Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 16 04:53:09.654905 containerd[1564]: time="2025-09-16T04:53:09.654687419Z" level=info msg=serving... 
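The containerd error above ("no network config found in /etc/cni/net.d") is expected on first boot: the CRI plugin defers networking until a CNI config exists. A minimal bridge conflist of the kind that would satisfy it might look like the following (network name, bridge name, and subnet are illustrative assumptions; in a Kubernetes cluster this file is normally dropped in by the CNI add-on, not written by hand):

```json
{
  "cniVersion": "1.0.0",
  "name": "mynet",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "ranges": [[{ "subnet": "10.85.0.0/16" }]],
        "routes": [{ "dst": "0.0.0.0/0" }]
      }
    },
    { "type": "portmap", "capabilities": { "portBindings": true } }
  ]
}
```

Placed as e.g. /etc/cni/net.d/10-mynet.conflist, the cni network conf syncer (started later in this log) would pick it up without a containerd restart.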
address=/run/containerd/containerd.sock.ttrpc Sep 16 04:53:09.654905 containerd[1564]: time="2025-09-16T04:53:09.654751409Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 16 04:53:09.655174 containerd[1564]: time="2025-09-16T04:53:09.655131522Z" level=info msg="containerd successfully booted in 0.211606s" Sep 16 04:53:09.655252 systemd[1]: Started containerd.service - containerd container runtime. Sep 16 04:53:09.664219 systemd-logind[1530]: Watching system buttons on /dev/input/event3 (Power Button) Sep 16 04:53:09.687307 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:53:09.701140 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:53:09.701289 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:53:09.705046 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:53:09.725005 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:53:09.725163 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:53:09.728017 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:53:09.731867 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:53:09.809485 tar[1551]: linux-amd64/LICENSE Sep 16 04:53:09.809565 tar[1551]: linux-amd64/README.md Sep 16 04:53:09.822415 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 16 04:53:09.824494 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:53:10.421072 systemd-networkd[1468]: eth1: Gained IPv6LL Sep 16 04:53:10.423783 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 16 04:53:10.425075 systemd[1]: Reached target network-online.target - Network is Online. 
Sep 16 04:53:10.428504 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:53:10.432138 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 16 04:53:10.453025 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 16 04:53:10.997065 systemd-networkd[1468]: eth0: Gained IPv6LL Sep 16 04:53:11.341697 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:53:11.342499 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 16 04:53:11.343977 systemd[1]: Startup finished in 2.928s (kernel) + 6.864s (initrd) + 4.243s (userspace) = 14.037s. Sep 16 04:53:11.345457 (kubelet)[1703]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:53:11.937819 kubelet[1703]: E0916 04:53:11.937759 1703 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:53:11.940045 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:53:11.940186 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:53:11.940463 systemd[1]: kubelet.service: Consumed 921ms CPU time, 266.6M memory peak. Sep 16 04:53:16.161815 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 16 04:53:16.163190 systemd[1]: Started sshd@0-157.180.68.84:22-139.178.89.65:45010.service - OpenSSH per-connection server daemon (139.178.89.65:45010). 
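The kubelet exit above is the usual pre-bootstrap failure: /var/lib/kubelet/config.yaml does not exist yet, because that file is generated by kubeadm during init/join. For reference, a minimal KubeletConfiguration of the kind kubeadm writes might look like this (field values here are illustrative, not what kubeadm would emit on this host):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
authentication:
  anonymous:
    enabled: false
cgroupDriver: systemd
```

Until the node is bootstrapped, systemd simply keeps restarting the unit, which is why the same error recurs below with the restart counter incremented.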
Sep 16 04:53:17.158748 sshd[1715]: Accepted publickey for core from 139.178.89.65 port 45010 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk Sep 16 04:53:17.160269 sshd-session[1715]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:53:17.168126 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 16 04:53:17.169949 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 16 04:53:17.182710 systemd-logind[1530]: New session 1 of user core. Sep 16 04:53:17.194444 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 16 04:53:17.199298 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 16 04:53:17.214426 (systemd)[1720]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 16 04:53:17.218219 systemd-logind[1530]: New session c1 of user core. Sep 16 04:53:17.397082 systemd[1720]: Queued start job for default target default.target. Sep 16 04:53:17.408656 systemd[1720]: Created slice app.slice - User Application Slice. Sep 16 04:53:17.408688 systemd[1720]: Reached target paths.target - Paths. Sep 16 04:53:17.408723 systemd[1720]: Reached target timers.target - Timers. Sep 16 04:53:17.409785 systemd[1720]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 16 04:53:17.418956 systemd[1720]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 16 04:53:17.418994 systemd[1720]: Reached target sockets.target - Sockets. Sep 16 04:53:17.419029 systemd[1720]: Reached target basic.target - Basic System. Sep 16 04:53:17.419055 systemd[1720]: Reached target default.target - Main User Target. Sep 16 04:53:17.419074 systemd[1720]: Startup finished in 192ms. Sep 16 04:53:17.419153 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 16 04:53:17.421432 systemd[1]: Started session-1.scope - Session 1 of User core. 
Sep 16 04:53:18.151810 systemd[1]: Started sshd@1-157.180.68.84:22-139.178.89.65:45024.service - OpenSSH per-connection server daemon (139.178.89.65:45024).
Sep 16 04:53:19.258087 sshd[1731]: Accepted publickey for core from 139.178.89.65 port 45024 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:53:19.260164 sshd-session[1731]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:53:19.268106 systemd-logind[1530]: New session 2 of user core.
Sep 16 04:53:19.272139 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 16 04:53:20.004698 sshd[1734]: Connection closed by 139.178.89.65 port 45024
Sep 16 04:53:20.005523 sshd-session[1731]: pam_unix(sshd:session): session closed for user core
Sep 16 04:53:20.011384 systemd-logind[1530]: Session 2 logged out. Waiting for processes to exit.
Sep 16 04:53:20.011504 systemd[1]: sshd@1-157.180.68.84:22-139.178.89.65:45024.service: Deactivated successfully.
Sep 16 04:53:20.014238 systemd[1]: session-2.scope: Deactivated successfully.
Sep 16 04:53:20.016649 systemd-logind[1530]: Removed session 2.
Sep 16 04:53:20.161178 systemd[1]: Started sshd@2-157.180.68.84:22-139.178.89.65:38474.service - OpenSSH per-connection server daemon (139.178.89.65:38474).
Sep 16 04:53:21.164579 sshd[1740]: Accepted publickey for core from 139.178.89.65 port 38474 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:53:21.166407 sshd-session[1740]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:53:21.172490 systemd-logind[1530]: New session 3 of user core.
Sep 16 04:53:21.181140 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 16 04:53:21.834812 sshd[1743]: Connection closed by 139.178.89.65 port 38474
Sep 16 04:53:21.835504 sshd-session[1740]: pam_unix(sshd:session): session closed for user core
Sep 16 04:53:21.840190 systemd[1]: sshd@2-157.180.68.84:22-139.178.89.65:38474.service: Deactivated successfully.
Sep 16 04:53:21.840214 systemd-logind[1530]: Session 3 logged out. Waiting for processes to exit.
Sep 16 04:53:21.842524 systemd[1]: session-3.scope: Deactivated successfully.
Sep 16 04:53:21.844616 systemd-logind[1530]: Removed session 3.
Sep 16 04:53:22.014138 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 16 04:53:22.015739 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 16 04:53:22.018103 systemd[1]: Started sshd@3-157.180.68.84:22-139.178.89.65:38476.service - OpenSSH per-connection server daemon (139.178.89.65:38476).
Sep 16 04:53:22.212882 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 16 04:53:22.223215 (kubelet)[1760]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 16 04:53:22.275148 kubelet[1760]: E0916 04:53:22.275070 1760 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 16 04:53:22.280347 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 16 04:53:22.280522 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 16 04:53:22.281127 systemd[1]: kubelet.service: Consumed 163ms CPU time, 108.8M memory peak.
Sep 16 04:53:23.008216 sshd[1750]: Accepted publickey for core from 139.178.89.65 port 38476 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:53:23.009594 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:53:23.014124 systemd-logind[1530]: New session 4 of user core.
Sep 16 04:53:23.021068 systemd[1]: Started session-4.scope - Session 4 of User core.
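Editor's note: the kubelet exit above is the stock "config file not found" failure — /var/lib/kubelet/config.yaml does not exist until `kubeadm init`/`kubeadm join` writes it, so systemd keeps scheduling restarts until it appears. A minimal sketch of the same startup precondition (the path comes from the log; the helper name is ours, not kubelet's):

```python
from pathlib import Path

# Path taken from the log entry above; the kubelet fails fast when it is absent.
KUBELET_CONFIG = "/var/lib/kubelet/config.yaml"

def kubelet_config_present(path: str = KUBELET_CONFIG) -> bool:
    """Mirror the kubelet's startup precondition: the config file must exist."""
    return Path(path).is_file()
```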
Sep 16 04:53:23.683502 sshd[1767]: Connection closed by 139.178.89.65 port 38476
Sep 16 04:53:23.684067 sshd-session[1750]: pam_unix(sshd:session): session closed for user core
Sep 16 04:53:23.687350 systemd-logind[1530]: Session 4 logged out. Waiting for processes to exit.
Sep 16 04:53:23.688037 systemd[1]: sshd@3-157.180.68.84:22-139.178.89.65:38476.service: Deactivated successfully.
Sep 16 04:53:23.689650 systemd[1]: session-4.scope: Deactivated successfully.
Sep 16 04:53:23.690874 systemd-logind[1530]: Removed session 4.
Sep 16 04:53:23.885598 systemd[1]: Started sshd@4-157.180.68.84:22-139.178.89.65:38482.service - OpenSSH per-connection server daemon (139.178.89.65:38482).
Sep 16 04:53:24.984342 sshd[1773]: Accepted publickey for core from 139.178.89.65 port 38482 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:53:24.985848 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:53:24.990963 systemd-logind[1530]: New session 5 of user core.
Sep 16 04:53:24.996014 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 16 04:53:25.560275 sudo[1777]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 16 04:53:25.560603 sudo[1777]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 16 04:53:25.577622 sudo[1777]: pam_unix(sudo:session): session closed for user root
Sep 16 04:53:25.752454 sshd[1776]: Connection closed by 139.178.89.65 port 38482
Sep 16 04:53:25.753121 sshd-session[1773]: pam_unix(sshd:session): session closed for user core
Sep 16 04:53:25.756156 systemd[1]: sshd@4-157.180.68.84:22-139.178.89.65:38482.service: Deactivated successfully.
Sep 16 04:53:25.757626 systemd[1]: session-5.scope: Deactivated successfully.
Sep 16 04:53:25.759450 systemd-logind[1530]: Session 5 logged out. Waiting for processes to exit.
Sep 16 04:53:25.760527 systemd-logind[1530]: Removed session 5.
Sep 16 04:53:25.906543 systemd[1]: Started sshd@5-157.180.68.84:22-139.178.89.65:38490.service - OpenSSH per-connection server daemon (139.178.89.65:38490).
Sep 16 04:53:26.894218 sshd[1783]: Accepted publickey for core from 139.178.89.65 port 38490 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:53:26.895677 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:53:26.900213 systemd-logind[1530]: New session 6 of user core.
Sep 16 04:53:26.913123 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 16 04:53:27.410072 sudo[1788]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 16 04:53:27.410341 sudo[1788]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 16 04:53:27.414828 sudo[1788]: pam_unix(sudo:session): session closed for user root
Sep 16 04:53:27.420229 sudo[1787]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 16 04:53:27.420599 sudo[1787]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 16 04:53:27.431508 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 16 04:53:27.464023 augenrules[1810]: No rules
Sep 16 04:53:27.465179 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 16 04:53:27.465405 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 16 04:53:27.466515 sudo[1787]: pam_unix(sudo:session): session closed for user root
Sep 16 04:53:27.623666 sshd[1786]: Connection closed by 139.178.89.65 port 38490
Sep 16 04:53:27.624187 sshd-session[1783]: pam_unix(sshd:session): session closed for user core
Sep 16 04:53:27.628796 systemd[1]: sshd@5-157.180.68.84:22-139.178.89.65:38490.service: Deactivated successfully.
Sep 16 04:53:27.630643 systemd[1]: session-6.scope: Deactivated successfully.
Sep 16 04:53:27.632957 systemd-logind[1530]: Session 6 logged out. Waiting for processes to exit.
Sep 16 04:53:27.635336 systemd[1]: Started sshd@6-157.180.68.84:22-45.64.112.117:34036.service - OpenSSH per-connection server daemon (45.64.112.117:34036).
Sep 16 04:53:27.637028 systemd-logind[1530]: Removed session 6.
Sep 16 04:53:27.790414 systemd[1]: Started sshd@7-157.180.68.84:22-139.178.89.65:38502.service - OpenSSH per-connection server daemon (139.178.89.65:38502).
Sep 16 04:53:28.768405 sshd[1823]: Accepted publickey for core from 139.178.89.65 port 38502 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:53:28.770381 sshd-session[1823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:53:28.777126 systemd-logind[1530]: New session 7 of user core.
Sep 16 04:53:28.784089 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 16 04:53:29.003485 sshd[1819]: Received disconnect from 45.64.112.117 port 34036:11: Bye Bye [preauth]
Sep 16 04:53:29.003485 sshd[1819]: Disconnected from authenticating user root 45.64.112.117 port 34036 [preauth]
Sep 16 04:53:29.005419 systemd[1]: sshd@6-157.180.68.84:22-45.64.112.117:34036.service: Deactivated successfully.
Sep 16 04:53:29.287090 sudo[1829]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 16 04:53:29.287465 sudo[1829]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 16 04:53:29.669760 systemd[1]: Starting docker.service - Docker Application Container Engine...
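Editor's note: the sshd@6 connection from 45.64.112.117 drops before authentication ("[preauth]"), the usual signature of a scanner probing root logins; the same pattern recurs later from other addresses. An illustrative parser (ours, not part of any tool in this log) for flagging such entries:

```python
import re

# Matches sshd preauth disconnects such as:
#   sshd[1819]: Disconnected from authenticating user root 45.64.112.117 port 34036 [preauth]
PREAUTH = re.compile(
    r"Disconnected from authenticating user (\S+) (\S+) port (\d+) \[preauth\]"
)

def scan_attempts(lines):
    """Yield (user, ip, port) for each preauth disconnect in an iterable of log lines."""
    for line in lines:
        m = PREAUTH.search(line)
        if m:
            yield m.group(1), m.group(2), int(m.group(3))
```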
Sep 16 04:53:29.687408 (dockerd)[1847]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 16 04:53:29.943850 dockerd[1847]: time="2025-09-16T04:53:29.943678213Z" level=info msg="Starting up"
Sep 16 04:53:29.945243 dockerd[1847]: time="2025-09-16T04:53:29.944937494Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 16 04:53:29.956790 dockerd[1847]: time="2025-09-16T04:53:29.956717452Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 16 04:53:29.998615 systemd[1]: var-lib-docker-metacopy\x2dcheck998003450-merged.mount: Deactivated successfully.
Sep 16 04:53:30.022924 dockerd[1847]: time="2025-09-16T04:53:30.022826314Z" level=info msg="Loading containers: start."
Sep 16 04:53:30.032972 kernel: Initializing XFRM netlink socket
Sep 16 04:53:30.356199 systemd-networkd[1468]: docker0: Link UP
Sep 16 04:53:30.360959 dockerd[1847]: time="2025-09-16T04:53:30.360857647Z" level=info msg="Loading containers: done."
Sep 16 04:53:30.377628 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1172383716-merged.mount: Deactivated successfully.
Sep 16 04:53:30.379969 dockerd[1847]: time="2025-09-16T04:53:30.379852564Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 16 04:53:30.380085 dockerd[1847]: time="2025-09-16T04:53:30.379984511Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 16 04:53:30.380085 dockerd[1847]: time="2025-09-16T04:53:30.380077877Z" level=info msg="Initializing buildkit"
Sep 16 04:53:30.408080 dockerd[1847]: time="2025-09-16T04:53:30.408007237Z" level=info msg="Completed buildkit initialization"
Sep 16 04:53:30.416587 dockerd[1847]: time="2025-09-16T04:53:30.416530920Z" level=info msg="Daemon has completed initialization"
Sep 16 04:53:30.416796 dockerd[1847]: time="2025-09-16T04:53:30.416697983Z" level=info msg="API listen on /run/docker.sock"
Sep 16 04:53:30.417007 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 16 04:53:31.577936 containerd[1564]: time="2025-09-16T04:53:31.577866092Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 16 04:53:32.225528 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3516812216.mount: Deactivated successfully.
Sep 16 04:53:32.530930 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 16 04:53:32.532694 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 16 04:53:32.636161 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 16 04:53:32.643356 (kubelet)[2101]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 16 04:53:32.682524 kubelet[2101]: E0916 04:53:32.682474 2101 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 16 04:53:32.684784 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 16 04:53:32.684998 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 16 04:53:32.685656 systemd[1]: kubelet.service: Consumed 115ms CPU time, 110.1M memory peak.
Sep 16 04:53:33.175975 containerd[1564]: time="2025-09-16T04:53:33.175883858Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:33.179355 containerd[1564]: time="2025-09-16T04:53:33.179310823Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=28117224"
Sep 16 04:53:33.180437 containerd[1564]: time="2025-09-16T04:53:33.180388695Z" level=info msg="ImageCreate event name:\"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:33.188605 containerd[1564]: time="2025-09-16T04:53:33.188544798Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:33.189949 containerd[1564]: time="2025-09-16T04:53:33.189452941Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"28113723\" in 1.61152888s"
Sep 16 04:53:33.189949 containerd[1564]: time="2025-09-16T04:53:33.189509357Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:368da3301bb03f4bef9f7dc2084f5fc5954b0ac1bf1e49ca502e3a7604011e54\""
Sep 16 04:53:33.190555 containerd[1564]: time="2025-09-16T04:53:33.190515934Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 16 04:53:34.425579 containerd[1564]: time="2025-09-16T04:53:34.425531835Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:34.426680 containerd[1564]: time="2025-09-16T04:53:34.426417786Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=24716654"
Sep 16 04:53:34.427452 containerd[1564]: time="2025-09-16T04:53:34.427423051Z" level=info msg="ImageCreate event name:\"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:34.430887 containerd[1564]: time="2025-09-16T04:53:34.430853423Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:34.431845 containerd[1564]: time="2025-09-16T04:53:34.431812051Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"26351311\" in 1.241264708s"
Sep 16 04:53:34.432064 containerd[1564]: time="2025-09-16T04:53:34.431934911Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:cbd19105c6bcbedf394f51c8bb963def5195c300fc7d04bc39d48d14d23c0ff0\""
Sep 16 04:53:34.432468 containerd[1564]: time="2025-09-16T04:53:34.432422185Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 16 04:53:35.513033 containerd[1564]: time="2025-09-16T04:53:35.512929496Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:35.517463 containerd[1564]: time="2025-09-16T04:53:35.517411390Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=18787720"
Sep 16 04:53:35.519434 containerd[1564]: time="2025-09-16T04:53:35.519378368Z" level=info msg="ImageCreate event name:\"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:35.524101 containerd[1564]: time="2025-09-16T04:53:35.523962564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:35.524994 containerd[1564]: time="2025-09-16T04:53:35.524809613Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"20422395\" in 1.092358203s"
Sep 16 04:53:35.524994 containerd[1564]: time="2025-09-16T04:53:35.524844618Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:d019d989e2b1f0b08ea7eebd4dd7673bdd6ba2218a3c5a6bd53f6848d5fc1af6\""
Sep 16 04:53:35.525375 containerd[1564]: time="2025-09-16T04:53:35.525315772Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 16 04:53:36.603843 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount83845225.mount: Deactivated successfully.
Sep 16 04:53:36.895713 containerd[1564]: time="2025-09-16T04:53:36.895595376Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:36.896749 containerd[1564]: time="2025-09-16T04:53:36.896708833Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=30410280"
Sep 16 04:53:36.898147 containerd[1564]: time="2025-09-16T04:53:36.898110732Z" level=info msg="ImageCreate event name:\"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:36.899547 containerd[1564]: time="2025-09-16T04:53:36.899508933Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:36.900033 containerd[1564]: time="2025-09-16T04:53:36.899839894Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"30409271\" in 1.374478136s"
Sep 16 04:53:36.900033 containerd[1564]: time="2025-09-16T04:53:36.899861074Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:21d97a49eeb0b08ecaba421a84a79ca44cf2bc57773c085bbfda537488790ad7\""
Sep 16 04:53:36.900496 containerd[1564]: time="2025-09-16T04:53:36.900469655Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 16 04:53:37.393386 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2811879052.mount: Deactivated successfully.
Sep 16 04:53:38.156846 containerd[1564]: time="2025-09-16T04:53:38.156779695Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:38.158663 containerd[1564]: time="2025-09-16T04:53:38.158421383Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335"
Sep 16 04:53:38.160914 containerd[1564]: time="2025-09-16T04:53:38.160877378Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:38.163332 containerd[1564]: time="2025-09-16T04:53:38.163303528Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:38.164147 containerd[1564]: time="2025-09-16T04:53:38.164123685Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.263629493s"
Sep 16 04:53:38.164201 containerd[1564]: time="2025-09-16T04:53:38.164150425Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Sep 16 04:53:38.165062 containerd[1564]: time="2025-09-16T04:53:38.164916261Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 16 04:53:38.614094 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount232403318.mount: Deactivated successfully.
Sep 16 04:53:38.619635 containerd[1564]: time="2025-09-16T04:53:38.619584671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 16 04:53:38.620420 containerd[1564]: time="2025-09-16T04:53:38.620294713Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Sep 16 04:53:38.621242 containerd[1564]: time="2025-09-16T04:53:38.621216551Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 16 04:53:38.623666 containerd[1564]: time="2025-09-16T04:53:38.623172950Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 16 04:53:38.623666 containerd[1564]: time="2025-09-16T04:53:38.623562500Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 458.624197ms"
Sep 16 04:53:38.623666 containerd[1564]: time="2025-09-16T04:53:38.623586986Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Sep 16 04:53:38.624359 containerd[1564]: time="2025-09-16T04:53:38.624340098Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 16 04:53:39.099477 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1812033718.mount: Deactivated successfully.
Sep 16 04:53:40.445939 containerd[1564]: time="2025-09-16T04:53:40.445874591Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:40.447875 containerd[1564]: time="2025-09-16T04:53:40.447625584Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56910785"
Sep 16 04:53:40.450495 containerd[1564]: time="2025-09-16T04:53:40.450447866Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:40.454713 containerd[1564]: time="2025-09-16T04:53:40.454662308Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:40.455467 containerd[1564]: time="2025-09-16T04:53:40.455439997Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 1.831074291s"
Sep 16 04:53:40.455537 containerd[1564]: time="2025-09-16T04:53:40.455524375Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Sep 16 04:53:40.734061 systemd[1]: Started sshd@8-157.180.68.84:22-5.195.226.17:37454.service - OpenSSH per-connection server daemon (5.195.226.17:37454).
Sep 16 04:53:41.407089 systemd[1]: Started sshd@9-157.180.68.84:22-14.103.90.30:14178.service - OpenSSH per-connection server daemon (14.103.90.30:14178).
Sep 16 04:53:41.731426 sshd[2266]: Received disconnect from 5.195.226.17 port 37454:11: Bye Bye [preauth]
Sep 16 04:53:41.731426 sshd[2266]: Disconnected from authenticating user root 5.195.226.17 port 37454 [preauth]
Sep 16 04:53:41.733684 systemd[1]: sshd@8-157.180.68.84:22-5.195.226.17:37454.service: Deactivated successfully.
Sep 16 04:53:42.606111 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 16 04:53:42.606294 systemd[1]: kubelet.service: Consumed 115ms CPU time, 110.1M memory peak.
Sep 16 04:53:42.619043 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 16 04:53:42.632956 systemd[1]: Reload requested from client PID 2293 ('systemctl') (unit session-7.scope)...
Sep 16 04:53:42.632966 systemd[1]: Reloading...
Sep 16 04:53:42.700928 zram_generator::config[2338]: No configuration found.
Sep 16 04:53:42.768070 sshd[2283]: Received disconnect from 14.103.90.30 port 14178:11: Bye Bye [preauth]
Sep 16 04:53:42.768070 sshd[2283]: Disconnected from authenticating user root 14.103.90.30 port 14178 [preauth]
Sep 16 04:53:42.875496 systemd[1]: Reloading finished in 242 ms.
Sep 16 04:53:42.885740 systemd[1]: sshd@9-157.180.68.84:22-14.103.90.30:14178.service: Deactivated successfully.
Sep 16 04:53:42.915052 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 16 04:53:42.918175 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 16 04:53:42.919550 systemd[1]: kubelet.service: Deactivated successfully.
Sep 16 04:53:42.919792 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 16 04:53:42.919832 systemd[1]: kubelet.service: Consumed 76ms CPU time, 98.4M memory peak.
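Editor's note: each containerd "Pulled image" entry above reports a compressed size and a wall-clock duration, so approximate pull throughput can be read straight off the log. A small illustrative helper (ours, not containerd's), using the etcd pull figures:

```python
def pull_throughput_mib_s(size_bytes: int, seconds: float) -> float:
    """Approximate registry pull rate in MiB/s from a containerd 'Pulled image' entry."""
    return size_bytes / seconds / (1024 * 1024)

# etcd:3.5.15-0 -- size "56909194" bytes in 1.831074291s (values from the log),
# roughly 29.6 MiB/s
rate = pull_throughput_mib_s(56909194, 1.831074291)
```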
Sep 16 04:53:42.921133 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 16 04:53:43.027990 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 16 04:53:43.036267 (kubelet)[2398]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 16 04:53:43.085771 kubelet[2398]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 16 04:53:43.086173 kubelet[2398]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 16 04:53:43.086232 kubelet[2398]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 16 04:53:43.086386 kubelet[2398]: I0916 04:53:43.086356 2398 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 16 04:53:43.395406 kubelet[2398]: I0916 04:53:43.395364 2398 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 16 04:53:43.395406 kubelet[2398]: I0916 04:53:43.395390 2398 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 16 04:53:43.395654 kubelet[2398]: I0916 04:53:43.395633 2398 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 16 04:53:43.425252 kubelet[2398]: I0916 04:53:43.424732 2398 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 16 04:53:43.427829 kubelet[2398]: E0916 04:53:43.427798 2398 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://157.180.68.84:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError"
Sep 16 04:53:43.440469 kubelet[2398]: I0916 04:53:43.440441 2398 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 16 04:53:43.444551 kubelet[2398]: I0916 04:53:43.444409 2398 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 16 04:53:43.446211 kubelet[2398]: I0916 04:53:43.446182 2398 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 16 04:53:43.446339 kubelet[2398]: I0916 04:53:43.446308 2398 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 16 04:53:43.446496 kubelet[2398]: I0916 04:53:43.446336 2398 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-n-06f2563e85","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 16 04:53:43.446593 kubelet[2398]: I0916 04:53:43.446498 2398 topology_manager.go:138] "Creating topology manager with none policy"
Sep 16 04:53:43.446593 kubelet[2398]: I0916 04:53:43.446506 2398 container_manager_linux.go:300] "Creating device plugin manager"
Sep 16 04:53:43.446593 kubelet[2398]: I0916 04:53:43.446590 2398 state_mem.go:36] "Initialized new in-memory state store"
Sep 16 04:53:43.449632 kubelet[2398]: I0916 04:53:43.449425 2398 kubelet.go:408] "Attempting to sync node with API server"
Sep 16 04:53:43.449632 kubelet[2398]: I0916 04:53:43.449446 2398 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 16 04:53:43.449632 kubelet[2398]: I0916 04:53:43.449473 2398 kubelet.go:314] "Adding apiserver pod source"
Sep 16 04:53:43.449632 kubelet[2398]: I0916 04:53:43.449490 2398 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 16 04:53:43.454958 kubelet[2398]: W0916 04:53:43.454920 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://157.180.68.84:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-n-06f2563e85&limit=500&resourceVersion=0": dial tcp 157.180.68.84:6443: connect: connection refused
Sep 16 04:53:43.455397 kubelet[2398]: E0916 04:53:43.455225 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://157.180.68.84:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-n-06f2563e85&limit=500&resourceVersion=0\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError"
Sep 16 04:53:43.455397 kubelet[2398]: I0916 04:53:43.455300 2398 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 16 04:53:43.458716 kubelet[2398]: I0916 04:53:43.458699 2398 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 16 04:53:43.460354 kubelet[2398]: W0916 04:53:43.459666 2398 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 16 04:53:43.462947 kubelet[2398]: I0916 04:53:43.462394 2398 server.go:1274] "Started kubelet"
Sep 16 04:53:43.463995 kubelet[2398]: W0916 04:53:43.463960 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://157.180.68.84:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 157.180.68.84:6443: connect: connection refused
Sep 16 04:53:43.464104 kubelet[2398]: E0916 04:53:43.464087 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://157.180.68.84:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError"
Sep 16 04:53:43.464233 kubelet[2398]: I0916 04:53:43.464200 2398 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 16 04:53:43.464592 kubelet[2398]: I0916 04:53:43.464563 2398 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 16 04:53:43.466823 kubelet[2398]: I0916 04:53:43.466752 2398 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 16 04:53:43.470483 kubelet[2398]: E0916 04:53:43.468948 2398 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.68.84:6443/api/v1/namespaces/default/events\": dial tcp 157.180.68.84:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-0-0-n-06f2563e85.1865aa407a2beec2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-0-0-n-06f2563e85,UID:ci-4459-0-0-n-06f2563e85,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-n-06f2563e85,},FirstTimestamp:2025-09-16 04:53:43.462375106 +0000 UTC m=+0.423326687,LastTimestamp:2025-09-16 04:53:43.462375106 +0000 UTC m=+0.423326687,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-n-06f2563e85,}"
Sep 16 04:53:43.473277 kubelet[2398]: I0916 04:53:43.472302 2398 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 16 04:53:43.474999 kubelet[2398]: I0916 04:53:43.474980 2398 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 16 04:53:43.475306 kubelet[2398]: I0916 04:53:43.475293 2398 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 16 04:53:43.475414 kubelet[2398]: I0916 04:53:43.475404 2398 reconciler.go:26] "Reconciler: start to sync state"
Sep 16 04:53:43.476216 kubelet[2398]: I0916 04:53:43.476199 2398 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 16 04:53:43.478958 kubelet[2398]: I0916 04:53:43.478946 2398 server.go:449] "Adding debug handlers to kubelet server"
Sep 16 04:53:43.480951 kubelet[2398]: E0916 04:53:43.476342 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found"
Sep 16 04:53:43.482354 kubelet[2398]: E0916 04:53:43.482295 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.68.84:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-n-06f2563e85?timeout=10s\": dial tcp 157.180.68.84:6443: connect: connection refused" interval="200ms"
Sep 16 04:53:43.482667 kubelet[2398]:
W0916 04:53:43.482603 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://157.180.68.84:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 157.180.68.84:6443: connect: connection refused Sep 16 04:53:43.482750 kubelet[2398]: E0916 04:53:43.482734 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://157.180.68.84:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:53:43.483262 kubelet[2398]: I0916 04:53:43.483247 2398 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:53:43.484030 kubelet[2398]: E0916 04:53:43.484017 2398 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 04:53:43.484466 kubelet[2398]: I0916 04:53:43.484454 2398 factory.go:221] Registration of the containerd container factory successfully Sep 16 04:53:43.484526 kubelet[2398]: I0916 04:53:43.484519 2398 factory.go:221] Registration of the systemd container factory successfully Sep 16 04:53:43.489571 kubelet[2398]: I0916 04:53:43.489551 2398 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 16 04:53:43.490660 kubelet[2398]: I0916 04:53:43.490648 2398 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 16 04:53:43.490727 kubelet[2398]: I0916 04:53:43.490720 2398 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 16 04:53:43.490781 kubelet[2398]: I0916 04:53:43.490774 2398 kubelet.go:2321] "Starting kubelet main sync loop" Sep 16 04:53:43.490858 kubelet[2398]: E0916 04:53:43.490840 2398 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:53:43.496593 kubelet[2398]: W0916 04:53:43.496563 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://157.180.68.84:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 157.180.68.84:6443: connect: connection refused Sep 16 04:53:43.496691 kubelet[2398]: E0916 04:53:43.496677 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://157.180.68.84:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:53:43.514148 kubelet[2398]: I0916 04:53:43.514124 2398 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 16 04:53:43.514148 kubelet[2398]: I0916 04:53:43.514137 2398 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 16 04:53:43.514148 kubelet[2398]: I0916 04:53:43.514150 2398 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:53:43.518407 kubelet[2398]: I0916 04:53:43.518380 2398 policy_none.go:49] "None policy: Start" Sep 16 04:53:43.518821 kubelet[2398]: I0916 04:53:43.518788 2398 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 16 04:53:43.518821 kubelet[2398]: I0916 04:53:43.518805 2398 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:53:43.525315 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Sep 16 04:53:43.538586 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 16 04:53:43.547198 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 16 04:53:43.549197 kubelet[2398]: I0916 04:53:43.549176 2398 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 04:53:43.549389 kubelet[2398]: I0916 04:53:43.549317 2398 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:53:43.549389 kubelet[2398]: I0916 04:53:43.549333 2398 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:53:43.551241 kubelet[2398]: I0916 04:53:43.551084 2398 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:53:43.552169 kubelet[2398]: E0916 04:53:43.552154 2398 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:43.601731 systemd[1]: Created slice kubepods-burstable-pod9cbdbb531defae468b627539a48488fc.slice - libcontainer container kubepods-burstable-pod9cbdbb531defae468b627539a48488fc.slice. Sep 16 04:53:43.623764 systemd[1]: Created slice kubepods-burstable-podaa99f62d6d8a0da3f377201a1a8e451d.slice - libcontainer container kubepods-burstable-podaa99f62d6d8a0da3f377201a1a8e451d.slice. Sep 16 04:53:43.627201 systemd[1]: Created slice kubepods-burstable-podc5ff0c6ee77b860cd738523b9cbe24c1.slice - libcontainer container kubepods-burstable-podc5ff0c6ee77b860cd738523b9cbe24c1.slice. 
Sep 16 04:53:43.653156 kubelet[2398]: I0916 04:53:43.653051 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.654260 kubelet[2398]: E0916 04:53:43.654212 2398 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://157.180.68.84:6443/api/v1/nodes\": dial tcp 157.180.68.84:6443: connect: connection refused" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.683890 kubelet[2398]: E0916 04:53:43.683835 2398 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.68.84:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-n-06f2563e85?timeout=10s\": dial tcp 157.180.68.84:6443: connect: connection refused" interval="400ms" Sep 16 04:53:43.777365 kubelet[2398]: I0916 04:53:43.777323 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777365 kubelet[2398]: I0916 04:53:43.777362 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-kubeconfig\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777365 kubelet[2398]: I0916 04:53:43.777383 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aa99f62d6d8a0da3f377201a1a8e451d-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-n-06f2563e85\" (UID: \"aa99f62d6d8a0da3f377201a1a8e451d\") " 
pod="kube-system/kube-scheduler-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777567 kubelet[2398]: I0916 04:53:43.777398 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9cbdbb531defae468b627539a48488fc-k8s-certs\") pod \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" (UID: \"9cbdbb531defae468b627539a48488fc\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777567 kubelet[2398]: I0916 04:53:43.777432 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777567 kubelet[2398]: I0916 04:53:43.777453 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777567 kubelet[2398]: I0916 04:53:43.777468 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9cbdbb531defae468b627539a48488fc-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" (UID: \"9cbdbb531defae468b627539a48488fc\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777567 kubelet[2398]: I0916 04:53:43.777483 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/9cbdbb531defae468b627539a48488fc-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" (UID: \"9cbdbb531defae468b627539a48488fc\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.777788 kubelet[2398]: I0916 04:53:43.777500 2398 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.855805 kubelet[2398]: I0916 04:53:43.855776 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.856233 kubelet[2398]: E0916 04:53:43.856202 2398 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://157.180.68.84:6443/api/v1/nodes\": dial tcp 157.180.68.84:6443: connect: connection refused" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:43.921586 containerd[1564]: time="2025-09-16T04:53:43.921477142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-n-06f2563e85,Uid:9cbdbb531defae468b627539a48488fc,Namespace:kube-system,Attempt:0,}" Sep 16 04:53:43.928919 containerd[1564]: time="2025-09-16T04:53:43.928872603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-n-06f2563e85,Uid:aa99f62d6d8a0da3f377201a1a8e451d,Namespace:kube-system,Attempt:0,}" Sep 16 04:53:43.929752 containerd[1564]: time="2025-09-16T04:53:43.929723022Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-n-06f2563e85,Uid:c5ff0c6ee77b860cd738523b9cbe24c1,Namespace:kube-system,Attempt:0,}" Sep 16 04:53:44.084701 kubelet[2398]: E0916 04:53:44.084646 2398 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://157.180.68.84:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-n-06f2563e85?timeout=10s\": dial tcp 157.180.68.84:6443: connect: connection refused" interval="800ms" Sep 16 04:53:44.167560 containerd[1564]: time="2025-09-16T04:53:44.167500688Z" level=info msg="connecting to shim ea975ff57baeb13fb695e7958183bb6c1ac026823a2333e5f085b1a7a5dbd8ac" address="unix:///run/containerd/s/64cd60305cd1ea5d1831325534ac088f0eec4a25c1d030bbb0baaf53d02ce4c9" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:53:44.170130 containerd[1564]: time="2025-09-16T04:53:44.170080587Z" level=info msg="connecting to shim 47095e6f3e19b92e3fa785b824ccf99541ecd32d0ffdee3dacb98bc96b525765" address="unix:///run/containerd/s/f2475380930b1d2a953307ec563427154262633410236082587f452d67c9510f" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:53:44.192288 containerd[1564]: time="2025-09-16T04:53:44.192031603Z" level=info msg="connecting to shim 838f4b90a2089d122c19836505665799b1275cbd3d2a0c6562a596098cfc1239" address="unix:///run/containerd/s/7ffcca86d9551cb94c1c65e1e34896acf1964a7af588d5c1cafab7d14969cfa7" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:53:44.254042 systemd[1]: Started cri-containerd-47095e6f3e19b92e3fa785b824ccf99541ecd32d0ffdee3dacb98bc96b525765.scope - libcontainer container 47095e6f3e19b92e3fa785b824ccf99541ecd32d0ffdee3dacb98bc96b525765. Sep 16 04:53:44.259163 systemd[1]: Started cri-containerd-838f4b90a2089d122c19836505665799b1275cbd3d2a0c6562a596098cfc1239.scope - libcontainer container 838f4b90a2089d122c19836505665799b1275cbd3d2a0c6562a596098cfc1239. Sep 16 04:53:44.261022 systemd[1]: Started cri-containerd-ea975ff57baeb13fb695e7958183bb6c1ac026823a2333e5f085b1a7a5dbd8ac.scope - libcontainer container ea975ff57baeb13fb695e7958183bb6c1ac026823a2333e5f085b1a7a5dbd8ac. 
Sep 16 04:53:44.267884 kubelet[2398]: I0916 04:53:44.267658 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:44.269964 kubelet[2398]: E0916 04:53:44.269736 2398 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://157.180.68.84:6443/api/v1/nodes\": dial tcp 157.180.68.84:6443: connect: connection refused" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:44.313502 containerd[1564]: time="2025-09-16T04:53:44.313321922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-n-06f2563e85,Uid:aa99f62d6d8a0da3f377201a1a8e451d,Namespace:kube-system,Attempt:0,} returns sandbox id \"47095e6f3e19b92e3fa785b824ccf99541ecd32d0ffdee3dacb98bc96b525765\"" Sep 16 04:53:44.322911 containerd[1564]: time="2025-09-16T04:53:44.322422522Z" level=info msg="CreateContainer within sandbox \"47095e6f3e19b92e3fa785b824ccf99541ecd32d0ffdee3dacb98bc96b525765\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 16 04:53:44.352065 containerd[1564]: time="2025-09-16T04:53:44.352013459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-n-06f2563e85,Uid:c5ff0c6ee77b860cd738523b9cbe24c1,Namespace:kube-system,Attempt:0,} returns sandbox id \"838f4b90a2089d122c19836505665799b1275cbd3d2a0c6562a596098cfc1239\"" Sep 16 04:53:44.355203 containerd[1564]: time="2025-09-16T04:53:44.355166147Z" level=info msg="CreateContainer within sandbox \"838f4b90a2089d122c19836505665799b1275cbd3d2a0c6562a596098cfc1239\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 16 04:53:44.382982 containerd[1564]: time="2025-09-16T04:53:44.382947062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-n-06f2563e85,Uid:9cbdbb531defae468b627539a48488fc,Namespace:kube-system,Attempt:0,} returns sandbox id \"ea975ff57baeb13fb695e7958183bb6c1ac026823a2333e5f085b1a7a5dbd8ac\"" Sep 16 
04:53:44.384783 containerd[1564]: time="2025-09-16T04:53:44.384747776Z" level=info msg="CreateContainer within sandbox \"ea975ff57baeb13fb695e7958183bb6c1ac026823a2333e5f085b1a7a5dbd8ac\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 16 04:53:44.405008 containerd[1564]: time="2025-09-16T04:53:44.404965663Z" level=info msg="Container cca651983fc8cdb78f71921efc1a2ed60926dd3096dc7997bea45105d090ef00: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:53:44.408393 kubelet[2398]: W0916 04:53:44.408335 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://157.180.68.84:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 157.180.68.84:6443: connect: connection refused Sep 16 04:53:44.408471 kubelet[2398]: E0916 04:53:44.408404 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://157.180.68.84:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:53:44.435852 containerd[1564]: time="2025-09-16T04:53:44.435806985Z" level=info msg="Container b280f6df89c692e299f0c755c0dced5ebc9f80a0edbfe72447e56991dfd479b4: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:53:44.456886 containerd[1564]: time="2025-09-16T04:53:44.456671391Z" level=info msg="Container 311182d9afd43c0c1714655ffcf6a4f045da64cea62fc43ab4b57d86eed03df4: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:53:44.464646 kubelet[2398]: W0916 04:53:44.464541 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://157.180.68.84:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-n-06f2563e85&limit=500&resourceVersion=0": dial tcp 157.180.68.84:6443: connect: connection refused Sep 16 04:53:44.464646 
kubelet[2398]: E0916 04:53:44.464626 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://157.180.68.84:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-n-06f2563e85&limit=500&resourceVersion=0\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:53:44.469807 containerd[1564]: time="2025-09-16T04:53:44.469772963Z" level=info msg="CreateContainer within sandbox \"47095e6f3e19b92e3fa785b824ccf99541ecd32d0ffdee3dacb98bc96b525765\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"cca651983fc8cdb78f71921efc1a2ed60926dd3096dc7997bea45105d090ef00\"" Sep 16 04:53:44.470403 containerd[1564]: time="2025-09-16T04:53:44.470372594Z" level=info msg="StartContainer for \"cca651983fc8cdb78f71921efc1a2ed60926dd3096dc7997bea45105d090ef00\"" Sep 16 04:53:44.471339 containerd[1564]: time="2025-09-16T04:53:44.471311918Z" level=info msg="connecting to shim cca651983fc8cdb78f71921efc1a2ed60926dd3096dc7997bea45105d090ef00" address="unix:///run/containerd/s/f2475380930b1d2a953307ec563427154262633410236082587f452d67c9510f" protocol=ttrpc version=3 Sep 16 04:53:44.481246 containerd[1564]: time="2025-09-16T04:53:44.481145918Z" level=info msg="CreateContainer within sandbox \"838f4b90a2089d122c19836505665799b1275cbd3d2a0c6562a596098cfc1239\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b280f6df89c692e299f0c755c0dced5ebc9f80a0edbfe72447e56991dfd479b4\"" Sep 16 04:53:44.482038 containerd[1564]: time="2025-09-16T04:53:44.482018037Z" level=info msg="StartContainer for \"b280f6df89c692e299f0c755c0dced5ebc9f80a0edbfe72447e56991dfd479b4\"" Sep 16 04:53:44.484747 kubelet[2398]: W0916 04:53:44.484692 2398 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://157.180.68.84:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 157.180.68.84:6443: connect: connection refused Sep 16 04:53:44.484807 kubelet[2398]: E0916 04:53:44.484759 2398 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://157.180.68.84:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.68.84:6443: connect: connection refused" logger="UnhandledError" Sep 16 04:53:44.484964 containerd[1564]: time="2025-09-16T04:53:44.484929777Z" level=info msg="connecting to shim b280f6df89c692e299f0c755c0dced5ebc9f80a0edbfe72447e56991dfd479b4" address="unix:///run/containerd/s/7ffcca86d9551cb94c1c65e1e34896acf1964a7af588d5c1cafab7d14969cfa7" protocol=ttrpc version=3 Sep 16 04:53:44.487838 containerd[1564]: time="2025-09-16T04:53:44.487805568Z" level=info msg="CreateContainer within sandbox \"ea975ff57baeb13fb695e7958183bb6c1ac026823a2333e5f085b1a7a5dbd8ac\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"311182d9afd43c0c1714655ffcf6a4f045da64cea62fc43ab4b57d86eed03df4\"" Sep 16 04:53:44.488309 containerd[1564]: time="2025-09-16T04:53:44.488273493Z" level=info msg="StartContainer for \"311182d9afd43c0c1714655ffcf6a4f045da64cea62fc43ab4b57d86eed03df4\"" Sep 16 04:53:44.489069 systemd[1]: Started cri-containerd-cca651983fc8cdb78f71921efc1a2ed60926dd3096dc7997bea45105d090ef00.scope - libcontainer container cca651983fc8cdb78f71921efc1a2ed60926dd3096dc7997bea45105d090ef00. 
Sep 16 04:53:44.489157 containerd[1564]: time="2025-09-16T04:53:44.489119102Z" level=info msg="connecting to shim 311182d9afd43c0c1714655ffcf6a4f045da64cea62fc43ab4b57d86eed03df4" address="unix:///run/containerd/s/64cd60305cd1ea5d1831325534ac088f0eec4a25c1d030bbb0baaf53d02ce4c9" protocol=ttrpc version=3 Sep 16 04:53:44.506071 systemd[1]: Started cri-containerd-b280f6df89c692e299f0c755c0dced5ebc9f80a0edbfe72447e56991dfd479b4.scope - libcontainer container b280f6df89c692e299f0c755c0dced5ebc9f80a0edbfe72447e56991dfd479b4. Sep 16 04:53:44.521218 systemd[1]: Started cri-containerd-311182d9afd43c0c1714655ffcf6a4f045da64cea62fc43ab4b57d86eed03df4.scope - libcontainer container 311182d9afd43c0c1714655ffcf6a4f045da64cea62fc43ab4b57d86eed03df4. Sep 16 04:53:44.577105 containerd[1564]: time="2025-09-16T04:53:44.577007940Z" level=info msg="StartContainer for \"cca651983fc8cdb78f71921efc1a2ed60926dd3096dc7997bea45105d090ef00\" returns successfully" Sep 16 04:53:44.595360 containerd[1564]: time="2025-09-16T04:53:44.595322981Z" level=info msg="StartContainer for \"b280f6df89c692e299f0c755c0dced5ebc9f80a0edbfe72447e56991dfd479b4\" returns successfully" Sep 16 04:53:44.614715 containerd[1564]: time="2025-09-16T04:53:44.614666345Z" level=info msg="StartContainer for \"311182d9afd43c0c1714655ffcf6a4f045da64cea62fc43ab4b57d86eed03df4\" returns successfully" Sep 16 04:53:45.072305 kubelet[2398]: I0916 04:53:45.072279 2398 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:45.897153 kubelet[2398]: E0916 04:53:45.897068 2398 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-0-0-n-06f2563e85\" not found" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:45.971741 kubelet[2398]: I0916 04:53:45.971471 2398 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:45.971741 kubelet[2398]: E0916 04:53:45.971499 2398 kubelet_node_status.go:535] 
"Error updating node status, will retry" err="error getting node \"ci-4459-0-0-n-06f2563e85\": node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.000761 kubelet[2398]: E0916 04:53:46.000730 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.101293 kubelet[2398]: E0916 04:53:46.101253 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.201964 kubelet[2398]: E0916 04:53:46.201842 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.302565 kubelet[2398]: E0916 04:53:46.302517 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.403266 kubelet[2398]: E0916 04:53:46.403226 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.503811 kubelet[2398]: E0916 04:53:46.503750 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.604364 kubelet[2398]: E0916 04:53:46.604317 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:46.704922 kubelet[2398]: E0916 04:53:46.704848 2398 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:47.464851 kubelet[2398]: I0916 04:53:47.464794 2398 apiserver.go:52] "Watching apiserver" Sep 16 04:53:47.476121 kubelet[2398]: I0916 04:53:47.476082 2398 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 16 04:53:47.984263 systemd[1]: Reload requested from client PID 2666 ('systemctl') (unit 
session-7.scope)... Sep 16 04:53:47.984282 systemd[1]: Reloading... Sep 16 04:53:48.057034 zram_generator::config[2707]: No configuration found. Sep 16 04:53:48.241802 systemd[1]: Reloading finished in 257 ms. Sep 16 04:53:48.267229 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:53:48.273110 systemd[1]: kubelet.service: Deactivated successfully. Sep 16 04:53:48.273282 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:53:48.273321 systemd[1]: kubelet.service: Consumed 659ms CPU time, 126.9M memory peak. Sep 16 04:53:48.275007 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:53:48.461861 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:53:48.470141 (kubelet)[2761]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 04:53:48.527850 kubelet[2761]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:53:48.527850 kubelet[2761]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 16 04:53:48.527850 kubelet[2761]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 16 04:53:48.528226 kubelet[2761]: I0916 04:53:48.527845 2761 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 04:53:48.535824 kubelet[2761]: I0916 04:53:48.535795 2761 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 16 04:53:48.536106 kubelet[2761]: I0916 04:53:48.536076 2761 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 04:53:48.536401 kubelet[2761]: I0916 04:53:48.536380 2761 server.go:934] "Client rotation is on, will bootstrap in background" Sep 16 04:53:48.538271 kubelet[2761]: I0916 04:53:48.537942 2761 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 16 04:53:48.563580 kubelet[2761]: I0916 04:53:48.563549 2761 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:53:48.571072 kubelet[2761]: I0916 04:53:48.571048 2761 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 04:53:48.575539 kubelet[2761]: I0916 04:53:48.575507 2761 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 16 04:53:48.575662 kubelet[2761]: I0916 04:53:48.575637 2761 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 16 04:53:48.575758 kubelet[2761]: I0916 04:53:48.575727 2761 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 04:53:48.575983 kubelet[2761]: I0916 04:53:48.575756 2761 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-n-06f2563e85","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 04:53:48.576095 kubelet[2761]: I0916 04:53:48.575985 2761 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 04:53:48.576095 kubelet[2761]: I0916 04:53:48.575996 2761 container_manager_linux.go:300] "Creating device plugin manager" Sep 16 04:53:48.576095 kubelet[2761]: I0916 04:53:48.576023 2761 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:53:48.576162 kubelet[2761]: I0916 04:53:48.576103 2761 kubelet.go:408] "Attempting to sync node with API server" Sep 16 04:53:48.576162 kubelet[2761]: I0916 04:53:48.576113 2761 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 04:53:48.576162 kubelet[2761]: I0916 04:53:48.576134 2761 kubelet.go:314] "Adding apiserver pod source" Sep 16 04:53:48.576162 kubelet[2761]: I0916 04:53:48.576142 2761 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 04:53:48.578970 kubelet[2761]: I0916 04:53:48.578935 2761 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 04:53:48.579377 kubelet[2761]: I0916 04:53:48.579362 2761 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 16 04:53:48.579752 kubelet[2761]: I0916 04:53:48.579739 2761 server.go:1274] "Started kubelet" Sep 16 04:53:48.581546 kubelet[2761]: I0916 04:53:48.581532 2761 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 04:53:48.582181 kubelet[2761]: I0916 04:53:48.582141 2761 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 04:53:48.586458 kubelet[2761]: I0916 04:53:48.586429 2761 server.go:449] "Adding debug handlers to kubelet server" Sep 16 04:53:48.588922 kubelet[2761]: I0916 04:53:48.587098 2761 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 04:53:48.588922 kubelet[2761]: I0916 04:53:48.587339 2761 
server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 04:53:48.588922 kubelet[2761]: I0916 04:53:48.587728 2761 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 04:53:48.592018 kubelet[2761]: I0916 04:53:48.592003 2761 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 16 04:53:48.592263 kubelet[2761]: E0916 04:53:48.592248 2761 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-06f2563e85\" not found" Sep 16 04:53:48.593804 kubelet[2761]: I0916 04:53:48.592644 2761 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 16 04:53:48.595934 kubelet[2761]: I0916 04:53:48.595059 2761 reconciler.go:26] "Reconciler: start to sync state" Sep 16 04:53:48.596251 kubelet[2761]: I0916 04:53:48.596081 2761 factory.go:221] Registration of the systemd container factory successfully Sep 16 04:53:48.596416 kubelet[2761]: I0916 04:53:48.596393 2761 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:53:48.608555 kubelet[2761]: I0916 04:53:48.608324 2761 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 16 04:53:48.608776 kubelet[2761]: I0916 04:53:48.608757 2761 factory.go:221] Registration of the containerd container factory successfully Sep 16 04:53:48.609846 kubelet[2761]: I0916 04:53:48.609815 2761 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 16 04:53:48.609846 kubelet[2761]: I0916 04:53:48.609837 2761 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 16 04:53:48.610043 kubelet[2761]: I0916 04:53:48.610016 2761 kubelet.go:2321] "Starting kubelet main sync loop" Sep 16 04:53:48.610220 kubelet[2761]: E0916 04:53:48.610068 2761 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:53:48.656655 kubelet[2761]: I0916 04:53:48.656387 2761 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 16 04:53:48.656655 kubelet[2761]: I0916 04:53:48.656401 2761 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 16 04:53:48.656655 kubelet[2761]: I0916 04:53:48.656416 2761 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:53:48.656655 kubelet[2761]: I0916 04:53:48.656548 2761 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 16 04:53:48.656655 kubelet[2761]: I0916 04:53:48.656557 2761 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 16 04:53:48.656655 kubelet[2761]: I0916 04:53:48.656577 2761 policy_none.go:49] "None policy: Start" Sep 16 04:53:48.657562 kubelet[2761]: I0916 04:53:48.657547 2761 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 16 04:53:48.657592 kubelet[2761]: I0916 04:53:48.657570 2761 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:53:48.658112 kubelet[2761]: I0916 04:53:48.658080 2761 state_mem.go:75] "Updated machine memory state" Sep 16 04:53:48.662442 kubelet[2761]: I0916 04:53:48.662427 2761 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 16 04:53:48.662825 kubelet[2761]: I0916 04:53:48.662774 2761 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:53:48.662945 kubelet[2761]: I0916 04:53:48.662914 2761 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:53:48.663865 kubelet[2761]: I0916 04:53:48.663762 2761 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:53:48.718769 kubelet[2761]: E0916 04:53:48.718737 2761 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" already exists" pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.766747 kubelet[2761]: I0916 04:53:48.766396 2761 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.773508 kubelet[2761]: I0916 04:53:48.773458 2761 kubelet_node_status.go:111] "Node was previously registered" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.773584 kubelet[2761]: I0916 04:53:48.773559 2761 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898405 kubelet[2761]: I0916 04:53:48.898256 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898405 kubelet[2761]: I0916 04:53:48.898301 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898405 kubelet[2761]: I0916 04:53:48.898317 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-kubeconfig\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898405 kubelet[2761]: I0916 04:53:48.898339 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898405 kubelet[2761]: I0916 04:53:48.898355 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/aa99f62d6d8a0da3f377201a1a8e451d-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-n-06f2563e85\" (UID: \"aa99f62d6d8a0da3f377201a1a8e451d\") " pod="kube-system/kube-scheduler-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898702 kubelet[2761]: I0916 04:53:48.898375 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/9cbdbb531defae468b627539a48488fc-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" (UID: \"9cbdbb531defae468b627539a48488fc\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898702 kubelet[2761]: I0916 04:53:48.898388 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/9cbdbb531defae468b627539a48488fc-k8s-certs\") pod \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" (UID: \"9cbdbb531defae468b627539a48488fc\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898702 kubelet[2761]: I0916 04:53:48.898404 
2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/9cbdbb531defae468b627539a48488fc-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" (UID: \"9cbdbb531defae468b627539a48488fc\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:48.898702 kubelet[2761]: I0916 04:53:48.898420 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c5ff0c6ee77b860cd738523b9cbe24c1-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-n-06f2563e85\" (UID: \"c5ff0c6ee77b860cd738523b9cbe24c1\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:49.584795 kubelet[2761]: I0916 04:53:49.584759 2761 apiserver.go:52] "Watching apiserver" Sep 16 04:53:49.597214 kubelet[2761]: I0916 04:53:49.597175 2761 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 16 04:53:49.646927 kubelet[2761]: E0916 04:53:49.646616 2761 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4459-0-0-n-06f2563e85\" already exists" pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" Sep 16 04:53:49.671254 kubelet[2761]: I0916 04:53:49.671158 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-0-0-n-06f2563e85" podStartSLOduration=1.670930513 podStartE2EDuration="1.670930513s" podCreationTimestamp="2025-09-16 04:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:53:49.662530986 +0000 UTC m=+1.186781617" watchObservedRunningTime="2025-09-16 04:53:49.670930513 +0000 UTC m=+1.195181153" Sep 16 04:53:49.679093 kubelet[2761]: I0916 04:53:49.679046 2761 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-0-0-n-06f2563e85" podStartSLOduration=2.679027765 podStartE2EDuration="2.679027765s" podCreationTimestamp="2025-09-16 04:53:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:53:49.67158163 +0000 UTC m=+1.195832261" watchObservedRunningTime="2025-09-16 04:53:49.679027765 +0000 UTC m=+1.203278405" Sep 16 04:53:49.679238 kubelet[2761]: I0916 04:53:49.679130 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-0-0-n-06f2563e85" podStartSLOduration=1.679124475 podStartE2EDuration="1.679124475s" podCreationTimestamp="2025-09-16 04:53:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:53:49.678615804 +0000 UTC m=+1.202866444" watchObservedRunningTime="2025-09-16 04:53:49.679124475 +0000 UTC m=+1.203375125" Sep 16 04:53:54.310924 update_engine[1534]: I20250916 04:53:54.310231 1534 update_attempter.cc:509] Updating boot flags... Sep 16 04:53:54.352924 kubelet[2761]: I0916 04:53:54.351163 2761 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 16 04:53:54.353323 containerd[1564]: time="2025-09-16T04:53:54.352424312Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 16 04:53:54.355268 kubelet[2761]: I0916 04:53:54.354310 2761 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 16 04:53:55.148375 systemd[1]: Created slice kubepods-besteffort-podd2f32e9e_88de_416a_9a83_042054519dc0.slice - libcontainer container kubepods-besteffort-podd2f32e9e_88de_416a_9a83_042054519dc0.slice. 
Sep 16 04:53:55.239178 kubelet[2761]: I0916 04:53:55.239113 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhwn\" (UniqueName: \"kubernetes.io/projected/d2f32e9e-88de-416a-9a83-042054519dc0-kube-api-access-xwhwn\") pod \"kube-proxy-wwtpd\" (UID: \"d2f32e9e-88de-416a-9a83-042054519dc0\") " pod="kube-system/kube-proxy-wwtpd"
Sep 16 04:53:55.239327 kubelet[2761]: I0916 04:53:55.239190 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d2f32e9e-88de-416a-9a83-042054519dc0-kube-proxy\") pod \"kube-proxy-wwtpd\" (UID: \"d2f32e9e-88de-416a-9a83-042054519dc0\") " pod="kube-system/kube-proxy-wwtpd"
Sep 16 04:53:55.239327 kubelet[2761]: I0916 04:53:55.239220 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d2f32e9e-88de-416a-9a83-042054519dc0-xtables-lock\") pod \"kube-proxy-wwtpd\" (UID: \"d2f32e9e-88de-416a-9a83-042054519dc0\") " pod="kube-system/kube-proxy-wwtpd"
Sep 16 04:53:55.239327 kubelet[2761]: I0916 04:53:55.239241 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d2f32e9e-88de-416a-9a83-042054519dc0-lib-modules\") pod \"kube-proxy-wwtpd\" (UID: \"d2f32e9e-88de-416a-9a83-042054519dc0\") " pod="kube-system/kube-proxy-wwtpd"
Sep 16 04:53:55.274468 systemd[1]: Created slice kubepods-besteffort-podc4d3d4cf_eb77_49cc_aedd_635009e60f8f.slice - libcontainer container kubepods-besteffort-podc4d3d4cf_eb77_49cc_aedd_635009e60f8f.slice.
Sep 16 04:53:55.339650 kubelet[2761]: I0916 04:53:55.339582 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfn2\" (UniqueName: \"kubernetes.io/projected/c4d3d4cf-eb77-49cc-aedd-635009e60f8f-kube-api-access-vbfn2\") pod \"tigera-operator-58fc44c59b-rtm26\" (UID: \"c4d3d4cf-eb77-49cc-aedd-635009e60f8f\") " pod="tigera-operator/tigera-operator-58fc44c59b-rtm26"
Sep 16 04:53:55.339786 kubelet[2761]: I0916 04:53:55.339662 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c4d3d4cf-eb77-49cc-aedd-635009e60f8f-var-lib-calico\") pod \"tigera-operator-58fc44c59b-rtm26\" (UID: \"c4d3d4cf-eb77-49cc-aedd-635009e60f8f\") " pod="tigera-operator/tigera-operator-58fc44c59b-rtm26"
Sep 16 04:53:55.460963 containerd[1564]: time="2025-09-16T04:53:55.460792798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wwtpd,Uid:d2f32e9e-88de-416a-9a83-042054519dc0,Namespace:kube-system,Attempt:0,}"
Sep 16 04:53:55.491402 containerd[1564]: time="2025-09-16T04:53:55.491341894Z" level=info msg="connecting to shim 9610d714ab052dcca82704100cfa3fa30c87445e5472353055c6ca86dea0f56d" address="unix:///run/containerd/s/66b211167377f23a21772a4c666994624f5462ded95da18d593bd14e9ba04a1b" namespace=k8s.io protocol=ttrpc version=3
Sep 16 04:53:55.527125 systemd[1]: Started cri-containerd-9610d714ab052dcca82704100cfa3fa30c87445e5472353055c6ca86dea0f56d.scope - libcontainer container 9610d714ab052dcca82704100cfa3fa30c87445e5472353055c6ca86dea0f56d.
Sep 16 04:53:55.559238 containerd[1564]: time="2025-09-16T04:53:55.559153821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-wwtpd,Uid:d2f32e9e-88de-416a-9a83-042054519dc0,Namespace:kube-system,Attempt:0,} returns sandbox id \"9610d714ab052dcca82704100cfa3fa30c87445e5472353055c6ca86dea0f56d\""
Sep 16 04:53:55.564267 containerd[1564]: time="2025-09-16T04:53:55.564211898Z" level=info msg="CreateContainer within sandbox \"9610d714ab052dcca82704100cfa3fa30c87445e5472353055c6ca86dea0f56d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 16 04:53:55.582542 containerd[1564]: time="2025-09-16T04:53:55.582219521Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-rtm26,Uid:c4d3d4cf-eb77-49cc-aedd-635009e60f8f,Namespace:tigera-operator,Attempt:0,}"
Sep 16 04:53:55.585163 containerd[1564]: time="2025-09-16T04:53:55.585123286Z" level=info msg="Container d27e635a2ddfff272f23c8fafec9155799798dfdd179868fd61579f495b8b352: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:53:55.610484 containerd[1564]: time="2025-09-16T04:53:55.610421735Z" level=info msg="CreateContainer within sandbox \"9610d714ab052dcca82704100cfa3fa30c87445e5472353055c6ca86dea0f56d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d27e635a2ddfff272f23c8fafec9155799798dfdd179868fd61579f495b8b352\""
Sep 16 04:53:55.611197 containerd[1564]: time="2025-09-16T04:53:55.611121434Z" level=info msg="StartContainer for \"d27e635a2ddfff272f23c8fafec9155799798dfdd179868fd61579f495b8b352\""
Sep 16 04:53:55.614074 containerd[1564]: time="2025-09-16T04:53:55.614022945Z" level=info msg="connecting to shim d27e635a2ddfff272f23c8fafec9155799798dfdd179868fd61579f495b8b352" address="unix:///run/containerd/s/66b211167377f23a21772a4c666994624f5462ded95da18d593bd14e9ba04a1b" protocol=ttrpc version=3
Sep 16 04:53:55.637130 systemd[1]: Started cri-containerd-d27e635a2ddfff272f23c8fafec9155799798dfdd179868fd61579f495b8b352.scope - libcontainer container d27e635a2ddfff272f23c8fafec9155799798dfdd179868fd61579f495b8b352.
Sep 16 04:53:55.644338 containerd[1564]: time="2025-09-16T04:53:55.644267641Z" level=info msg="connecting to shim b332df4aef054432efb7348518a3fccd99e1c5f6fefd6e6841da9bd66178fab6" address="unix:///run/containerd/s/e7ce15c7230aa87b3ce0c200e85974dc0717fd1a80dc66dc2a786db336f615d5" namespace=k8s.io protocol=ttrpc version=3
Sep 16 04:53:55.681144 systemd[1]: Started cri-containerd-b332df4aef054432efb7348518a3fccd99e1c5f6fefd6e6841da9bd66178fab6.scope - libcontainer container b332df4aef054432efb7348518a3fccd99e1c5f6fefd6e6841da9bd66178fab6.
Sep 16 04:53:55.722166 containerd[1564]: time="2025-09-16T04:53:55.721712427Z" level=info msg="StartContainer for \"d27e635a2ddfff272f23c8fafec9155799798dfdd179868fd61579f495b8b352\" returns successfully"
Sep 16 04:53:55.748550 containerd[1564]: time="2025-09-16T04:53:55.748500805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-rtm26,Uid:c4d3d4cf-eb77-49cc-aedd-635009e60f8f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"b332df4aef054432efb7348518a3fccd99e1c5f6fefd6e6841da9bd66178fab6\""
Sep 16 04:53:55.752135 containerd[1564]: time="2025-09-16T04:53:55.752096145Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 16 04:53:56.356875 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2648665600.mount: Deactivated successfully.
Sep 16 04:53:56.669718 kubelet[2761]: I0916 04:53:56.669608 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-wwtpd" podStartSLOduration=1.669590723 podStartE2EDuration="1.669590723s" podCreationTimestamp="2025-09-16 04:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:53:56.668560716 +0000 UTC m=+8.192811345" watchObservedRunningTime="2025-09-16 04:53:56.669590723 +0000 UTC m=+8.193841352"
Sep 16 04:53:57.805754 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount167032662.mount: Deactivated successfully.
Sep 16 04:53:58.307586 containerd[1564]: time="2025-09-16T04:53:58.307523039Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:58.311242 containerd[1564]: time="2025-09-16T04:53:58.311075460Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=25062609"
Sep 16 04:53:58.314294 containerd[1564]: time="2025-09-16T04:53:58.314252980Z" level=info msg="ImageCreate event name:\"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:58.320936 containerd[1564]: time="2025-09-16T04:53:58.320539730Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:53:58.321387 containerd[1564]: time="2025-09-16T04:53:58.321354616Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"25058604\" in 2.569090497s"
Sep 16 04:53:58.321468 containerd[1564]: time="2025-09-16T04:53:58.321454522Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:1911afdd8478c6ca3036ff85614050d5d19acc0f0c3f6a5a7b3e34b38dd309c9\""
Sep 16 04:53:58.324109 containerd[1564]: time="2025-09-16T04:53:58.324063798Z" level=info msg="CreateContainer within sandbox \"b332df4aef054432efb7348518a3fccd99e1c5f6fefd6e6841da9bd66178fab6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 16 04:53:58.346432 containerd[1564]: time="2025-09-16T04:53:58.346392783Z" level=info msg="Container 1d0dcbfd14ab66f4b4d78c7e88b77409f83cec3e9c04abb6c06623f1494f13c8: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:53:58.364866 containerd[1564]: time="2025-09-16T04:53:58.364817979Z" level=info msg="CreateContainer within sandbox \"b332df4aef054432efb7348518a3fccd99e1c5f6fefd6e6841da9bd66178fab6\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1d0dcbfd14ab66f4b4d78c7e88b77409f83cec3e9c04abb6c06623f1494f13c8\""
Sep 16 04:53:58.365636 containerd[1564]: time="2025-09-16T04:53:58.365582610Z" level=info msg="StartContainer for \"1d0dcbfd14ab66f4b4d78c7e88b77409f83cec3e9c04abb6c06623f1494f13c8\""
Sep 16 04:53:58.366526 containerd[1564]: time="2025-09-16T04:53:58.366490421Z" level=info msg="connecting to shim 1d0dcbfd14ab66f4b4d78c7e88b77409f83cec3e9c04abb6c06623f1494f13c8" address="unix:///run/containerd/s/e7ce15c7230aa87b3ce0c200e85974dc0717fd1a80dc66dc2a786db336f615d5" protocol=ttrpc version=3
Sep 16 04:53:58.390142 systemd[1]: Started cri-containerd-1d0dcbfd14ab66f4b4d78c7e88b77409f83cec3e9c04abb6c06623f1494f13c8.scope - libcontainer container 1d0dcbfd14ab66f4b4d78c7e88b77409f83cec3e9c04abb6c06623f1494f13c8.
Sep 16 04:53:58.428919 containerd[1564]: time="2025-09-16T04:53:58.428865521Z" level=info msg="StartContainer for \"1d0dcbfd14ab66f4b4d78c7e88b77409f83cec3e9c04abb6c06623f1494f13c8\" returns successfully"
Sep 16 04:53:58.681669 kubelet[2761]: I0916 04:53:58.681160 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-rtm26" podStartSLOduration=1.109228207 podStartE2EDuration="3.681145315s" podCreationTimestamp="2025-09-16 04:53:55 +0000 UTC" firstStartedPulling="2025-09-16 04:53:55.750364132 +0000 UTC m=+7.274614772" lastFinishedPulling="2025-09-16 04:53:58.32228124 +0000 UTC m=+9.846531880" observedRunningTime="2025-09-16 04:53:58.673521351 +0000 UTC m=+10.197771981" watchObservedRunningTime="2025-09-16 04:53:58.681145315 +0000 UTC m=+10.205395945"
Sep 16 04:54:03.962472 sudo[1829]: pam_unix(sudo:session): session closed for user root
Sep 16 04:54:04.120471 sshd[1826]: Connection closed by 139.178.89.65 port 38502
Sep 16 04:54:04.122768 sshd-session[1823]: pam_unix(sshd:session): session closed for user core
Sep 16 04:54:04.127539 systemd[1]: sshd@7-157.180.68.84:22-139.178.89.65:38502.service: Deactivated successfully.
Sep 16 04:54:04.132371 systemd[1]: session-7.scope: Deactivated successfully.
Sep 16 04:54:04.132971 systemd[1]: session-7.scope: Consumed 3.668s CPU time, 162.8M memory peak.
Sep 16 04:54:04.136847 systemd-logind[1530]: Session 7 logged out. Waiting for processes to exit.
Sep 16 04:54:04.137954 systemd-logind[1530]: Removed session 7.
Sep 16 04:54:07.297874 systemd[1]: Created slice kubepods-besteffort-podb86b3b09_9475_44fe_a4ee_837815bb2c4b.slice - libcontainer container kubepods-besteffort-podb86b3b09_9475_44fe_a4ee_837815bb2c4b.slice.
Sep 16 04:54:07.419577 kubelet[2761]: I0916 04:54:07.419491 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b86b3b09-9475-44fe-a4ee-837815bb2c4b-typha-certs\") pod \"calico-typha-68777b76f5-l67dm\" (UID: \"b86b3b09-9475-44fe-a4ee-837815bb2c4b\") " pod="calico-system/calico-typha-68777b76f5-l67dm"
Sep 16 04:54:07.419577 kubelet[2761]: I0916 04:54:07.419532 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86b3b09-9475-44fe-a4ee-837815bb2c4b-tigera-ca-bundle\") pod \"calico-typha-68777b76f5-l67dm\" (UID: \"b86b3b09-9475-44fe-a4ee-837815bb2c4b\") " pod="calico-system/calico-typha-68777b76f5-l67dm"
Sep 16 04:54:07.419577 kubelet[2761]: I0916 04:54:07.419551 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b84mr\" (UniqueName: \"kubernetes.io/projected/b86b3b09-9475-44fe-a4ee-837815bb2c4b-kube-api-access-b84mr\") pod \"calico-typha-68777b76f5-l67dm\" (UID: \"b86b3b09-9475-44fe-a4ee-837815bb2c4b\") " pod="calico-system/calico-typha-68777b76f5-l67dm"
Sep 16 04:54:07.602693 containerd[1564]: time="2025-09-16T04:54:07.602560993Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-68777b76f5-l67dm,Uid:b86b3b09-9475-44fe-a4ee-837815bb2c4b,Namespace:calico-system,Attempt:0,}"
Sep 16 04:54:07.668927 containerd[1564]: time="2025-09-16T04:54:07.668809810Z" level=info msg="connecting to shim 0c068d0c71adee4a929c74ffe5a7facf2a73d972a645b0fd690781d58acfb5b5" address="unix:///run/containerd/s/1f79fbf810d99c44e41e8fda4fb45572d318b0b076e26550ec872dd7144c8494" namespace=k8s.io protocol=ttrpc version=3
Sep 16 04:54:07.710074 systemd[1]: Started cri-containerd-0c068d0c71adee4a929c74ffe5a7facf2a73d972a645b0fd690781d58acfb5b5.scope - libcontainer container 0c068d0c71adee4a929c74ffe5a7facf2a73d972a645b0fd690781d58acfb5b5.
Sep 16 04:54:07.715471 systemd[1]: Created slice kubepods-besteffort-podced916e1_2331_4faf_aff0_26284d0b3fd2.slice - libcontainer container kubepods-besteffort-podced916e1_2331_4faf_aff0_26284d0b3fd2.slice.
Sep 16 04:54:07.822707 kubelet[2761]: I0916 04:54:07.822660 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-cni-net-dir\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822707 kubelet[2761]: I0916 04:54:07.822712 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ced916e1-2331-4faf-aff0-26284d0b3fd2-node-certs\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822873 kubelet[2761]: I0916 04:54:07.822727 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-cni-bin-dir\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822873 kubelet[2761]: I0916 04:54:07.822744 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-cni-log-dir\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822873 kubelet[2761]: I0916 04:54:07.822759 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-flexvol-driver-host\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822873 kubelet[2761]: I0916 04:54:07.822773 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-xtables-lock\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822873 kubelet[2761]: I0916 04:54:07.822785 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-var-run-calico\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822983 kubelet[2761]: I0916 04:54:07.822799 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-var-lib-calico\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822983 kubelet[2761]: I0916 04:54:07.822812 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqnxv\" (UniqueName: \"kubernetes.io/projected/ced916e1-2331-4faf-aff0-26284d0b3fd2-kube-api-access-gqnxv\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822983 kubelet[2761]: I0916 04:54:07.822827 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ced916e1-2331-4faf-aff0-26284d0b3fd2-tigera-ca-bundle\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822983 kubelet[2761]: I0916 04:54:07.822838 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-lib-modules\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.822983 kubelet[2761]: I0916 04:54:07.822851 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ced916e1-2331-4faf-aff0-26284d0b3fd2-policysync\") pod \"calico-node-59jns\" (UID: \"ced916e1-2331-4faf-aff0-26284d0b3fd2\") " pod="calico-system/calico-node-59jns"
Sep 16 04:54:07.876395 containerd[1564]: time="2025-09-16T04:54:07.876250709Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-68777b76f5-l67dm,Uid:b86b3b09-9475-44fe-a4ee-837815bb2c4b,Namespace:calico-system,Attempt:0,} returns sandbox id \"0c068d0c71adee4a929c74ffe5a7facf2a73d972a645b0fd690781d58acfb5b5\""
Sep 16 04:54:07.877856 containerd[1564]: time="2025-09-16T04:54:07.877803978Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
Sep 16 04:54:07.926276 kubelet[2761]: E0916 04:54:07.926179 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:07.926276 kubelet[2761]: W0916 04:54:07.926206 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:07.926276 kubelet[2761]: E0916 04:54:07.926225 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:07.933106 kubelet[2761]: E0916 04:54:07.933072 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:07.933175 kubelet[2761]: W0916 04:54:07.933101 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:07.933175 kubelet[2761]: E0916 04:54:07.933129 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:07.945583 kubelet[2761]: E0916 04:54:07.945512 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:07.945583 kubelet[2761]: W0916 04:54:07.945537 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:07.945583 kubelet[2761]: E0916 04:54:07.945557 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:07.978723 kubelet[2761]: E0916 04:54:07.978671 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zbfw2" podUID="d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c"
Sep 16 04:54:08.019464 containerd[1564]: time="2025-09-16T04:54:08.019411591Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-59jns,Uid:ced916e1-2331-4faf-aff0-26284d0b3fd2,Namespace:calico-system,Attempt:0,}"
Sep 16 04:54:08.024483 kubelet[2761]: E0916 04:54:08.024449 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.024483 kubelet[2761]: W0916 04:54:08.024472 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.024613 kubelet[2761]: E0916 04:54:08.024494 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.024725 kubelet[2761]: E0916 04:54:08.024708 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.024725 kubelet[2761]: W0916 04:54:08.024722 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.024788 kubelet[2761]: E0916 04:54:08.024733 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 16 04:54:08.024956 kubelet[2761]: E0916 04:54:08.024933 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.024956 kubelet[2761]: W0916 04:54:08.024947 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.024956 kubelet[2761]: E0916 04:54:08.024957 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.025101 kubelet[2761]: E0916 04:54:08.025094 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.025126 kubelet[2761]: W0916 04:54:08.025102 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.025126 kubelet[2761]: E0916 04:54:08.025111 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.025358 kubelet[2761]: E0916 04:54:08.025321 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.025358 kubelet[2761]: W0916 04:54:08.025344 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.025358 kubelet[2761]: E0916 04:54:08.025366 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.025664 kubelet[2761]: E0916 04:54:08.025601 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.025664 kubelet[2761]: W0916 04:54:08.025612 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.025664 kubelet[2761]: E0916 04:54:08.025625 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.025799 kubelet[2761]: E0916 04:54:08.025779 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.025799 kubelet[2761]: W0916 04:54:08.025791 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.026012 kubelet[2761]: E0916 04:54:08.025802 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.026012 kubelet[2761]: E0916 04:54:08.025979 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.026012 kubelet[2761]: W0916 04:54:08.025987 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.026012 kubelet[2761]: E0916 04:54:08.025996 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.026206 kubelet[2761]: E0916 04:54:08.026188 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.026206 kubelet[2761]: W0916 04:54:08.026201 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.026264 kubelet[2761]: E0916 04:54:08.026209 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.026393 kubelet[2761]: E0916 04:54:08.026376 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.026393 kubelet[2761]: W0916 04:54:08.026389 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.026455 kubelet[2761]: E0916 04:54:08.026398 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.026587 kubelet[2761]: E0916 04:54:08.026565 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.026587 kubelet[2761]: W0916 04:54:08.026578 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.026587 kubelet[2761]: E0916 04:54:08.026586 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.026802 kubelet[2761]: E0916 04:54:08.026771 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.026802 kubelet[2761]: W0916 04:54:08.026794 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.026802 kubelet[2761]: E0916 04:54:08.026802 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.027246 kubelet[2761]: E0916 04:54:08.027053 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.027246 kubelet[2761]: W0916 04:54:08.027063 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.027246 kubelet[2761]: E0916 04:54:08.027072 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.027454 kubelet[2761]: E0916 04:54:08.027402 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.027454 kubelet[2761]: W0916 04:54:08.027443 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.027454 kubelet[2761]: E0916 04:54:08.027451 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.028031 kubelet[2761]: E0916 04:54:08.027672 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.028031 kubelet[2761]: W0916 04:54:08.027681 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.028031 kubelet[2761]: E0916 04:54:08.027722 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.028031 kubelet[2761]: E0916 04:54:08.027929 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.028031 kubelet[2761]: W0916 04:54:08.027938 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.028031 kubelet[2761]: E0916 04:54:08.027947 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.028433 kubelet[2761]: E0916 04:54:08.028335 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.028433 kubelet[2761]: W0916 04:54:08.028352 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.028433 kubelet[2761]: E0916 04:54:08.028365 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.028583 kubelet[2761]: E0916 04:54:08.028561 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.028583 kubelet[2761]: W0916 04:54:08.028572 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.028583 kubelet[2761]: E0916 04:54:08.028582 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.028969 kubelet[2761]: E0916 04:54:08.028915 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.028969 kubelet[2761]: W0916 04:54:08.028930 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.028969 kubelet[2761]: E0916 04:54:08.028939 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.029209 kubelet[2761]: E0916 04:54:08.029181 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.029209 kubelet[2761]: W0916 04:54:08.029196 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.029209 kubelet[2761]: E0916 04:54:08.029205 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.095144 containerd[1564]: time="2025-09-16T04:54:08.094938849Z" level=info msg="connecting to shim 8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438" address="unix:///run/containerd/s/64abbf14477ae60e5427a6ce7c2aaf147d21c5ccdb4bc873b263a8887efa4611" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:08.125958 kubelet[2761]: E0916 04:54:08.125920 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.125958 kubelet[2761]: W0916 04:54:08.125948 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.125958 kubelet[2761]: E0916 04:54:08.125970 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.126148 kubelet[2761]: I0916 04:54:08.125999 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c-kubelet-dir\") pod \"csi-node-driver-zbfw2\" (UID: \"d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c\") " pod="calico-system/csi-node-driver-zbfw2" Sep 16 04:54:08.126380 kubelet[2761]: E0916 04:54:08.126347 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.126380 kubelet[2761]: W0916 04:54:08.126365 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.126843 kubelet[2761]: E0916 04:54:08.126654 2761 driver-call.go:262] Failed to unmarshal output for command: 
init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.126843 kubelet[2761]: W0916 04:54:08.126670 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.126843 kubelet[2761]: E0916 04:54:08.126681 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.127278 kubelet[2761]: E0916 04:54:08.127089 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.127278 kubelet[2761]: I0916 04:54:08.127115 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28sl2\" (UniqueName: \"kubernetes.io/projected/d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c-kube-api-access-28sl2\") pod \"csi-node-driver-zbfw2\" (UID: \"d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c\") " pod="calico-system/csi-node-driver-zbfw2" Sep 16 04:54:08.127358 kubelet[2761]: E0916 04:54:08.127344 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.127358 kubelet[2761]: W0916 04:54:08.127353 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.127405 kubelet[2761]: E0916 04:54:08.127363 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.128447 kubelet[2761]: E0916 04:54:08.128238 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.128447 kubelet[2761]: W0916 04:54:08.128254 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.128447 kubelet[2761]: E0916 04:54:08.128410 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.129358 kubelet[2761]: E0916 04:54:08.129343 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.129358 kubelet[2761]: W0916 04:54:08.129355 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.129435 kubelet[2761]: E0916 04:54:08.129367 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.129581 kubelet[2761]: E0916 04:54:08.129559 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.129581 kubelet[2761]: W0916 04:54:08.129576 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.129709 kubelet[2761]: E0916 04:54:08.129584 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.129709 kubelet[2761]: I0916 04:54:08.129629 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c-registration-dir\") pod \"csi-node-driver-zbfw2\" (UID: \"d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c\") " pod="calico-system/csi-node-driver-zbfw2" Sep 16 04:54:08.130272 kubelet[2761]: E0916 04:54:08.130192 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.130272 kubelet[2761]: W0916 04:54:08.130208 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.130340 kubelet[2761]: E0916 04:54:08.130298 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.131540 kubelet[2761]: I0916 04:54:08.130316 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c-socket-dir\") pod \"csi-node-driver-zbfw2\" (UID: \"d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c\") " pod="calico-system/csi-node-driver-zbfw2" Sep 16 04:54:08.131540 kubelet[2761]: E0916 04:54:08.130947 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.131540 kubelet[2761]: W0916 04:54:08.130955 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.131540 kubelet[2761]: E0916 04:54:08.130977 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.131540 kubelet[2761]: E0916 04:54:08.131227 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.131540 kubelet[2761]: W0916 04:54:08.131235 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.131540 kubelet[2761]: E0916 04:54:08.131250 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.131540 kubelet[2761]: E0916 04:54:08.131439 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.131540 kubelet[2761]: W0916 04:54:08.131452 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.131871 kubelet[2761]: E0916 04:54:08.131461 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.131871 kubelet[2761]: I0916 04:54:08.131475 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c-varrun\") pod \"csi-node-driver-zbfw2\" (UID: \"d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c\") " pod="calico-system/csi-node-driver-zbfw2" Sep 16 04:54:08.132931 kubelet[2761]: E0916 04:54:08.132123 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.132931 kubelet[2761]: W0916 04:54:08.132135 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.132931 kubelet[2761]: E0916 04:54:08.132154 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.133035 kubelet[2761]: E0916 04:54:08.133019 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.133035 kubelet[2761]: W0916 04:54:08.133027 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.133077 kubelet[2761]: E0916 04:54:08.133047 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.133497 kubelet[2761]: E0916 04:54:08.133312 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.133497 kubelet[2761]: W0916 04:54:08.133335 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.133497 kubelet[2761]: E0916 04:54:08.133360 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:08.133588 kubelet[2761]: E0916 04:54:08.133574 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.133588 kubelet[2761]: W0916 04:54:08.133583 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.133629 kubelet[2761]: E0916 04:54:08.133592 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:08.146054 systemd[1]: Started cri-containerd-8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438.scope - libcontainer container 8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438. Sep 16 04:54:08.189790 containerd[1564]: time="2025-09-16T04:54:08.189745567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-59jns,Uid:ced916e1-2331-4faf-aff0-26284d0b3fd2,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438\"" Sep 16 04:54:08.233668 kubelet[2761]: E0916 04:54:08.233611 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:08.233668 kubelet[2761]: W0916 04:54:08.233639 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:08.234384 kubelet[2761]: E0916 04:54:08.233698 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Sep 16 04:54:08.234384 kubelet[2761]: E0916 04:54:08.234057 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.234384 kubelet[2761]: W0916 04:54:08.234068 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.234384 kubelet[2761]: E0916 04:54:08.234114 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.234631 kubelet[2761]: E0916 04:54:08.234404 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.234631 kubelet[2761]: W0916 04:54:08.234415 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.234631 kubelet[2761]: E0916 04:54:08.234548 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.234726 kubelet[2761]: E0916 04:54:08.234714 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.234773 kubelet[2761]: W0916 04:54:08.234726 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.234773 kubelet[2761]: E0916 04:54:08.234744 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.235244 kubelet[2761]: E0916 04:54:08.234928 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.235244 kubelet[2761]: W0916 04:54:08.234936 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.235244 kubelet[2761]: E0916 04:54:08.234948 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.235244 kubelet[2761]: E0916 04:54:08.235215 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.235244 kubelet[2761]: W0916 04:54:08.235224 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.235244 kubelet[2761]: E0916 04:54:08.235236 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.235774 kubelet[2761]: E0916 04:54:08.235498 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.235774 kubelet[2761]: W0916 04:54:08.235507 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.235774 kubelet[2761]: E0916 04:54:08.235552 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.235884 kubelet[2761]: E0916 04:54:08.235848 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.235884 kubelet[2761]: W0916 04:54:08.235858 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.235991 kubelet[2761]: E0916 04:54:08.235930 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.236289 kubelet[2761]: E0916 04:54:08.236262 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.236289 kubelet[2761]: W0916 04:54:08.236280 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.236465 kubelet[2761]: E0916 04:54:08.236448 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.236613 kubelet[2761]: E0916 04:54:08.236582 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.236613 kubelet[2761]: W0916 04:54:08.236593 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.236983 kubelet[2761]: E0916 04:54:08.236822 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.236983 kubelet[2761]: E0916 04:54:08.236869 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.236983 kubelet[2761]: W0916 04:54:08.236879 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.236983 kubelet[2761]: E0916 04:54:08.236915 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.238561 kubelet[2761]: E0916 04:54:08.238038 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.238561 kubelet[2761]: W0916 04:54:08.238053 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.238561 kubelet[2761]: E0916 04:54:08.238070 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.238561 kubelet[2761]: E0916 04:54:08.238297 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.238561 kubelet[2761]: W0916 04:54:08.238306 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.238561 kubelet[2761]: E0916 04:54:08.238411 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.238561 kubelet[2761]: E0916 04:54:08.238551 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.238813 kubelet[2761]: W0916 04:54:08.238675 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.238981 kubelet[2761]: E0916 04:54:08.238957 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.238981 kubelet[2761]: W0916 04:54:08.238974 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.239138 kubelet[2761]: E0916 04:54:08.239097 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.239175 kubelet[2761]: E0916 04:54:08.239159 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.239940 kubelet[2761]: E0916 04:54:08.239223 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.239940 kubelet[2761]: W0916 04:54:08.239233 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.239940 kubelet[2761]: E0916 04:54:08.239253 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.239940 kubelet[2761]: E0916 04:54:08.239436 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.239940 kubelet[2761]: W0916 04:54:08.239445 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.239940 kubelet[2761]: E0916 04:54:08.239482 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.239940 kubelet[2761]: E0916 04:54:08.239721 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.239940 kubelet[2761]: W0916 04:54:08.239729 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.239940 kubelet[2761]: E0916 04:54:08.239759 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.240192 kubelet[2761]: E0916 04:54:08.239962 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.240192 kubelet[2761]: W0916 04:54:08.239970 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.240192 kubelet[2761]: E0916 04:54:08.239986 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.240192 kubelet[2761]: E0916 04:54:08.240157 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.240192 kubelet[2761]: W0916 04:54:08.240165 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.240192 kubelet[2761]: E0916 04:54:08.240189 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.240344 kubelet[2761]: E0916 04:54:08.240308 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.240370 kubelet[2761]: W0916 04:54:08.240351 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.240457 kubelet[2761]: E0916 04:54:08.240434 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.240588 kubelet[2761]: E0916 04:54:08.240564 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.240588 kubelet[2761]: W0916 04:54:08.240578 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.240668 kubelet[2761]: E0916 04:54:08.240620 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.240829 kubelet[2761]: E0916 04:54:08.240807 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.240829 kubelet[2761]: W0916 04:54:08.240822 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.240877 kubelet[2761]: E0916 04:54:08.240835 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.241141 kubelet[2761]: E0916 04:54:08.241118 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.241141 kubelet[2761]: W0916 04:54:08.241134 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.241204 kubelet[2761]: E0916 04:54:08.241143 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.241766 kubelet[2761]: E0916 04:54:08.241608 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.241766 kubelet[2761]: W0916 04:54:08.241621 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.241766 kubelet[2761]: E0916 04:54:08.241630 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:08.249224 kubelet[2761]: E0916 04:54:08.249192 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:08.249224 kubelet[2761]: W0916 04:54:08.249217 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:08.249224 kubelet[2761]: E0916 04:54:08.249230 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:09.611115 kubelet[2761]: E0916 04:54:09.611048 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zbfw2" podUID="d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c"
Sep 16 04:54:09.847502 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2099888173.mount: Deactivated successfully.
Sep 16 04:54:10.718908 containerd[1564]: time="2025-09-16T04:54:10.718835370Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:10.723873 containerd[1564]: time="2025-09-16T04:54:10.723842522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=35237389"
Sep 16 04:54:10.727536 containerd[1564]: time="2025-09-16T04:54:10.727489296Z" level=info msg="ImageCreate event name:\"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:10.729924 containerd[1564]: time="2025-09-16T04:54:10.729873364Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:10.730818 containerd[1564]: time="2025-09-16T04:54:10.730684773Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"35237243\" in 2.852680599s"
Sep 16 04:54:10.730818 containerd[1564]: time="2025-09-16T04:54:10.730716202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:1d7bb7b0cce2924d35c7c26f6b6600409ea7c9535074c3d2e517ffbb3a0e0b36\""
Sep 16 04:54:10.733028 containerd[1564]: time="2025-09-16T04:54:10.732985224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 16 04:54:10.763953 containerd[1564]: time="2025-09-16T04:54:10.763827956Z" level=info msg="CreateContainer within sandbox \"0c068d0c71adee4a929c74ffe5a7facf2a73d972a645b0fd690781d58acfb5b5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 16 04:54:10.834419 containerd[1564]: time="2025-09-16T04:54:10.832306579Z" level=info msg="Container 43dfdf14d3aa066bf4e7cb364aa5e61d55d10c3ced0b451197ec06cdd27c8b67: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:54:10.907547 containerd[1564]: time="2025-09-16T04:54:10.907478466Z" level=info msg="CreateContainer within sandbox \"0c068d0c71adee4a929c74ffe5a7facf2a73d972a645b0fd690781d58acfb5b5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"43dfdf14d3aa066bf4e7cb364aa5e61d55d10c3ced0b451197ec06cdd27c8b67\""
Sep 16 04:54:10.908245 containerd[1564]: time="2025-09-16T04:54:10.908189728Z" level=info msg="StartContainer for \"43dfdf14d3aa066bf4e7cb364aa5e61d55d10c3ced0b451197ec06cdd27c8b67\""
Sep 16 04:54:10.920130 containerd[1564]: time="2025-09-16T04:54:10.920095097Z" level=info msg="connecting to shim 43dfdf14d3aa066bf4e7cb364aa5e61d55d10c3ced0b451197ec06cdd27c8b67" address="unix:///run/containerd/s/1f79fbf810d99c44e41e8fda4fb45572d318b0b076e26550ec872dd7144c8494" protocol=ttrpc version=3
Sep 16 04:54:10.940127 systemd[1]: Started cri-containerd-43dfdf14d3aa066bf4e7cb364aa5e61d55d10c3ced0b451197ec06cdd27c8b67.scope - libcontainer container 43dfdf14d3aa066bf4e7cb364aa5e61d55d10c3ced0b451197ec06cdd27c8b67.
Sep 16 04:54:11.006716 containerd[1564]: time="2025-09-16T04:54:11.006669487Z" level=info msg="StartContainer for \"43dfdf14d3aa066bf4e7cb364aa5e61d55d10c3ced0b451197ec06cdd27c8b67\" returns successfully"
Sep 16 04:54:11.611620 kubelet[2761]: E0916 04:54:11.611555 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zbfw2" podUID="d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c"
Sep 16 04:54:11.728398 kubelet[2761]: I0916 04:54:11.726566 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-68777b76f5-l67dm" podStartSLOduration=1.871380498 podStartE2EDuration="4.726550683s" podCreationTimestamp="2025-09-16 04:54:07 +0000 UTC" firstStartedPulling="2025-09-16 04:54:07.877590518 +0000 UTC m=+19.401841149" lastFinishedPulling="2025-09-16 04:54:10.732760684 +0000 UTC m=+22.257011334" observedRunningTime="2025-09-16 04:54:11.725138336 +0000 UTC m=+23.249388976" watchObservedRunningTime="2025-09-16 04:54:11.726550683 +0000 UTC m=+23.250801313"
Sep 16 04:54:11.762704 kubelet[2761]: E0916 04:54:11.762658 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.762704 kubelet[2761]: W0916 04:54:11.762687 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.762704 kubelet[2761]: E0916 04:54:11.762709 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.762912 kubelet[2761]: E0916 04:54:11.762878 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.762912 kubelet[2761]: W0916 04:54:11.762887 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.762991 kubelet[2761]: E0916 04:54:11.762923 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.763051 kubelet[2761]: E0916 04:54:11.763041 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.763051 kubelet[2761]: W0916 04:54:11.763049 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.763108 kubelet[2761]: E0916 04:54:11.763057 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.763212 kubelet[2761]: E0916 04:54:11.763186 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.763212 kubelet[2761]: W0916 04:54:11.763205 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.763287 kubelet[2761]: E0916 04:54:11.763216 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.763370 kubelet[2761]: E0916 04:54:11.763358 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.763370 kubelet[2761]: W0916 04:54:11.763369 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.763430 kubelet[2761]: E0916 04:54:11.763378 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.763507 kubelet[2761]: E0916 04:54:11.763494 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.763507 kubelet[2761]: W0916 04:54:11.763505 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.763559 kubelet[2761]: E0916 04:54:11.763512 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.763647 kubelet[2761]: E0916 04:54:11.763635 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.763647 kubelet[2761]: W0916 04:54:11.763646 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.763707 kubelet[2761]: E0916 04:54:11.763654 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.763776 kubelet[2761]: E0916 04:54:11.763763 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.763776 kubelet[2761]: W0916 04:54:11.763775 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.763826 kubelet[2761]: E0916 04:54:11.763783 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.763937 kubelet[2761]: E0916 04:54:11.763918 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.763937 kubelet[2761]: W0916 04:54:11.763930 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.763937 kubelet[2761]: E0916 04:54:11.763938 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.764073 kubelet[2761]: E0916 04:54:11.764053 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.764073 kubelet[2761]: W0916 04:54:11.764067 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.764073 kubelet[2761]: E0916 04:54:11.764074 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.764191 kubelet[2761]: E0916 04:54:11.764180 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.764191 kubelet[2761]: W0916 04:54:11.764190 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.764247 kubelet[2761]: E0916 04:54:11.764198 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.764311 kubelet[2761]: E0916 04:54:11.764300 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.764311 kubelet[2761]: W0916 04:54:11.764310 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.764362 kubelet[2761]: E0916 04:54:11.764317 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.764477 kubelet[2761]: E0916 04:54:11.764452 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.764477 kubelet[2761]: W0916 04:54:11.764470 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.764975 kubelet[2761]: E0916 04:54:11.764481 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.764975 kubelet[2761]: E0916 04:54:11.764620 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.764975 kubelet[2761]: W0916 04:54:11.764629 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.764975 kubelet[2761]: E0916 04:54:11.764637 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.764975 kubelet[2761]: E0916 04:54:11.764768 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.764975 kubelet[2761]: W0916 04:54:11.764776 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.764975 kubelet[2761]: E0916 04:54:11.764783 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.770293 kubelet[2761]: E0916 04:54:11.770190 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.770293 kubelet[2761]: W0916 04:54:11.770203 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.770293 kubelet[2761]: E0916 04:54:11.770212 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.770509 kubelet[2761]: E0916 04:54:11.770387 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.770509 kubelet[2761]: W0916 04:54:11.770491 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.770509 kubelet[2761]: E0916 04:54:11.770505 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.770700 kubelet[2761]: E0916 04:54:11.770653 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.770700 kubelet[2761]: W0916 04:54:11.770665 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.770700 kubelet[2761]: E0916 04:54:11.770673 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.770911 kubelet[2761]: E0916 04:54:11.770841 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.770911 kubelet[2761]: W0916 04:54:11.770858 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.770911 kubelet[2761]: E0916 04:54:11.770873 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.771046 kubelet[2761]: E0916 04:54:11.771009 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.771046 kubelet[2761]: W0916 04:54:11.771016 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.771046 kubelet[2761]: E0916 04:54:11.771031 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.771179 kubelet[2761]: E0916 04:54:11.771138 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.771179 kubelet[2761]: W0916 04:54:11.771145 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.771179 kubelet[2761]: E0916 04:54:11.771166 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.771368 kubelet[2761]: E0916 04:54:11.771318 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.771368 kubelet[2761]: W0916 04:54:11.771336 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.771368 kubelet[2761]: E0916 04:54:11.771357 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 16 04:54:11.771769 kubelet[2761]: E0916 04:54:11.771728 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 16 04:54:11.771769 kubelet[2761]: W0916 04:54:11.771742 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 16 04:54:11.771769 kubelet[2761]: E0916 04:54:11.771756 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Sep 16 04:54:11.772016 kubelet[2761]: E0916 04:54:11.771865 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.772016 kubelet[2761]: W0916 04:54:11.771886 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.772016 kubelet[2761]: E0916 04:54:11.771917 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:11.772276 kubelet[2761]: E0916 04:54:11.772093 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.772276 kubelet[2761]: W0916 04:54:11.772103 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.772276 kubelet[2761]: E0916 04:54:11.772118 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:11.772418 kubelet[2761]: E0916 04:54:11.772406 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.772508 kubelet[2761]: W0916 04:54:11.772465 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.772508 kubelet[2761]: E0916 04:54:11.772484 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:11.772675 kubelet[2761]: E0916 04:54:11.772630 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.772675 kubelet[2761]: W0916 04:54:11.772638 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.772675 kubelet[2761]: E0916 04:54:11.772652 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:11.772801 kubelet[2761]: E0916 04:54:11.772789 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.772801 kubelet[2761]: W0916 04:54:11.772796 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.772959 kubelet[2761]: E0916 04:54:11.772939 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:11.773131 kubelet[2761]: E0916 04:54:11.773120 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.773131 kubelet[2761]: W0916 04:54:11.773130 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.773362 kubelet[2761]: E0916 04:54:11.773218 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:11.773362 kubelet[2761]: E0916 04:54:11.773263 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.773362 kubelet[2761]: W0916 04:54:11.773273 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.773362 kubelet[2761]: E0916 04:54:11.773303 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:11.773602 kubelet[2761]: E0916 04:54:11.773568 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.773602 kubelet[2761]: W0916 04:54:11.773583 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.773674 kubelet[2761]: E0916 04:54:11.773637 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:11.774601 kubelet[2761]: E0916 04:54:11.773840 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.774601 kubelet[2761]: W0916 04:54:11.773851 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.774601 kubelet[2761]: E0916 04:54:11.773860 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:11.774601 kubelet[2761]: E0916 04:54:11.774198 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:11.774601 kubelet[2761]: W0916 04:54:11.774208 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:11.774601 kubelet[2761]: E0916 04:54:11.774217 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.687941 containerd[1564]: time="2025-09-16T04:54:12.687873700Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:12.694767 containerd[1564]: time="2025-09-16T04:54:12.694736249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4446660" Sep 16 04:54:12.699731 containerd[1564]: time="2025-09-16T04:54:12.698947772Z" level=info msg="ImageCreate event name:\"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:12.702956 containerd[1564]: time="2025-09-16T04:54:12.702932339Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:12.708269 containerd[1564]: time="2025-09-16T04:54:12.708246597Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5939323\" in 1.975224846s" Sep 16 04:54:12.708416 containerd[1564]: time="2025-09-16T04:54:12.708340083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:4f2b088ed6fdfc6a97ac0650a4ba8171107d6656ce265c592e4c8423fd10e5c4\"" Sep 16 04:54:12.711546 containerd[1564]: time="2025-09-16T04:54:12.711476430Z" level=info msg="CreateContainer within sandbox \"8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 16 04:54:12.729342 containerd[1564]: time="2025-09-16T04:54:12.728043666Z" level=info msg="Container fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:12.746547 containerd[1564]: time="2025-09-16T04:54:12.746503238Z" level=info msg="CreateContainer within sandbox \"8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d\"" Sep 16 04:54:12.747798 containerd[1564]: time="2025-09-16T04:54:12.747755074Z" level=info msg="StartContainer for \"fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d\"" Sep 16 04:54:12.749611 containerd[1564]: time="2025-09-16T04:54:12.749555869Z" level=info msg="connecting to shim fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d" address="unix:///run/containerd/s/64abbf14477ae60e5427a6ce7c2aaf147d21c5ccdb4bc873b263a8887efa4611" protocol=ttrpc version=3 Sep 16 04:54:12.769040 systemd[1]: Started cri-containerd-fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d.scope - libcontainer container fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d. Sep 16 04:54:12.772417 kubelet[2761]: E0916 04:54:12.772387 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.772845 kubelet[2761]: W0916 04:54:12.772603 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.772845 kubelet[2761]: E0916 04:54:12.772627 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.773450 kubelet[2761]: E0916 04:54:12.773299 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.773450 kubelet[2761]: W0916 04:54:12.773312 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.773450 kubelet[2761]: E0916 04:54:12.773324 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.773805 kubelet[2761]: E0916 04:54:12.773760 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.773805 kubelet[2761]: W0916 04:54:12.773771 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.773805 kubelet[2761]: E0916 04:54:12.773784 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.774338 kubelet[2761]: E0916 04:54:12.774326 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.774574 kubelet[2761]: W0916 04:54:12.774405 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.774574 kubelet[2761]: E0916 04:54:12.774418 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.774974 kubelet[2761]: E0916 04:54:12.774793 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.774974 kubelet[2761]: W0916 04:54:12.774804 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.774974 kubelet[2761]: E0916 04:54:12.774812 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.775260 kubelet[2761]: E0916 04:54:12.775210 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.775260 kubelet[2761]: W0916 04:54:12.775221 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.775260 kubelet[2761]: E0916 04:54:12.775229 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.775753 kubelet[2761]: E0916 04:54:12.775610 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.775753 kubelet[2761]: W0916 04:54:12.775621 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.775753 kubelet[2761]: E0916 04:54:12.775630 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.776098 kubelet[2761]: E0916 04:54:12.776052 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.776098 kubelet[2761]: W0916 04:54:12.776063 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.776098 kubelet[2761]: E0916 04:54:12.776071 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.776707 kubelet[2761]: E0916 04:54:12.776575 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.776707 kubelet[2761]: W0916 04:54:12.776615 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.776974 kubelet[2761]: E0916 04:54:12.776822 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.777400 kubelet[2761]: E0916 04:54:12.777263 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.777400 kubelet[2761]: W0916 04:54:12.777273 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.777400 kubelet[2761]: E0916 04:54:12.777283 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.777685 kubelet[2761]: E0916 04:54:12.777636 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.777685 kubelet[2761]: W0916 04:54:12.777649 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.777685 kubelet[2761]: E0916 04:54:12.777658 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.778098 kubelet[2761]: E0916 04:54:12.778046 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.778098 kubelet[2761]: W0916 04:54:12.778058 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.778098 kubelet[2761]: E0916 04:54:12.778066 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.778631 kubelet[2761]: E0916 04:54:12.778464 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.778631 kubelet[2761]: W0916 04:54:12.778537 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.778631 kubelet[2761]: E0916 04:54:12.778548 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:54:12.778913 kubelet[2761]: E0916 04:54:12.778856 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.778981 kubelet[2761]: W0916 04:54:12.778970 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.779072 kubelet[2761]: E0916 04:54:12.779061 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.779517 kubelet[2761]: E0916 04:54:12.779373 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:54:12.779517 kubelet[2761]: W0916 04:54:12.779384 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:54:12.779517 kubelet[2761]: E0916 04:54:12.779393 2761 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:54:12.824092 systemd[1]: cri-containerd-fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d.scope: Deactivated successfully. Sep 16 04:54:12.824974 systemd[1]: cri-containerd-fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d.scope: Consumed 27ms CPU time, 6M memory peak, 4.6M written to disk. 
Sep 16 04:54:12.856509 containerd[1564]: time="2025-09-16T04:54:12.856465009Z" level=info msg="received exit event container_id:\"fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d\" id:\"fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d\" pid:3452 exited_at:{seconds:1757998452 nanos:826383629}" Sep 16 04:54:12.860012 containerd[1564]: time="2025-09-16T04:54:12.859639257Z" level=info msg="StartContainer for \"fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d\" returns successfully" Sep 16 04:54:12.880755 containerd[1564]: time="2025-09-16T04:54:12.880720241Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d\" id:\"fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d\" pid:3452 exited_at:{seconds:1757998452 nanos:826383629}" Sep 16 04:54:12.896411 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fe1ec56286c99f48c0e1ac850c2ff312e213a672948b3f094d6dd48ede9e479d-rootfs.mount: Deactivated successfully. 
Sep 16 04:54:13.610656 kubelet[2761]: E0916 04:54:13.610607 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zbfw2" podUID="d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c" Sep 16 04:54:13.716794 containerd[1564]: time="2025-09-16T04:54:13.716592615Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 16 04:54:15.610768 kubelet[2761]: E0916 04:54:15.610667 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zbfw2" podUID="d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c" Sep 16 04:54:17.610870 kubelet[2761]: E0916 04:54:17.610835 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zbfw2" podUID="d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c" Sep 16 04:54:17.939526 containerd[1564]: time="2025-09-16T04:54:17.939196972Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:17.941178 containerd[1564]: time="2025-09-16T04:54:17.941138512Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=70440613" Sep 16 04:54:17.943757 containerd[1564]: time="2025-09-16T04:54:17.943646302Z" level=info msg="ImageCreate event name:\"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:17.949020 containerd[1564]: 
time="2025-09-16T04:54:17.948867268Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:17.950647 containerd[1564]: time="2025-09-16T04:54:17.950476364Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"71933316\" in 4.233845126s" Sep 16 04:54:17.950647 containerd[1564]: time="2025-09-16T04:54:17.950552897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:034822460c2f667e1f4a7679c843cc35ce1bf2c25dec86f04e07fb403df7e458\"" Sep 16 04:54:17.955308 containerd[1564]: time="2025-09-16T04:54:17.955237618Z" level=info msg="CreateContainer within sandbox \"8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 16 04:54:17.978933 containerd[1564]: time="2025-09-16T04:54:17.977118197Z" level=info msg="Container a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:17.985162 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3167736815.mount: Deactivated successfully. 
Sep 16 04:54:17.995278 containerd[1564]: time="2025-09-16T04:54:17.995218111Z" level=info msg="CreateContainer within sandbox \"8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836\"" Sep 16 04:54:17.997050 containerd[1564]: time="2025-09-16T04:54:17.996067222Z" level=info msg="StartContainer for \"a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836\"" Sep 16 04:54:17.998587 containerd[1564]: time="2025-09-16T04:54:17.998543975Z" level=info msg="connecting to shim a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836" address="unix:///run/containerd/s/64abbf14477ae60e5427a6ce7c2aaf147d21c5ccdb4bc873b263a8887efa4611" protocol=ttrpc version=3 Sep 16 04:54:18.036144 systemd[1]: Started cri-containerd-a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836.scope - libcontainer container a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836. Sep 16 04:54:18.140019 containerd[1564]: time="2025-09-16T04:54:18.139977986Z" level=info msg="StartContainer for \"a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836\" returns successfully" Sep 16 04:54:18.699551 systemd[1]: cri-containerd-a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836.scope: Deactivated successfully. Sep 16 04:54:18.699936 systemd[1]: cri-containerd-a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836.scope: Consumed 459ms CPU time, 164.1M memory peak, 6.2M read from disk, 171.3M written to disk. 
Sep 16 04:54:18.801179 containerd[1564]: time="2025-09-16T04:54:18.799741859Z" level=info msg="received exit event container_id:\"a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836\" id:\"a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836\" pid:3526 exited_at:{seconds:1757998458 nanos:798985180}" Sep 16 04:54:18.804555 containerd[1564]: time="2025-09-16T04:54:18.803889913Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836\" id:\"a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836\" pid:3526 exited_at:{seconds:1757998458 nanos:798985180}" Sep 16 04:54:18.864877 kubelet[2761]: I0916 04:54:18.864840 2761 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 16 04:54:18.919130 kubelet[2761]: I0916 04:54:18.917399 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655-config-volume\") pod \"coredns-7c65d6cfc9-hhpj6\" (UID: \"d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655\") " pod="kube-system/coredns-7c65d6cfc9-hhpj6" Sep 16 04:54:18.919130 kubelet[2761]: I0916 04:54:18.917681 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5w6q\" (UniqueName: \"kubernetes.io/projected/d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655-kube-api-access-q5w6q\") pod \"coredns-7c65d6cfc9-hhpj6\" (UID: \"d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655\") " pod="kube-system/coredns-7c65d6cfc9-hhpj6" Sep 16 04:54:18.920448 systemd[1]: Created slice kubepods-burstable-podd5bfb5d1_08f7_48ee_b5e1_6e0ba3bf8655.slice - libcontainer container kubepods-burstable-podd5bfb5d1_08f7_48ee_b5e1_6e0ba3bf8655.slice. 
Sep 16 04:54:18.959023 systemd[1]: Created slice kubepods-besteffort-poda775d4c8_1673_4091_971b_6305d05b964b.slice - libcontainer container kubepods-besteffort-poda775d4c8_1673_4091_971b_6305d05b964b.slice. Sep 16 04:54:18.975194 systemd[1]: Created slice kubepods-besteffort-pod3656d4e7_adec_4a47_bcd1_ad26e49ac2c5.slice - libcontainer container kubepods-besteffort-pod3656d4e7_adec_4a47_bcd1_ad26e49ac2c5.slice. Sep 16 04:54:18.987566 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a8a4185b3d8dbb8f23138b028ef5022954fccfdee5de6483f905376c6b936836-rootfs.mount: Deactivated successfully. Sep 16 04:54:18.991478 systemd[1]: Created slice kubepods-burstable-pod369182db_6487_436b_a33d_08f7db2acf90.slice - libcontainer container kubepods-burstable-pod369182db_6487_436b_a33d_08f7db2acf90.slice. Sep 16 04:54:18.998532 systemd[1]: Created slice kubepods-besteffort-podd6a95398_d9c4_4910_8a9c_46884be92ce4.slice - libcontainer container kubepods-besteffort-podd6a95398_d9c4_4910_8a9c_46884be92ce4.slice. Sep 16 04:54:19.005063 systemd[1]: Created slice kubepods-besteffort-pod3fbf2254_8fa7_4368_8a76_518e4c98f2ee.slice - libcontainer container kubepods-besteffort-pod3fbf2254_8fa7_4368_8a76_518e4c98f2ee.slice. Sep 16 04:54:19.011400 systemd[1]: Created slice kubepods-besteffort-pod22e7ad62_06be_460a_9992_748a94f58366.slice - libcontainer container kubepods-besteffort-pod22e7ad62_06be_460a_9992_748a94f58366.slice. 
Sep 16 04:54:19.018307 kubelet[2761]: I0916 04:54:19.018287 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d6a95398-d9c4-4910-8a9c-46884be92ce4-calico-apiserver-certs\") pod \"calico-apiserver-58cb7b665-w4dc5\" (UID: \"d6a95398-d9c4-4910-8a9c-46884be92ce4\") " pod="calico-apiserver/calico-apiserver-58cb7b665-w4dc5"
Sep 16 04:54:19.019231 kubelet[2761]: I0916 04:54:19.018699 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsw8\" (UniqueName: \"kubernetes.io/projected/d6a95398-d9c4-4910-8a9c-46884be92ce4-kube-api-access-ljsw8\") pod \"calico-apiserver-58cb7b665-w4dc5\" (UID: \"d6a95398-d9c4-4910-8a9c-46884be92ce4\") " pod="calico-apiserver/calico-apiserver-58cb7b665-w4dc5"
Sep 16 04:54:19.019231 kubelet[2761]: I0916 04:54:19.018728 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/22e7ad62-06be-460a-9992-748a94f58366-calico-apiserver-certs\") pod \"calico-apiserver-58cb7b665-bgpf8\" (UID: \"22e7ad62-06be-460a-9992-748a94f58366\") " pod="calico-apiserver/calico-apiserver-58cb7b665-bgpf8"
Sep 16 04:54:19.119617 kubelet[2761]: I0916 04:54:19.119560 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqwvb\" (UniqueName: \"kubernetes.io/projected/3656d4e7-adec-4a47-bcd1-ad26e49ac2c5-kube-api-access-lqwvb\") pod \"goldmane-7988f88666-rrt7l\" (UID: \"3656d4e7-adec-4a47-bcd1-ad26e49ac2c5\") " pod="calico-system/goldmane-7988f88666-rrt7l"
Sep 16 04:54:19.119617 kubelet[2761]: I0916 04:54:19.119607 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-backend-key-pair\") pod \"whisker-9bbfdf44c-mjnl4\" (UID: \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\") " pod="calico-system/whisker-9bbfdf44c-mjnl4"
Sep 16 04:54:19.119780 kubelet[2761]: I0916 04:54:19.119628 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwssr\" (UniqueName: \"kubernetes.io/projected/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-kube-api-access-zwssr\") pod \"whisker-9bbfdf44c-mjnl4\" (UID: \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\") " pod="calico-system/whisker-9bbfdf44c-mjnl4"
Sep 16 04:54:19.119780 kubelet[2761]: I0916 04:54:19.119647 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a775d4c8-1673-4091-971b-6305d05b964b-tigera-ca-bundle\") pod \"calico-kube-controllers-54dcbc64bf-rskqr\" (UID: \"a775d4c8-1673-4091-971b-6305d05b964b\") " pod="calico-system/calico-kube-controllers-54dcbc64bf-rskqr"
Sep 16 04:54:19.119780 kubelet[2761]: I0916 04:54:19.119687 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3656d4e7-adec-4a47-bcd1-ad26e49ac2c5-goldmane-key-pair\") pod \"goldmane-7988f88666-rrt7l\" (UID: \"3656d4e7-adec-4a47-bcd1-ad26e49ac2c5\") " pod="calico-system/goldmane-7988f88666-rrt7l"
Sep 16 04:54:19.119780 kubelet[2761]: I0916 04:54:19.119708 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3656d4e7-adec-4a47-bcd1-ad26e49ac2c5-config\") pod \"goldmane-7988f88666-rrt7l\" (UID: \"3656d4e7-adec-4a47-bcd1-ad26e49ac2c5\") " pod="calico-system/goldmane-7988f88666-rrt7l"
Sep 16 04:54:19.119780 kubelet[2761]: I0916 04:54:19.119741 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhldc\" (UniqueName: \"kubernetes.io/projected/369182db-6487-436b-a33d-08f7db2acf90-kube-api-access-xhldc\") pod \"coredns-7c65d6cfc9-6qnvb\" (UID: \"369182db-6487-436b-a33d-08f7db2acf90\") " pod="kube-system/coredns-7c65d6cfc9-6qnvb"
Sep 16 04:54:19.119921 kubelet[2761]: I0916 04:54:19.119762 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3656d4e7-adec-4a47-bcd1-ad26e49ac2c5-goldmane-ca-bundle\") pod \"goldmane-7988f88666-rrt7l\" (UID: \"3656d4e7-adec-4a47-bcd1-ad26e49ac2c5\") " pod="calico-system/goldmane-7988f88666-rrt7l"
Sep 16 04:54:19.119921 kubelet[2761]: I0916 04:54:19.119783 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwpsl\" (UniqueName: \"kubernetes.io/projected/a775d4c8-1673-4091-971b-6305d05b964b-kube-api-access-lwpsl\") pod \"calico-kube-controllers-54dcbc64bf-rskqr\" (UID: \"a775d4c8-1673-4091-971b-6305d05b964b\") " pod="calico-system/calico-kube-controllers-54dcbc64bf-rskqr"
Sep 16 04:54:19.119921 kubelet[2761]: I0916 04:54:19.119803 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-ca-bundle\") pod \"whisker-9bbfdf44c-mjnl4\" (UID: \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\") " pod="calico-system/whisker-9bbfdf44c-mjnl4"
Sep 16 04:54:19.119921 kubelet[2761]: I0916 04:54:19.119827 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzxc8\" (UniqueName: \"kubernetes.io/projected/22e7ad62-06be-460a-9992-748a94f58366-kube-api-access-pzxc8\") pod \"calico-apiserver-58cb7b665-bgpf8\" (UID: \"22e7ad62-06be-460a-9992-748a94f58366\") " pod="calico-apiserver/calico-apiserver-58cb7b665-bgpf8"
Sep 16 04:54:19.119921 kubelet[2761]: I0916 04:54:19.119847 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/369182db-6487-436b-a33d-08f7db2acf90-config-volume\") pod \"coredns-7c65d6cfc9-6qnvb\" (UID: \"369182db-6487-436b-a33d-08f7db2acf90\") " pod="kube-system/coredns-7c65d6cfc9-6qnvb"
Sep 16 04:54:19.245616 containerd[1564]: time="2025-09-16T04:54:19.245500781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hhpj6,Uid:d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655,Namespace:kube-system,Attempt:0,}"
Sep 16 04:54:19.272909 containerd[1564]: time="2025-09-16T04:54:19.272859101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54dcbc64bf-rskqr,Uid:a775d4c8-1673-4091-971b-6305d05b964b,Namespace:calico-system,Attempt:0,}"
Sep 16 04:54:19.297764 containerd[1564]: time="2025-09-16T04:54:19.297170930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6qnvb,Uid:369182db-6487-436b-a33d-08f7db2acf90,Namespace:kube-system,Attempt:0,}"
Sep 16 04:54:19.298813 containerd[1564]: time="2025-09-16T04:54:19.298782620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-rrt7l,Uid:3656d4e7-adec-4a47-bcd1-ad26e49ac2c5,Namespace:calico-system,Attempt:0,}"
Sep 16 04:54:19.300680 containerd[1564]: time="2025-09-16T04:54:19.300658296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-w4dc5,Uid:d6a95398-d9c4-4910-8a9c-46884be92ce4,Namespace:calico-apiserver,Attempt:0,}"
Sep 16 04:54:19.322197 containerd[1564]: time="2025-09-16T04:54:19.322169326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-bgpf8,Uid:22e7ad62-06be-460a-9992-748a94f58366,Namespace:calico-apiserver,Attempt:0,}"
Sep 16 04:54:19.331587 containerd[1564]: time="2025-09-16T04:54:19.331545460Z" level=info msg="RunPodSandbox for
&PodSandboxMetadata{Name:whisker-9bbfdf44c-mjnl4,Uid:3fbf2254-8fa7-4368-8a76-518e4c98f2ee,Namespace:calico-system,Attempt:0,}"
Sep 16 04:54:19.482505 containerd[1564]: time="2025-09-16T04:54:19.482456652Z" level=error msg="Failed to destroy network for sandbox \"054a4a0ea55db3cf58207ff8a4450472146f93df3efdb3bf323a605c24b2a962\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.485042 containerd[1564]: time="2025-09-16T04:54:19.484987245Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54dcbc64bf-rskqr,Uid:a775d4c8-1673-4091-971b-6305d05b964b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"054a4a0ea55db3cf58207ff8a4450472146f93df3efdb3bf323a605c24b2a962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.485584 kubelet[2761]: E0916 04:54:19.485529 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"054a4a0ea55db3cf58207ff8a4450472146f93df3efdb3bf323a605c24b2a962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.485837 kubelet[2761]: E0916 04:54:19.485691 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"054a4a0ea55db3cf58207ff8a4450472146f93df3efdb3bf323a605c24b2a962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54dcbc64bf-rskqr"
Sep 16 04:54:19.485837 kubelet[2761]: E0916 04:54:19.485767 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"054a4a0ea55db3cf58207ff8a4450472146f93df3efdb3bf323a605c24b2a962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54dcbc64bf-rskqr"
Sep 16 04:54:19.486001 kubelet[2761]: E0916 04:54:19.485810 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-54dcbc64bf-rskqr_calico-system(a775d4c8-1673-4091-971b-6305d05b964b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-54dcbc64bf-rskqr_calico-system(a775d4c8-1673-4091-971b-6305d05b964b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"054a4a0ea55db3cf58207ff8a4450472146f93df3efdb3bf323a605c24b2a962\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-54dcbc64bf-rskqr" podUID="a775d4c8-1673-4091-971b-6305d05b964b"
Sep 16 04:54:19.498544 containerd[1564]: time="2025-09-16T04:54:19.498164614Z" level=error msg="Failed to destroy network for sandbox \"5402fa74937bdceb2b342a610c9b2e7d01f30eaf8c8e1e0f1f2ccce6ad3ac471\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.504924 containerd[1564]: time="2025-09-16T04:54:19.504830157Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6qnvb,Uid:369182db-6487-436b-a33d-08f7db2acf90,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5402fa74937bdceb2b342a610c9b2e7d01f30eaf8c8e1e0f1f2ccce6ad3ac471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.505172 kubelet[2761]: E0916 04:54:19.505140 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5402fa74937bdceb2b342a610c9b2e7d01f30eaf8c8e1e0f1f2ccce6ad3ac471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.505248 kubelet[2761]: E0916 04:54:19.505192 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5402fa74937bdceb2b342a610c9b2e7d01f30eaf8c8e1e0f1f2ccce6ad3ac471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-6qnvb"
Sep 16 04:54:19.505248 kubelet[2761]: E0916 04:54:19.505209 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5402fa74937bdceb2b342a610c9b2e7d01f30eaf8c8e1e0f1f2ccce6ad3ac471\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-6qnvb"
Sep 16 04:54:19.505303 kubelet[2761]: E0916 04:54:19.505247 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-6qnvb_kube-system(369182db-6487-436b-a33d-08f7db2acf90)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-6qnvb_kube-system(369182db-6487-436b-a33d-08f7db2acf90)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5402fa74937bdceb2b342a610c9b2e7d01f30eaf8c8e1e0f1f2ccce6ad3ac471\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-6qnvb" podUID="369182db-6487-436b-a33d-08f7db2acf90"
Sep 16 04:54:19.517055 containerd[1564]: time="2025-09-16T04:54:19.516963069Z" level=error msg="Failed to destroy network for sandbox \"bad0e55b9fd676fe5c6e6bb773fdd11cbbd0a231817b7ec4de2b74794d1118d2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.521575 containerd[1564]: time="2025-09-16T04:54:19.521441033Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-bgpf8,Uid:22e7ad62-06be-460a-9992-748a94f58366,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bad0e55b9fd676fe5c6e6bb773fdd11cbbd0a231817b7ec4de2b74794d1118d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.523290 kubelet[2761]: E0916 04:54:19.522255 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bad0e55b9fd676fe5c6e6bb773fdd11cbbd0a231817b7ec4de2b74794d1118d2\": plugin type=\"calico\" failed (add): stat
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.523290 kubelet[2761]: E0916 04:54:19.522303 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bad0e55b9fd676fe5c6e6bb773fdd11cbbd0a231817b7ec4de2b74794d1118d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58cb7b665-bgpf8"
Sep 16 04:54:19.523290 kubelet[2761]: E0916 04:54:19.522320 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bad0e55b9fd676fe5c6e6bb773fdd11cbbd0a231817b7ec4de2b74794d1118d2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58cb7b665-bgpf8"
Sep 16 04:54:19.524230 kubelet[2761]: E0916 04:54:19.522350 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58cb7b665-bgpf8_calico-apiserver(22e7ad62-06be-460a-9992-748a94f58366)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58cb7b665-bgpf8_calico-apiserver(22e7ad62-06be-460a-9992-748a94f58366)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bad0e55b9fd676fe5c6e6bb773fdd11cbbd0a231817b7ec4de2b74794d1118d2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-58cb7b665-bgpf8" podUID="22e7ad62-06be-460a-9992-748a94f58366"
Sep 16 04:54:19.524723 containerd[1564]: time="2025-09-16T04:54:19.524584304Z" level=error msg="Failed to destroy network for sandbox \"8a997249951cd871481334af4587597bef42b2e8e2bb4de8e22d4e48abbad3fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.528167 containerd[1564]: time="2025-09-16T04:54:19.528137785Z" level=error msg="Failed to destroy network for sandbox \"2704c11cec042bab2a2972b07103029fc8cd2a5f85b13db329cee1e8d5f36e6d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.528497 containerd[1564]: time="2025-09-16T04:54:19.528454088Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-9bbfdf44c-mjnl4,Uid:3fbf2254-8fa7-4368-8a76-518e4c98f2ee,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a997249951cd871481334af4587597bef42b2e8e2bb4de8e22d4e48abbad3fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.528797 kubelet[2761]: E0916 04:54:19.528733 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a997249951cd871481334af4587597bef42b2e8e2bb4de8e22d4e48abbad3fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.528797 kubelet[2761]: E0916 04:54:19.528766 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a997249951cd871481334af4587597bef42b2e8e2bb4de8e22d4e48abbad3fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-9bbfdf44c-mjnl4"
Sep 16 04:54:19.528998 kubelet[2761]: E0916 04:54:19.528921 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a997249951cd871481334af4587597bef42b2e8e2bb4de8e22d4e48abbad3fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-9bbfdf44c-mjnl4"
Sep 16 04:54:19.529394 kubelet[2761]: E0916 04:54:19.528985 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-9bbfdf44c-mjnl4_calico-system(3fbf2254-8fa7-4368-8a76-518e4c98f2ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-9bbfdf44c-mjnl4_calico-system(3fbf2254-8fa7-4368-8a76-518e4c98f2ee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a997249951cd871481334af4587597bef42b2e8e2bb4de8e22d4e48abbad3fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-9bbfdf44c-mjnl4" podUID="3fbf2254-8fa7-4368-8a76-518e4c98f2ee"
Sep 16 04:54:19.530170 containerd[1564]: time="2025-09-16T04:54:19.530147642Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-rrt7l,Uid:3656d4e7-adec-4a47-bcd1-ad26e49ac2c5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2704c11cec042bab2a2972b07103029fc8cd2a5f85b13db329cee1e8d5f36e6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.530909 kubelet[2761]: E0916 04:54:19.530386 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2704c11cec042bab2a2972b07103029fc8cd2a5f85b13db329cee1e8d5f36e6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.530909 kubelet[2761]: E0916 04:54:19.530431 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2704c11cec042bab2a2972b07103029fc8cd2a5f85b13db329cee1e8d5f36e6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-rrt7l"
Sep 16 04:54:19.530909 kubelet[2761]: E0916 04:54:19.530451 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2704c11cec042bab2a2972b07103029fc8cd2a5f85b13db329cee1e8d5f36e6d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-rrt7l"
Sep 16 04:54:19.530998 kubelet[2761]: E0916 04:54:19.530481 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-rrt7l_calico-system(3656d4e7-adec-4a47-bcd1-ad26e49ac2c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-rrt7l_calico-system(3656d4e7-adec-4a47-bcd1-ad26e49ac2c5)\\\": rpc error:
code = Unknown desc = failed to setup network for sandbox \\\"2704c11cec042bab2a2972b07103029fc8cd2a5f85b13db329cee1e8d5f36e6d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-rrt7l" podUID="3656d4e7-adec-4a47-bcd1-ad26e49ac2c5"
Sep 16 04:54:19.533243 containerd[1564]: time="2025-09-16T04:54:19.533216313Z" level=error msg="Failed to destroy network for sandbox \"192027f76c4dc7d70f31e7d27458d95ec20604b670697e900f645cd79ff3684e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.535032 containerd[1564]: time="2025-09-16T04:54:19.535004746Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hhpj6,Uid:d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"192027f76c4dc7d70f31e7d27458d95ec20604b670697e900f645cd79ff3684e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.535202 kubelet[2761]: E0916 04:54:19.535179 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"192027f76c4dc7d70f31e7d27458d95ec20604b670697e900f645cd79ff3684e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.535239 kubelet[2761]: E0916 04:54:19.535212 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"192027f76c4dc7d70f31e7d27458d95ec20604b670697e900f645cd79ff3684e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hhpj6"
Sep 16 04:54:19.535239 kubelet[2761]: E0916 04:54:19.535227 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"192027f76c4dc7d70f31e7d27458d95ec20604b670697e900f645cd79ff3684e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-hhpj6"
Sep 16 04:54:19.535302 kubelet[2761]: E0916 04:54:19.535284 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-hhpj6_kube-system(d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-hhpj6_kube-system(d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"192027f76c4dc7d70f31e7d27458d95ec20604b670697e900f645cd79ff3684e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-hhpj6" podUID="d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655"
Sep 16 04:54:19.543019 containerd[1564]: time="2025-09-16T04:54:19.542984623Z" level=error msg="Failed to destroy network for sandbox \"86d397a18da3cdc0453f6cdf25a3e45a734a8b6001f802c15a9ee4a7ab643ab4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.545862 containerd[1564]: time="2025-09-16T04:54:19.545834595Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-w4dc5,Uid:d6a95398-d9c4-4910-8a9c-46884be92ce4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"86d397a18da3cdc0453f6cdf25a3e45a734a8b6001f802c15a9ee4a7ab643ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.546260 kubelet[2761]: E0916 04:54:19.546232 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86d397a18da3cdc0453f6cdf25a3e45a734a8b6001f802c15a9ee4a7ab643ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.546309 kubelet[2761]: E0916 04:54:19.546269 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86d397a18da3cdc0453f6cdf25a3e45a734a8b6001f802c15a9ee4a7ab643ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58cb7b665-w4dc5"
Sep 16 04:54:19.546309 kubelet[2761]: E0916 04:54:19.546284 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"86d397a18da3cdc0453f6cdf25a3e45a734a8b6001f802c15a9ee4a7ab643ab4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-58cb7b665-w4dc5"
Sep 16 04:54:19.546357 kubelet[2761]: E0916 04:54:19.546321 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-58cb7b665-w4dc5_calico-apiserver(d6a95398-d9c4-4910-8a9c-46884be92ce4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-58cb7b665-w4dc5_calico-apiserver(d6a95398-d9c4-4910-8a9c-46884be92ce4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"86d397a18da3cdc0453f6cdf25a3e45a734a8b6001f802c15a9ee4a7ab643ab4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-58cb7b665-w4dc5" podUID="d6a95398-d9c4-4910-8a9c-46884be92ce4"
Sep 16 04:54:19.615772 systemd[1]: Created slice kubepods-besteffort-podd8d8c4e0_7dcf_4596_aa36_f8d9f0f3722c.slice - libcontainer container kubepods-besteffort-podd8d8c4e0_7dcf_4596_aa36_f8d9f0f3722c.slice.
Sep 16 04:54:19.618205 containerd[1564]: time="2025-09-16T04:54:19.618168535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zbfw2,Uid:d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c,Namespace:calico-system,Attempt:0,}"
Sep 16 04:54:19.666370 containerd[1564]: time="2025-09-16T04:54:19.666235821Z" level=error msg="Failed to destroy network for sandbox \"cf7e7ed35100c1bc5004ddeaf58e5eff206069ccc9e3054d20615c2545246c64\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.670232 containerd[1564]: time="2025-09-16T04:54:19.670193329Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zbfw2,Uid:d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf7e7ed35100c1bc5004ddeaf58e5eff206069ccc9e3054d20615c2545246c64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.670451 kubelet[2761]: E0916 04:54:19.670412 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf7e7ed35100c1bc5004ddeaf58e5eff206069ccc9e3054d20615c2545246c64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 16 04:54:19.670520 kubelet[2761]: E0916 04:54:19.670470 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf7e7ed35100c1bc5004ddeaf58e5eff206069ccc9e3054d20615c2545246c64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zbfw2"
Sep 16 04:54:19.670520 kubelet[2761]: E0916 04:54:19.670509 2761 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cf7e7ed35100c1bc5004ddeaf58e5eff206069ccc9e3054d20615c2545246c64\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zbfw2"
Sep 16 04:54:19.670591 kubelet[2761]: E0916 04:54:19.670561 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zbfw2_calico-system(d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zbfw2_calico-system(d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cf7e7ed35100c1bc5004ddeaf58e5eff206069ccc9e3054d20615c2545246c64\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zbfw2" podUID="d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c"
Sep 16 04:54:19.759387 containerd[1564]: time="2025-09-16T04:54:19.759228151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 16 04:54:26.251109 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2197613193.mount: Deactivated successfully.
Sep 16 04:54:26.353908 containerd[1564]: time="2025-09-16T04:54:26.346755160Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:26.357348 containerd[1564]: time="2025-09-16T04:54:26.357305228Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=157078339" Sep 16 04:54:26.379850 containerd[1564]: time="2025-09-16T04:54:26.379793045Z" level=info msg="ImageCreate event name:\"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:26.389735 containerd[1564]: time="2025-09-16T04:54:26.389660332Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:26.390208 containerd[1564]: time="2025-09-16T04:54:26.390047859Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"157078201\" in 6.630735129s" Sep 16 04:54:26.390208 containerd[1564]: time="2025-09-16T04:54:26.390079287Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:ce9c4ac0f175f22c56e80844e65379d9ebe1d8a4e2bbb38dc1db0f53a8826f0f\"" Sep 16 04:54:26.411822 containerd[1564]: time="2025-09-16T04:54:26.411781271Z" level=info msg="CreateContainer within sandbox \"8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 16 04:54:26.473853 containerd[1564]: time="2025-09-16T04:54:26.473710715Z" level=info msg="Container 
ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:26.475142 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2561995772.mount: Deactivated successfully. Sep 16 04:54:26.529051 containerd[1564]: time="2025-09-16T04:54:26.528944538Z" level=info msg="CreateContainer within sandbox \"8c564f7461e97365ed6601e255e687ff029d0e43363ff6fad8bb37ca76759438\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\"" Sep 16 04:54:26.531836 containerd[1564]: time="2025-09-16T04:54:26.531791415Z" level=info msg="StartContainer for \"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\"" Sep 16 04:54:26.537345 containerd[1564]: time="2025-09-16T04:54:26.537269544Z" level=info msg="connecting to shim ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91" address="unix:///run/containerd/s/64abbf14477ae60e5427a6ce7c2aaf147d21c5ccdb4bc873b263a8887efa4611" protocol=ttrpc version=3 Sep 16 04:54:26.653033 systemd[1]: Started cri-containerd-ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91.scope - libcontainer container ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91. Sep 16 04:54:26.704576 containerd[1564]: time="2025-09-16T04:54:26.704454966Z" level=info msg="StartContainer for \"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\" returns successfully" Sep 16 04:54:26.799950 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 16 04:54:26.800641 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 16 04:54:26.815644 kubelet[2761]: I0916 04:54:26.815560 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-59jns" podStartSLOduration=1.614662708 podStartE2EDuration="19.815544237s" podCreationTimestamp="2025-09-16 04:54:07 +0000 UTC" firstStartedPulling="2025-09-16 04:54:08.192086484 +0000 UTC m=+19.716337114" lastFinishedPulling="2025-09-16 04:54:26.392968002 +0000 UTC m=+37.917218643" observedRunningTime="2025-09-16 04:54:26.814090942 +0000 UTC m=+38.338341582" watchObservedRunningTime="2025-09-16 04:54:26.815544237 +0000 UTC m=+38.339794877" Sep 16 04:54:27.074684 kubelet[2761]: I0916 04:54:27.074352 2761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-ca-bundle\") pod \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\" (UID: \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\") " Sep 16 04:54:27.074684 kubelet[2761]: I0916 04:54:27.074414 2761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-backend-key-pair\") pod \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\" (UID: \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\") " Sep 16 04:54:27.074684 kubelet[2761]: I0916 04:54:27.074436 2761 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwssr\" (UniqueName: \"kubernetes.io/projected/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-kube-api-access-zwssr\") pod \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\" (UID: \"3fbf2254-8fa7-4368-8a76-518e4c98f2ee\") " Sep 16 04:54:27.078915 kubelet[2761]: I0916 04:54:27.077293 2761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"3fbf2254-8fa7-4368-8a76-518e4c98f2ee" (UID: "3fbf2254-8fa7-4368-8a76-518e4c98f2ee"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 16 04:54:27.079410 kubelet[2761]: I0916 04:54:27.079353 2761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-kube-api-access-zwssr" (OuterVolumeSpecName: "kube-api-access-zwssr") pod "3fbf2254-8fa7-4368-8a76-518e4c98f2ee" (UID: "3fbf2254-8fa7-4368-8a76-518e4c98f2ee"). InnerVolumeSpecName "kube-api-access-zwssr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 16 04:54:27.079858 kubelet[2761]: I0916 04:54:27.079794 2761 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "3fbf2254-8fa7-4368-8a76-518e4c98f2ee" (UID: "3fbf2254-8fa7-4368-8a76-518e4c98f2ee"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 16 04:54:27.175525 kubelet[2761]: I0916 04:54:27.175493 2761 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-ca-bundle\") on node \"ci-4459-0-0-n-06f2563e85\" DevicePath \"\"" Sep 16 04:54:27.175525 kubelet[2761]: I0916 04:54:27.175517 2761 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-whisker-backend-key-pair\") on node \"ci-4459-0-0-n-06f2563e85\" DevicePath \"\"" Sep 16 04:54:27.175525 kubelet[2761]: I0916 04:54:27.175527 2761 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwssr\" (UniqueName: \"kubernetes.io/projected/3fbf2254-8fa7-4368-8a76-518e4c98f2ee-kube-api-access-zwssr\") on node \"ci-4459-0-0-n-06f2563e85\" DevicePath \"\"" Sep 16 04:54:27.252665 systemd[1]: var-lib-kubelet-pods-3fbf2254\x2d8fa7\x2d4368\x2d8a76\x2d518e4c98f2ee-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzwssr.mount: Deactivated successfully. Sep 16 04:54:27.252773 systemd[1]: var-lib-kubelet-pods-3fbf2254\x2d8fa7\x2d4368\x2d8a76\x2d518e4c98f2ee-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 16 04:54:27.789587 kubelet[2761]: I0916 04:54:27.789545 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:54:27.793575 systemd[1]: Removed slice kubepods-besteffort-pod3fbf2254_8fa7_4368_8a76_518e4c98f2ee.slice - libcontainer container kubepods-besteffort-pod3fbf2254_8fa7_4368_8a76_518e4c98f2ee.slice. Sep 16 04:54:27.856203 systemd[1]: Created slice kubepods-besteffort-pod5581d506_2221_4a67_b872_f6f6f7c03785.slice - libcontainer container kubepods-besteffort-pod5581d506_2221_4a67_b872_f6f6f7c03785.slice. 
Sep 16 04:54:27.980655 kubelet[2761]: I0916 04:54:27.980590 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5nmm\" (UniqueName: \"kubernetes.io/projected/5581d506-2221-4a67-b872-f6f6f7c03785-kube-api-access-h5nmm\") pod \"whisker-88666cf98-lpgjf\" (UID: \"5581d506-2221-4a67-b872-f6f6f7c03785\") " pod="calico-system/whisker-88666cf98-lpgjf" Sep 16 04:54:27.981170 kubelet[2761]: I0916 04:54:27.980668 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5581d506-2221-4a67-b872-f6f6f7c03785-whisker-ca-bundle\") pod \"whisker-88666cf98-lpgjf\" (UID: \"5581d506-2221-4a67-b872-f6f6f7c03785\") " pod="calico-system/whisker-88666cf98-lpgjf" Sep 16 04:54:27.981170 kubelet[2761]: I0916 04:54:27.980699 2761 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5581d506-2221-4a67-b872-f6f6f7c03785-whisker-backend-key-pair\") pod \"whisker-88666cf98-lpgjf\" (UID: \"5581d506-2221-4a67-b872-f6f6f7c03785\") " pod="calico-system/whisker-88666cf98-lpgjf" Sep 16 04:54:28.161263 containerd[1564]: time="2025-09-16T04:54:28.160983977Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-88666cf98-lpgjf,Uid:5581d506-2221-4a67-b872-f6f6f7c03785,Namespace:calico-system,Attempt:0,}" Sep 16 04:54:28.506533 systemd-networkd[1468]: calif9ae9e3d3fb: Link UP Sep 16 04:54:28.507835 systemd-networkd[1468]: calif9ae9e3d3fb: Gained carrier Sep 16 04:54:28.520776 containerd[1564]: 2025-09-16 04:54:28.222 [INFO][3930] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 16 04:54:28.520776 containerd[1564]: 2025-09-16 04:54:28.259 [INFO][3930] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0 whisker-88666cf98- calico-system 5581d506-2221-4a67-b872-f6f6f7c03785 870 0 2025-09-16 04:54:27 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:88666cf98 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 whisker-88666cf98-lpgjf eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif9ae9e3d3fb [] [] }} ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-" Sep 16 04:54:28.520776 containerd[1564]: 2025-09-16 04:54:28.260 [INFO][3930] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" Sep 16 04:54:28.520776 containerd[1564]: 2025-09-16 04:54:28.443 [INFO][3954] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" HandleID="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Workload="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.445 [INFO][3954] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" HandleID="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Workload="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e4c0), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4459-0-0-n-06f2563e85", "pod":"whisker-88666cf98-lpgjf", "timestamp":"2025-09-16 04:54:28.443427631 +0000 UTC"}, Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.446 [INFO][3954] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.447 [INFO][3954] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.447 [INFO][3954] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.466 [INFO][3954] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.475 [INFO][3954] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.480 [INFO][3954] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.482 [INFO][3954] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.521855 containerd[1564]: 2025-09-16 04:54:28.483 [INFO][3954] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.524003 containerd[1564]: 2025-09-16 04:54:28.483 [INFO][3954] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 
handle="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.524003 containerd[1564]: 2025-09-16 04:54:28.484 [INFO][3954] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed Sep 16 04:54:28.524003 containerd[1564]: 2025-09-16 04:54:28.488 [INFO][3954] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.524003 containerd[1564]: 2025-09-16 04:54:28.492 [INFO][3954] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.50.129/26] block=192.168.50.128/26 handle="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.524003 containerd[1564]: 2025-09-16 04:54:28.492 [INFO][3954] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.129/26] handle="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:28.524003 containerd[1564]: 2025-09-16 04:54:28.492 [INFO][3954] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:54:28.524003 containerd[1564]: 2025-09-16 04:54:28.492 [INFO][3954] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.129/26] IPv6=[] ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" HandleID="k8s-pod-network.1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Workload="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" Sep 16 04:54:28.524114 containerd[1564]: 2025-09-16 04:54:28.495 [INFO][3930] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0", GenerateName:"whisker-88666cf98-", Namespace:"calico-system", SelfLink:"", UID:"5581d506-2221-4a67-b872-f6f6f7c03785", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"88666cf98", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"whisker-88666cf98-lpgjf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.50.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calif9ae9e3d3fb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:28.524114 containerd[1564]: 2025-09-16 04:54:28.495 [INFO][3930] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.129/32] ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" Sep 16 04:54:28.524189 containerd[1564]: 2025-09-16 04:54:28.495 [INFO][3930] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9ae9e3d3fb ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" Sep 16 04:54:28.524189 containerd[1564]: 2025-09-16 04:54:28.504 [INFO][3930] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" Sep 16 04:54:28.524228 containerd[1564]: 2025-09-16 04:54:28.505 [INFO][3930] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0", GenerateName:"whisker-88666cf98-", Namespace:"calico-system", SelfLink:"", 
UID:"5581d506-2221-4a67-b872-f6f6f7c03785", ResourceVersion:"870", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"88666cf98", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed", Pod:"whisker-88666cf98-lpgjf", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.50.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif9ae9e3d3fb", MAC:"ca:aa:82:94:d8:95", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:28.524295 containerd[1564]: 2025-09-16 04:54:28.514 [INFO][3930] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" Namespace="calico-system" Pod="whisker-88666cf98-lpgjf" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-whisker--88666cf98--lpgjf-eth0" Sep 16 04:54:28.613846 kubelet[2761]: I0916 04:54:28.613680 2761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbf2254-8fa7-4368-8a76-518e4c98f2ee" path="/var/lib/kubelet/pods/3fbf2254-8fa7-4368-8a76-518e4c98f2ee/volumes" Sep 16 04:54:28.709724 containerd[1564]: time="2025-09-16T04:54:28.709683045Z" level=info msg="connecting to shim 1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed" 
address="unix:///run/containerd/s/3fcacd38ab5fc8316c02f0a3cb0e3d887c3f2ad4b9a805c9c94f457279966f20" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:28.734338 systemd[1]: Started cri-containerd-1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed.scope - libcontainer container 1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed. Sep 16 04:54:28.757353 systemd-networkd[1468]: vxlan.calico: Link UP Sep 16 04:54:28.757355 systemd-networkd[1468]: vxlan.calico: Gained carrier Sep 16 04:54:28.820398 containerd[1564]: time="2025-09-16T04:54:28.820342237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-88666cf98-lpgjf,Uid:5581d506-2221-4a67-b872-f6f6f7c03785,Namespace:calico-system,Attempt:0,} returns sandbox id \"1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed\"" Sep 16 04:54:28.827300 containerd[1564]: time="2025-09-16T04:54:28.827279913Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 16 04:54:30.101782 systemd-networkd[1468]: vxlan.calico: Gained IPv6LL Sep 16 04:54:30.357134 systemd-networkd[1468]: calif9ae9e3d3fb: Gained IPv6LL Sep 16 04:54:30.574808 containerd[1564]: time="2025-09-16T04:54:30.574582012Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:30.579713 containerd[1564]: time="2025-09-16T04:54:30.579285639Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4661291" Sep 16 04:54:30.582438 containerd[1564]: time="2025-09-16T04:54:30.582321310Z" level=info msg="ImageCreate event name:\"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:30.588491 containerd[1564]: time="2025-09-16T04:54:30.588418201Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:30.590367 containerd[1564]: time="2025-09-16T04:54:30.590052815Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"6153986\" in 1.762734359s" Sep 16 04:54:30.590367 containerd[1564]: time="2025-09-16T04:54:30.590111174Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:9a4eedeed4a531acefb7f5d0a1b7e3856b1a9a24d9e7d25deef2134d7a734c2d\"" Sep 16 04:54:30.596933 containerd[1564]: time="2025-09-16T04:54:30.596106394Z" level=info msg="CreateContainer within sandbox \"1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 16 04:54:30.608638 containerd[1564]: time="2025-09-16T04:54:30.608071997Z" level=info msg="Container 2a71d2cfb157a09af17ee8427dbd66bf89844db8d623491437d5821fc6c5788f: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:30.615295 containerd[1564]: time="2025-09-16T04:54:30.615171005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zbfw2,Uid:d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c,Namespace:calico-system,Attempt:0,}" Sep 16 04:54:30.649871 containerd[1564]: time="2025-09-16T04:54:30.649812508Z" level=info msg="CreateContainer within sandbox \"1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"2a71d2cfb157a09af17ee8427dbd66bf89844db8d623491437d5821fc6c5788f\"" Sep 16 04:54:30.652324 containerd[1564]: time="2025-09-16T04:54:30.652276448Z" 
level=info msg="StartContainer for \"2a71d2cfb157a09af17ee8427dbd66bf89844db8d623491437d5821fc6c5788f\"" Sep 16 04:54:30.656367 containerd[1564]: time="2025-09-16T04:54:30.656295503Z" level=info msg="connecting to shim 2a71d2cfb157a09af17ee8427dbd66bf89844db8d623491437d5821fc6c5788f" address="unix:///run/containerd/s/3fcacd38ab5fc8316c02f0a3cb0e3d887c3f2ad4b9a805c9c94f457279966f20" protocol=ttrpc version=3 Sep 16 04:54:30.687245 systemd[1]: Started cri-containerd-2a71d2cfb157a09af17ee8427dbd66bf89844db8d623491437d5821fc6c5788f.scope - libcontainer container 2a71d2cfb157a09af17ee8427dbd66bf89844db8d623491437d5821fc6c5788f. Sep 16 04:54:30.762567 containerd[1564]: time="2025-09-16T04:54:30.762483226Z" level=info msg="StartContainer for \"2a71d2cfb157a09af17ee8427dbd66bf89844db8d623491437d5821fc6c5788f\" returns successfully" Sep 16 04:54:30.765732 containerd[1564]: time="2025-09-16T04:54:30.765709495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 16 04:54:30.785688 systemd-networkd[1468]: cali79597db86f3: Link UP Sep 16 04:54:30.787491 systemd-networkd[1468]: cali79597db86f3: Gained carrier Sep 16 04:54:30.804943 containerd[1564]: 2025-09-16 04:54:30.707 [INFO][4131] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0 csi-node-driver- calico-system d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c 685 0 2025-09-16 04:54:07 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 csi-node-driver-zbfw2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali79597db86f3 [] [] }} 
ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-" Sep 16 04:54:30.804943 containerd[1564]: 2025-09-16 04:54:30.707 [INFO][4131] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" Sep 16 04:54:30.804943 containerd[1564]: 2025-09-16 04:54:30.740 [INFO][4159] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" HandleID="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Workload="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.740 [INFO][4159] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" HandleID="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Workload="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c4ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-n-06f2563e85", "pod":"csi-node-driver-zbfw2", "timestamp":"2025-09-16 04:54:30.740242601 +0000 UTC"}, Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.740 [INFO][4159] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.740 [INFO][4159] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.740 [INFO][4159] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.749 [INFO][4159] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.758 [INFO][4159] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.762 [INFO][4159] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.765 [INFO][4159] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.805929 containerd[1564]: 2025-09-16 04:54:30.769 [INFO][4159] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.806506 containerd[1564]: 2025-09-16 04:54:30.769 [INFO][4159] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 handle="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.806506 containerd[1564]: 2025-09-16 04:54:30.771 [INFO][4159] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080 Sep 16 04:54:30.806506 containerd[1564]: 2025-09-16 04:54:30.774 [INFO][4159] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" 
host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.806506 containerd[1564]: 2025-09-16 04:54:30.779 [INFO][4159] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.50.130/26] block=192.168.50.128/26 handle="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.806506 containerd[1564]: 2025-09-16 04:54:30.779 [INFO][4159] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.130/26] handle="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:30.806506 containerd[1564]: 2025-09-16 04:54:30.779 [INFO][4159] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:54:30.806506 containerd[1564]: 2025-09-16 04:54:30.779 [INFO][4159] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.130/26] IPv6=[] ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" HandleID="k8s-pod-network.618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Workload="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" Sep 16 04:54:30.806631 containerd[1564]: 2025-09-16 04:54:30.782 [INFO][4131] cni-plugin/k8s.go 418: Populated endpoint ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"csi-node-driver-zbfw2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.50.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali79597db86f3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:30.806683 containerd[1564]: 2025-09-16 04:54:30.782 [INFO][4131] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.130/32] ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" Sep 16 04:54:30.806683 containerd[1564]: 2025-09-16 04:54:30.782 [INFO][4131] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79597db86f3 ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" Sep 16 04:54:30.806683 containerd[1564]: 2025-09-16 04:54:30.787 [INFO][4131] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" 
Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" Sep 16 04:54:30.806736 containerd[1564]: 2025-09-16 04:54:30.788 [INFO][4131] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080", Pod:"csi-node-driver-zbfw2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.50.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali79597db86f3", MAC:"0a:38:ac:9a:ce:33", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:30.806781 containerd[1564]: 2025-09-16 04:54:30.797 [INFO][4131] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" Namespace="calico-system" Pod="csi-node-driver-zbfw2" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-csi--node--driver--zbfw2-eth0" Sep 16 04:54:30.830903 containerd[1564]: time="2025-09-16T04:54:30.830849165Z" level=info msg="connecting to shim 618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080" address="unix:///run/containerd/s/d27c3aa2ce311a9cdbb9c6059769a46c85a76f950a97a21a46140c82001388b1" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:30.852018 systemd[1]: Started cri-containerd-618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080.scope - libcontainer container 618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080. 
Sep 16 04:54:30.879962 containerd[1564]: time="2025-09-16T04:54:30.879936160Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zbfw2,Uid:d8d8c4e0-7dcf-4596-aa36-f8d9f0f3722c,Namespace:calico-system,Attempt:0,} returns sandbox id \"618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080\"" Sep 16 04:54:31.611736 containerd[1564]: time="2025-09-16T04:54:31.611620120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hhpj6,Uid:d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655,Namespace:kube-system,Attempt:0,}" Sep 16 04:54:31.613148 containerd[1564]: time="2025-09-16T04:54:31.612975020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-rrt7l,Uid:3656d4e7-adec-4a47-bcd1-ad26e49ac2c5,Namespace:calico-system,Attempt:0,}" Sep 16 04:54:31.816032 systemd-networkd[1468]: cali0e0bab07b48: Link UP Sep 16 04:54:31.817126 systemd-networkd[1468]: cali0e0bab07b48: Gained carrier Sep 16 04:54:31.843682 containerd[1564]: 2025-09-16 04:54:31.716 [INFO][4232] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0 goldmane-7988f88666- calico-system 3656d4e7-adec-4a47-bcd1-ad26e49ac2c5 801 0 2025-09-16 04:54:06 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 goldmane-7988f88666-rrt7l eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0e0bab07b48 [] [] }} ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-" Sep 16 04:54:31.843682 containerd[1564]: 2025-09-16 04:54:31.716 [INFO][4232] cni-plugin/k8s.go 
74: Extracted identifiers for CmdAddK8s ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" Sep 16 04:54:31.843682 containerd[1564]: 2025-09-16 04:54:31.761 [INFO][4259] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" HandleID="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Workload="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.762 [INFO][4259] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" HandleID="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Workload="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f890), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-n-06f2563e85", "pod":"goldmane-7988f88666-rrt7l", "timestamp":"2025-09-16 04:54:31.761844965 +0000 UTC"}, Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.762 [INFO][4259] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.762 [INFO][4259] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.762 [INFO][4259] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.770 [INFO][4259] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.776 [INFO][4259] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.781 [INFO][4259] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.784 [INFO][4259] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.844831 containerd[1564]: 2025-09-16 04:54:31.789 [INFO][4259] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.846452 containerd[1564]: 2025-09-16 04:54:31.789 [INFO][4259] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 handle="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.846452 containerd[1564]: 2025-09-16 04:54:31.791 [INFO][4259] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f Sep 16 04:54:31.846452 containerd[1564]: 2025-09-16 04:54:31.796 [INFO][4259] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.846452 containerd[1564]: 2025-09-16 04:54:31.803 [INFO][4259] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.50.131/26] block=192.168.50.128/26 handle="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.846452 containerd[1564]: 2025-09-16 04:54:31.804 [INFO][4259] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.131/26] handle="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.846452 containerd[1564]: 2025-09-16 04:54:31.804 [INFO][4259] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:54:31.846452 containerd[1564]: 2025-09-16 04:54:31.804 [INFO][4259] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.131/26] IPv6=[] ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" HandleID="k8s-pod-network.5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Workload="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" Sep 16 04:54:31.846871 containerd[1564]: 2025-09-16 04:54:31.807 [INFO][4232] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"3656d4e7-adec-4a47-bcd1-ad26e49ac2c5", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"goldmane-7988f88666-rrt7l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.50.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0e0bab07b48", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:31.847222 containerd[1564]: 2025-09-16 04:54:31.808 [INFO][4232] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.131/32] ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" Sep 16 04:54:31.847222 containerd[1564]: 2025-09-16 04:54:31.808 [INFO][4232] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0e0bab07b48 ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" Sep 16 04:54:31.847222 containerd[1564]: 2025-09-16 04:54:31.817 [INFO][4232] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" Sep 16 04:54:31.847332 containerd[1564]: 2025-09-16 04:54:31.817 [INFO][4232] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"3656d4e7-adec-4a47-bcd1-ad26e49ac2c5", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f", Pod:"goldmane-7988f88666-rrt7l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.50.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0e0bab07b48", MAC:"ba:e5:fe:e1:c8:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:31.847412 containerd[1564]: 2025-09-16 04:54:31.838 [INFO][4232] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" Namespace="calico-system" Pod="goldmane-7988f88666-rrt7l" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-goldmane--7988f88666--rrt7l-eth0" Sep 16 04:54:31.885799 containerd[1564]: time="2025-09-16T04:54:31.885052760Z" level=info msg="connecting to shim 5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f" address="unix:///run/containerd/s/0dc538fb743cc8c7b968aab78ea9ca72596bcfde27ab9fd12c9b370a112c7fdc" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:31.923009 systemd[1]: Started cri-containerd-5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f.scope - libcontainer container 5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f. Sep 16 04:54:31.929560 systemd-networkd[1468]: cali578c9075999: Link UP Sep 16 04:54:31.930917 systemd-networkd[1468]: cali578c9075999: Gained carrier Sep 16 04:54:31.950020 containerd[1564]: 2025-09-16 04:54:31.716 [INFO][4231] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0 coredns-7c65d6cfc9- kube-system d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655 794 0 2025-09-16 04:53:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 coredns-7c65d6cfc9-hhpj6 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali578c9075999 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-" Sep 16 04:54:31.950020 containerd[1564]: 2025-09-16 04:54:31.716 [INFO][4231] cni-plugin/k8s.go 74: Extracted identifiers for 
CmdAddK8s ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" Sep 16 04:54:31.950020 containerd[1564]: 2025-09-16 04:54:31.768 [INFO][4256] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" HandleID="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Workload="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.769 [INFO][4256] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" HandleID="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Workload="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332140), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-n-06f2563e85", "pod":"coredns-7c65d6cfc9-hhpj6", "timestamp":"2025-09-16 04:54:31.768860448 +0000 UTC"}, Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.769 [INFO][4256] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.804 [INFO][4256] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.804 [INFO][4256] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.871 [INFO][4256] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.880 [INFO][4256] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.888 [INFO][4256] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.890 [INFO][4256] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.950389 containerd[1564]: 2025-09-16 04:54:31.897 [INFO][4256] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.951004 containerd[1564]: 2025-09-16 04:54:31.897 [INFO][4256] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 handle="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.951004 containerd[1564]: 2025-09-16 04:54:31.902 [INFO][4256] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542 Sep 16 04:54:31.951004 containerd[1564]: 2025-09-16 04:54:31.909 [INFO][4256] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.951004 containerd[1564]: 2025-09-16 04:54:31.916 [INFO][4256] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.50.132/26] block=192.168.50.128/26 handle="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.951004 containerd[1564]: 2025-09-16 04:54:31.916 [INFO][4256] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.132/26] handle="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:31.951004 containerd[1564]: 2025-09-16 04:54:31.916 [INFO][4256] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:54:31.951004 containerd[1564]: 2025-09-16 04:54:31.916 [INFO][4256] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.132/26] IPv6=[] ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" HandleID="k8s-pod-network.cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Workload="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" Sep 16 04:54:31.951120 containerd[1564]: 2025-09-16 04:54:31.922 [INFO][4231] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"coredns-7c65d6cfc9-hhpj6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali578c9075999", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:31.951120 containerd[1564]: 2025-09-16 04:54:31.922 [INFO][4231] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.132/32] ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" Sep 16 04:54:31.951120 containerd[1564]: 2025-09-16 04:54:31.925 [INFO][4231] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali578c9075999 ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" Sep 16 04:54:31.951120 containerd[1564]: 2025-09-16 04:54:31.932 [INFO][4231] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" Sep 16 04:54:31.951120 containerd[1564]: 2025-09-16 04:54:31.933 [INFO][4231] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542", Pod:"coredns-7c65d6cfc9-hhpj6", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali578c9075999", 
MAC:"a2:50:35:68:15:7b", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:31.951120 containerd[1564]: 2025-09-16 04:54:31.945 [INFO][4231] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" Namespace="kube-system" Pod="coredns-7c65d6cfc9-hhpj6" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--hhpj6-eth0" Sep 16 04:54:31.978005 containerd[1564]: time="2025-09-16T04:54:31.977975398Z" level=info msg="connecting to shim cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542" address="unix:///run/containerd/s/f9055bcd57e154da5427a2123a2a81962efafd03f890381c579712e04bd159a4" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:32.000120 systemd[1]: Started cri-containerd-cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542.scope - libcontainer container cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542. 
Sep 16 04:54:32.004459 containerd[1564]: time="2025-09-16T04:54:32.004435414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-rrt7l,Uid:3656d4e7-adec-4a47-bcd1-ad26e49ac2c5,Namespace:calico-system,Attempt:0,} returns sandbox id \"5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f\"" Sep 16 04:54:32.043131 containerd[1564]: time="2025-09-16T04:54:32.043097136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-hhpj6,Uid:d5bfb5d1-08f7-48ee-b5e1-6e0ba3bf8655,Namespace:kube-system,Attempt:0,} returns sandbox id \"cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542\"" Sep 16 04:54:32.046729 containerd[1564]: time="2025-09-16T04:54:32.046428091Z" level=info msg="CreateContainer within sandbox \"cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:54:32.078220 containerd[1564]: time="2025-09-16T04:54:32.078192042Z" level=info msg="Container 80a7d7ee3f8e90f1ffb30f9ecc7de8408593439349af83da0397875a2508ab62: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:32.088136 containerd[1564]: time="2025-09-16T04:54:32.088102952Z" level=info msg="CreateContainer within sandbox \"cf75f29daa640df684b2815e88370e3449ad5b86c09df7c2beae06a94db69542\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"80a7d7ee3f8e90f1ffb30f9ecc7de8408593439349af83da0397875a2508ab62\"" Sep 16 04:54:32.088693 containerd[1564]: time="2025-09-16T04:54:32.088607038Z" level=info msg="StartContainer for \"80a7d7ee3f8e90f1ffb30f9ecc7de8408593439349af83da0397875a2508ab62\"" Sep 16 04:54:32.089806 containerd[1564]: time="2025-09-16T04:54:32.089768435Z" level=info msg="connecting to shim 80a7d7ee3f8e90f1ffb30f9ecc7de8408593439349af83da0397875a2508ab62" address="unix:///run/containerd/s/f9055bcd57e154da5427a2123a2a81962efafd03f890381c579712e04bd159a4" protocol=ttrpc version=3 Sep 16 04:54:32.110048 systemd[1]: Started 
cri-containerd-80a7d7ee3f8e90f1ffb30f9ecc7de8408593439349af83da0397875a2508ab62.scope - libcontainer container 80a7d7ee3f8e90f1ffb30f9ecc7de8408593439349af83da0397875a2508ab62. Sep 16 04:54:32.149153 containerd[1564]: time="2025-09-16T04:54:32.149030088Z" level=info msg="StartContainer for \"80a7d7ee3f8e90f1ffb30f9ecc7de8408593439349af83da0397875a2508ab62\" returns successfully" Sep 16 04:54:32.406300 systemd-networkd[1468]: cali79597db86f3: Gained IPv6LL Sep 16 04:54:32.612794 containerd[1564]: time="2025-09-16T04:54:32.612705528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-w4dc5,Uid:d6a95398-d9c4-4910-8a9c-46884be92ce4,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:54:32.614218 containerd[1564]: time="2025-09-16T04:54:32.612740383Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54dcbc64bf-rskqr,Uid:a775d4c8-1673-4091-971b-6305d05b964b,Namespace:calico-system,Attempt:0,}" Sep 16 04:54:32.615418 containerd[1564]: time="2025-09-16T04:54:32.615099245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-bgpf8,Uid:22e7ad62-06be-460a-9992-748a94f58366,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:54:32.784629 systemd-networkd[1468]: calif6e96717def: Link UP Sep 16 04:54:32.784975 systemd-networkd[1468]: calif6e96717def: Gained carrier Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.716 [INFO][4430] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0 calico-kube-controllers-54dcbc64bf- calico-system a775d4c8-1673-4091-971b-6305d05b964b 802 0 2025-09-16 04:54:08 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:54dcbc64bf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 calico-kube-controllers-54dcbc64bf-rskqr eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calif6e96717def [] [] }} ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" Pod="calico-kube-controllers-54dcbc64bf-rskqr" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.716 [INFO][4430] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" Pod="calico-kube-controllers-54dcbc64bf-rskqr" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.750 [INFO][4458] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" HandleID="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.751 [INFO][4458] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" HandleID="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002df610), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-n-06f2563e85", "pod":"calico-kube-controllers-54dcbc64bf-rskqr", "timestamp":"2025-09-16 04:54:32.750883882 +0000 UTC"}, 
Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.751 [INFO][4458] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.751 [INFO][4458] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.751 [INFO][4458] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.756 [INFO][4458] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.760 [INFO][4458] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.765 [INFO][4458] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.767 [INFO][4458] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.769 [INFO][4458] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.769 [INFO][4458] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 handle="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.770 
[INFO][4458] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5 Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.774 [INFO][4458] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.779 [INFO][4458] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.50.133/26] block=192.168.50.128/26 handle="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.779 [INFO][4458] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.133/26] handle="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.779 [INFO][4458] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:54:32.803782 containerd[1564]: 2025-09-16 04:54:32.779 [INFO][4458] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.133/26] IPv6=[] ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" HandleID="k8s-pod-network.34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" Sep 16 04:54:32.805757 containerd[1564]: 2025-09-16 04:54:32.781 [INFO][4430] cni-plugin/k8s.go 418: Populated endpoint ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" Pod="calico-kube-controllers-54dcbc64bf-rskqr" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0", GenerateName:"calico-kube-controllers-54dcbc64bf-", Namespace:"calico-system", SelfLink:"", UID:"a775d4c8-1673-4091-971b-6305d05b964b", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54dcbc64bf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"calico-kube-controllers-54dcbc64bf-rskqr", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.50.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif6e96717def", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:32.805757 containerd[1564]: 2025-09-16 04:54:32.781 [INFO][4430] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.133/32] ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" Pod="calico-kube-controllers-54dcbc64bf-rskqr" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" Sep 16 04:54:32.805757 containerd[1564]: 2025-09-16 04:54:32.781 [INFO][4430] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif6e96717def ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" Pod="calico-kube-controllers-54dcbc64bf-rskqr" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" Sep 16 04:54:32.805757 containerd[1564]: 2025-09-16 04:54:32.783 [INFO][4430] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" Pod="calico-kube-controllers-54dcbc64bf-rskqr" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" Sep 16 04:54:32.805757 containerd[1564]: 2025-09-16 04:54:32.783 [INFO][4430] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" Pod="calico-kube-controllers-54dcbc64bf-rskqr" 
WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0", GenerateName:"calico-kube-controllers-54dcbc64bf-", Namespace:"calico-system", SelfLink:"", UID:"a775d4c8-1673-4091-971b-6305d05b964b", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54dcbc64bf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5", Pod:"calico-kube-controllers-54dcbc64bf-rskqr", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.50.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calif6e96717def", MAC:"c6:e1:83:4c:4f:57", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:32.805757 containerd[1564]: 2025-09-16 04:54:32.797 [INFO][4430] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" Namespace="calico-system" 
Pod="calico-kube-controllers-54dcbc64bf-rskqr" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--kube--controllers--54dcbc64bf--rskqr-eth0" Sep 16 04:54:32.844301 kubelet[2761]: I0916 04:54:32.844111 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-hhpj6" podStartSLOduration=37.84409318 podStartE2EDuration="37.84409318s" podCreationTimestamp="2025-09-16 04:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:54:32.842587406 +0000 UTC m=+44.366838036" watchObservedRunningTime="2025-09-16 04:54:32.84409318 +0000 UTC m=+44.368343810" Sep 16 04:54:32.850430 containerd[1564]: time="2025-09-16T04:54:32.850370558Z" level=info msg="connecting to shim 34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5" address="unix:///run/containerd/s/07f4936c72386ee7ad9b0347964fd3337a0b84242d55f99f0eb8acfef905d303" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:32.879074 systemd[1]: Started cri-containerd-34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5.scope - libcontainer container 34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5. 
Sep 16 04:54:32.927526 systemd-networkd[1468]: calib47911398b7: Link UP Sep 16 04:54:32.928633 systemd-networkd[1468]: calib47911398b7: Gained carrier Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.703 [INFO][4414] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0 calico-apiserver-58cb7b665- calico-apiserver d6a95398-d9c4-4910-8a9c-46884be92ce4 800 0 2025-09-16 04:54:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:58cb7b665 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 calico-apiserver-58cb7b665-w4dc5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib47911398b7 [] [] }} ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.703 [INFO][4414] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.760 [INFO][4453] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" HandleID="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" Sep 16 
04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.760 [INFO][4453] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" HandleID="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003558e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-n-06f2563e85", "pod":"calico-apiserver-58cb7b665-w4dc5", "timestamp":"2025-09-16 04:54:32.760497586 +0000 UTC"}, Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.760 [INFO][4453] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.779 [INFO][4453] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.779 [INFO][4453] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.863 [INFO][4453] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.874 [INFO][4453] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.889 [INFO][4453] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.892 [INFO][4453] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.895 [INFO][4453] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.895 [INFO][4453] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 handle="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.898 [INFO][4453] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872 Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.902 [INFO][4453] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.908 [INFO][4453] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.50.134/26] block=192.168.50.128/26 handle="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.909 [INFO][4453] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.134/26] handle="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.909 [INFO][4453] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:54:32.947273 containerd[1564]: 2025-09-16 04:54:32.909 [INFO][4453] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.134/26] IPv6=[] ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" HandleID="k8s-pod-network.d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" Sep 16 04:54:32.949555 containerd[1564]: 2025-09-16 04:54:32.918 [INFO][4414] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0", GenerateName:"calico-apiserver-58cb7b665-", Namespace:"calico-apiserver", SelfLink:"", UID:"d6a95398-d9c4-4910-8a9c-46884be92ce4", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58cb7b665", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"calico-apiserver-58cb7b665-w4dc5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib47911398b7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:32.949555 containerd[1564]: 2025-09-16 04:54:32.919 [INFO][4414] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.134/32] ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" Sep 16 04:54:32.949555 containerd[1564]: 2025-09-16 04:54:32.919 [INFO][4414] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib47911398b7 ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" Sep 16 04:54:32.949555 containerd[1564]: 2025-09-16 04:54:32.930 [INFO][4414] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" 
Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" Sep 16 04:54:32.949555 containerd[1564]: 2025-09-16 04:54:32.931 [INFO][4414] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0", GenerateName:"calico-apiserver-58cb7b665-", Namespace:"calico-apiserver", SelfLink:"", UID:"d6a95398-d9c4-4910-8a9c-46884be92ce4", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58cb7b665", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872", Pod:"calico-apiserver-58cb7b665-w4dc5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calib47911398b7", MAC:"76:96:cf:47:48:c8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:32.949555 containerd[1564]: 2025-09-16 04:54:32.942 [INFO][4414] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-w4dc5" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--w4dc5-eth0" Sep 16 04:54:32.949555 containerd[1564]: time="2025-09-16T04:54:32.947713946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54dcbc64bf-rskqr,Uid:a775d4c8-1673-4091-971b-6305d05b964b,Namespace:calico-system,Attempt:0,} returns sandbox id \"34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5\"" Sep 16 04:54:32.982614 systemd-networkd[1468]: cali578c9075999: Gained IPv6LL Sep 16 04:54:32.996954 containerd[1564]: time="2025-09-16T04:54:32.996902684Z" level=info msg="connecting to shim d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872" address="unix:///run/containerd/s/dce727bd3e285e8d583858973bcb15cfd6b85b794ddeb920f146579d8e121e91" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:33.023020 systemd[1]: Started cri-containerd-d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872.scope - libcontainer container d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872. 
Sep 16 04:54:33.026830 systemd-networkd[1468]: calif9ba969d6ef: Link UP Sep 16 04:54:33.027018 systemd-networkd[1468]: calif9ba969d6ef: Gained carrier Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.721 [INFO][4422] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0 calico-apiserver-58cb7b665- calico-apiserver 22e7ad62-06be-460a-9992-748a94f58366 805 0 2025-09-16 04:54:04 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:58cb7b665 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 calico-apiserver-58cb7b665-bgpf8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif9ba969d6ef [] [] }} ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.721 [INFO][4422] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.762 [INFO][4461] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" HandleID="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" Sep 16 
04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.762 [INFO][4461] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" HandleID="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-n-06f2563e85", "pod":"calico-apiserver-58cb7b665-bgpf8", "timestamp":"2025-09-16 04:54:32.762202402 +0000 UTC"}, Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.762 [INFO][4461] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.909 [INFO][4461] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.909 [INFO][4461] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.960 [INFO][4461] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.974 [INFO][4461] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.986 [INFO][4461] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.989 [INFO][4461] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.993 [INFO][4461] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.993 [INFO][4461] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 handle="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:32.997 [INFO][4461] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:33.004 [INFO][4461] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:33.016 [INFO][4461] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.50.135/26] block=192.168.50.128/26 handle="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:33.017 [INFO][4461] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.135/26] handle="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:33.017 [INFO][4461] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:54:33.046197 containerd[1564]: 2025-09-16 04:54:33.018 [INFO][4461] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.135/26] IPv6=[] ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" HandleID="k8s-pod-network.e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Workload="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" Sep 16 04:54:33.048633 containerd[1564]: 2025-09-16 04:54:33.021 [INFO][4422] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0", GenerateName:"calico-apiserver-58cb7b665-", Namespace:"calico-apiserver", SelfLink:"", UID:"22e7ad62-06be-460a-9992-748a94f58366", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58cb7b665", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"calico-apiserver-58cb7b665-bgpf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9ba969d6ef", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:33.048633 containerd[1564]: 2025-09-16 04:54:33.022 [INFO][4422] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.135/32] ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" Sep 16 04:54:33.048633 containerd[1564]: 2025-09-16 04:54:33.022 [INFO][4422] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9ba969d6ef ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" Sep 16 04:54:33.048633 containerd[1564]: 2025-09-16 04:54:33.026 [INFO][4422] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" 
Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" Sep 16 04:54:33.048633 containerd[1564]: 2025-09-16 04:54:33.026 [INFO][4422] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0", GenerateName:"calico-apiserver-58cb7b665-", Namespace:"calico-apiserver", SelfLink:"", UID:"22e7ad62-06be-460a-9992-748a94f58366", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 54, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"58cb7b665", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f", Pod:"calico-apiserver-58cb7b665-bgpf8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.50.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calif9ba969d6ef", MAC:"e6:6a:31:af:08:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:33.048633 containerd[1564]: 2025-09-16 04:54:33.039 [INFO][4422] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" Namespace="calico-apiserver" Pod="calico-apiserver-58cb7b665-bgpf8" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-calico--apiserver--58cb7b665--bgpf8-eth0" Sep 16 04:54:33.103346 containerd[1564]: time="2025-09-16T04:54:33.103315397Z" level=info msg="connecting to shim e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f" address="unix:///run/containerd/s/af5eddbc0534fffad6006bca0b8e13fd991d404b4aa05e62dd3703edfda6aec5" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:33.108412 containerd[1564]: time="2025-09-16T04:54:33.108390441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-w4dc5,Uid:d6a95398-d9c4-4910-8a9c-46884be92ce4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872\"" Sep 16 04:54:33.133023 systemd[1]: Started cri-containerd-e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f.scope - libcontainer container e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f. 
Sep 16 04:54:33.175840 containerd[1564]: time="2025-09-16T04:54:33.175806463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-58cb7b665-bgpf8,Uid:22e7ad62-06be-460a-9992-748a94f58366,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f\"" Sep 16 04:54:33.429989 systemd-networkd[1468]: cali0e0bab07b48: Gained IPv6LL Sep 16 04:54:33.611117 containerd[1564]: time="2025-09-16T04:54:33.611062261Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6qnvb,Uid:369182db-6487-436b-a33d-08f7db2acf90,Namespace:kube-system,Attempt:0,}" Sep 16 04:54:33.738733 systemd-networkd[1468]: cali54ff5be8146: Link UP Sep 16 04:54:33.740470 systemd-networkd[1468]: cali54ff5be8146: Gained carrier Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.671 [INFO][4637] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0 coredns-7c65d6cfc9- kube-system 369182db-6487-436b-a33d-08f7db2acf90 803 0 2025-09-16 04:53:55 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-n-06f2563e85 coredns-7c65d6cfc9-6qnvb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali54ff5be8146 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.672 [INFO][4637] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.697 [INFO][4648] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" HandleID="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Workload="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.697 [INFO][4648] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" HandleID="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Workload="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f1e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-n-06f2563e85", "pod":"coredns-7c65d6cfc9-6qnvb", "timestamp":"2025-09-16 04:54:33.69748118 +0000 UTC"}, Hostname:"ci-4459-0-0-n-06f2563e85", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.698 [INFO][4648] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.698 [INFO][4648] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.698 [INFO][4648] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-06f2563e85' Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.706 [INFO][4648] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.710 [INFO][4648] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.714 [INFO][4648] ipam/ipam.go 511: Trying affinity for 192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.716 [INFO][4648] ipam/ipam.go 158: Attempting to load block cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.718 [INFO][4648] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.50.128/26 host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.718 [INFO][4648] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.50.128/26 handle="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.720 [INFO][4648] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913 Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.724 [INFO][4648] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.50.128/26 handle="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.733 [INFO][4648] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.50.136/26] block=192.168.50.128/26 handle="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.733 [INFO][4648] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.50.136/26] handle="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" host="ci-4459-0-0-n-06f2563e85" Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.733 [INFO][4648] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:54:33.757028 containerd[1564]: 2025-09-16 04:54:33.733 [INFO][4648] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.50.136/26] IPv6=[] ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" HandleID="k8s-pod-network.2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Workload="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" Sep 16 04:54:33.759476 containerd[1564]: 2025-09-16 04:54:33.736 [INFO][4637] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"369182db-6487-436b-a33d-08f7db2acf90", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"", Pod:"coredns-7c65d6cfc9-6qnvb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali54ff5be8146", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:33.759476 containerd[1564]: 2025-09-16 04:54:33.736 [INFO][4637] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.50.136/32] ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" Sep 16 04:54:33.759476 containerd[1564]: 2025-09-16 04:54:33.736 [INFO][4637] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali54ff5be8146 ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" Sep 16 04:54:33.759476 containerd[1564]: 2025-09-16 04:54:33.739 [INFO][4637] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" Sep 16 04:54:33.759476 containerd[1564]: 2025-09-16 04:54:33.740 [INFO][4637] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"369182db-6487-436b-a33d-08f7db2acf90", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 53, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-06f2563e85", ContainerID:"2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913", Pod:"coredns-7c65d6cfc9-6qnvb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.50.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali54ff5be8146", 
MAC:"46:35:4d:71:80:b0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:54:33.759476 containerd[1564]: 2025-09-16 04:54:33.753 [INFO][4637] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" Namespace="kube-system" Pod="coredns-7c65d6cfc9-6qnvb" WorkloadEndpoint="ci--4459--0--0--n--06f2563e85-k8s-coredns--7c65d6cfc9--6qnvb-eth0" Sep 16 04:54:33.848598 containerd[1564]: time="2025-09-16T04:54:33.848487096Z" level=info msg="connecting to shim 2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913" address="unix:///run/containerd/s/d67c27afe3dad45cd7538d9794e17505c52c6207ea0dad27a6bcfcab08202b05" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:54:33.873025 systemd[1]: Started cri-containerd-2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913.scope - libcontainer container 2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913. 
Sep 16 04:54:33.928351 containerd[1564]: time="2025-09-16T04:54:33.928324385Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-6qnvb,Uid:369182db-6487-436b-a33d-08f7db2acf90,Namespace:kube-system,Attempt:0,} returns sandbox id \"2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913\"" Sep 16 04:54:33.933155 containerd[1564]: time="2025-09-16T04:54:33.933134873Z" level=info msg="CreateContainer within sandbox \"2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:54:33.960715 containerd[1564]: time="2025-09-16T04:54:33.960598172Z" level=info msg="Container efd5e4466a03598ed4dd91bb722c71954df31c486bb0b5ff3b9f7d2bbf8dc785: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:33.991092 containerd[1564]: time="2025-09-16T04:54:33.990978870Z" level=info msg="CreateContainer within sandbox \"2683c03293df5e4d8199a4431ca0fab1a1166203fc6deab3807ae45c8f367913\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"efd5e4466a03598ed4dd91bb722c71954df31c486bb0b5ff3b9f7d2bbf8dc785\"" Sep 16 04:54:33.991566 containerd[1564]: time="2025-09-16T04:54:33.991518842Z" level=info msg="StartContainer for \"efd5e4466a03598ed4dd91bb722c71954df31c486bb0b5ff3b9f7d2bbf8dc785\"" Sep 16 04:54:33.993823 containerd[1564]: time="2025-09-16T04:54:33.993760594Z" level=info msg="connecting to shim efd5e4466a03598ed4dd91bb722c71954df31c486bb0b5ff3b9f7d2bbf8dc785" address="unix:///run/containerd/s/d67c27afe3dad45cd7538d9794e17505c52c6207ea0dad27a6bcfcab08202b05" protocol=ttrpc version=3 Sep 16 04:54:34.014040 systemd[1]: Started cri-containerd-efd5e4466a03598ed4dd91bb722c71954df31c486bb0b5ff3b9f7d2bbf8dc785.scope - libcontainer container efd5e4466a03598ed4dd91bb722c71954df31c486bb0b5ff3b9f7d2bbf8dc785. 
Sep 16 04:54:34.071969 containerd[1564]: time="2025-09-16T04:54:34.071766922Z" level=info msg="StartContainer for \"efd5e4466a03598ed4dd91bb722c71954df31c486bb0b5ff3b9f7d2bbf8dc785\" returns successfully" Sep 16 04:54:34.403876 containerd[1564]: time="2025-09-16T04:54:34.403668378Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:34.408110 containerd[1564]: time="2025-09-16T04:54:34.408054461Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=33085545" Sep 16 04:54:34.417234 containerd[1564]: time="2025-09-16T04:54:34.417157698Z" level=info msg="ImageCreate event name:\"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:34.428621 containerd[1564]: time="2025-09-16T04:54:34.428491328Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"33085375\" in 3.662746436s" Sep 16 04:54:34.428621 containerd[1564]: time="2025-09-16T04:54:34.428539949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:7e29b0984d517678aab6ca138482c318989f6f28daf9d3b5dd6e4a5a3115ac16\"" Sep 16 04:54:34.461040 systemd-networkd[1468]: calib47911398b7: Gained IPv6LL Sep 16 04:54:34.473976 containerd[1564]: time="2025-09-16T04:54:34.470979254Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 
04:54:34.473976 containerd[1564]: time="2025-09-16T04:54:34.471793932Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 16 04:54:34.476962 containerd[1564]: time="2025-09-16T04:54:34.476515404Z" level=info msg="CreateContainer within sandbox \"1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 16 04:54:34.551507 containerd[1564]: time="2025-09-16T04:54:34.551453969Z" level=info msg="Container 09882a1f26010a2629897d4422da1ef2f0225b626326c04254b9f8acabfb4b1c: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:34.599481 containerd[1564]: time="2025-09-16T04:54:34.599399899Z" level=info msg="CreateContainer within sandbox \"1308e012a58a41e0b2cb2f70a4b2d76eaff8cf1462294bd32857f4aa005347ed\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"09882a1f26010a2629897d4422da1ef2f0225b626326c04254b9f8acabfb4b1c\"" Sep 16 04:54:34.600588 containerd[1564]: time="2025-09-16T04:54:34.600432866Z" level=info msg="StartContainer for \"09882a1f26010a2629897d4422da1ef2f0225b626326c04254b9f8acabfb4b1c\"" Sep 16 04:54:34.602622 containerd[1564]: time="2025-09-16T04:54:34.602551078Z" level=info msg="connecting to shim 09882a1f26010a2629897d4422da1ef2f0225b626326c04254b9f8acabfb4b1c" address="unix:///run/containerd/s/3fcacd38ab5fc8316c02f0a3cb0e3d887c3f2ad4b9a805c9c94f457279966f20" protocol=ttrpc version=3 Sep 16 04:54:34.630891 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1162115747.mount: Deactivated successfully. Sep 16 04:54:34.646014 systemd-networkd[1468]: calif6e96717def: Gained IPv6LL Sep 16 04:54:34.649126 systemd[1]: Started cri-containerd-09882a1f26010a2629897d4422da1ef2f0225b626326c04254b9f8acabfb4b1c.scope - libcontainer container 09882a1f26010a2629897d4422da1ef2f0225b626326c04254b9f8acabfb4b1c. 
Sep 16 04:54:34.757221 containerd[1564]: time="2025-09-16T04:54:34.757155233Z" level=info msg="StartContainer for \"09882a1f26010a2629897d4422da1ef2f0225b626326c04254b9f8acabfb4b1c\" returns successfully" Sep 16 04:54:34.774037 systemd-networkd[1468]: calif9ba969d6ef: Gained IPv6LL Sep 16 04:54:34.872068 kubelet[2761]: I0916 04:54:34.872008 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-6qnvb" podStartSLOduration=39.871992872999996 podStartE2EDuration="39.871992873s" podCreationTimestamp="2025-09-16 04:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:54:34.871958589 +0000 UTC m=+46.396209239" watchObservedRunningTime="2025-09-16 04:54:34.871992873 +0000 UTC m=+46.396243503" Sep 16 04:54:34.886182 kubelet[2761]: I0916 04:54:34.885943 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-88666cf98-lpgjf" podStartSLOduration=2.282358925 podStartE2EDuration="7.885927949s" podCreationTimestamp="2025-09-16 04:54:27 +0000 UTC" firstStartedPulling="2025-09-16 04:54:28.826756762 +0000 UTC m=+40.351007392" lastFinishedPulling="2025-09-16 04:54:34.430325756 +0000 UTC m=+45.954576416" observedRunningTime="2025-09-16 04:54:34.885653604 +0000 UTC m=+46.409904254" watchObservedRunningTime="2025-09-16 04:54:34.885927949 +0000 UTC m=+46.410178579" Sep 16 04:54:35.093536 systemd-networkd[1468]: cali54ff5be8146: Gained IPv6LL Sep 16 04:54:36.568420 kubelet[2761]: I0916 04:54:36.568341 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:54:36.963239 containerd[1564]: time="2025-09-16T04:54:36.962967304Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\" id:\"628f98f4343b72ac79509b22b69d1b18e5b95570c76e73f9ba32ca3b354c74ee\" pid:4808 
exited_at:{seconds:1757998476 nanos:906432991}" Sep 16 04:54:37.075872 containerd[1564]: time="2025-09-16T04:54:37.075824386Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\" id:\"cb55ab72144629842b0b0d347b008916163fac49126b39fc6eeb348689e91036\" pid:4832 exited_at:{seconds:1757998477 nanos:75039073}" Sep 16 04:54:37.183629 containerd[1564]: time="2025-09-16T04:54:37.183570926Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:37.193284 containerd[1564]: time="2025-09-16T04:54:37.193223544Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8760527" Sep 16 04:54:37.204360 containerd[1564]: time="2025-09-16T04:54:37.204313447Z" level=info msg="ImageCreate event name:\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:37.212314 containerd[1564]: time="2025-09-16T04:54:37.212225591Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:37.212734 containerd[1564]: time="2025-09-16T04:54:37.212710921Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"10253230\" in 2.740880771s" Sep 16 04:54:37.212817 containerd[1564]: time="2025-09-16T04:54:37.212802513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference 
\"sha256:666f4e02e75c30547109a06ed75b415a990a970811173aa741379cfaac4d9dd7\"" Sep 16 04:54:37.213940 containerd[1564]: time="2025-09-16T04:54:37.213825360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 16 04:54:37.216374 containerd[1564]: time="2025-09-16T04:54:37.216345625Z" level=info msg="CreateContainer within sandbox \"618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 16 04:54:37.275113 containerd[1564]: time="2025-09-16T04:54:37.274386704Z" level=info msg="Container 85ff2eecbf858bb19ebb63f42e336553df78963374f0a7b2227a8fc61b28c7d9: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:37.318118 containerd[1564]: time="2025-09-16T04:54:37.318078611Z" level=info msg="CreateContainer within sandbox \"618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"85ff2eecbf858bb19ebb63f42e336553df78963374f0a7b2227a8fc61b28c7d9\"" Sep 16 04:54:37.320111 containerd[1564]: time="2025-09-16T04:54:37.318736846Z" level=info msg="StartContainer for \"85ff2eecbf858bb19ebb63f42e336553df78963374f0a7b2227a8fc61b28c7d9\"" Sep 16 04:54:37.320111 containerd[1564]: time="2025-09-16T04:54:37.320053725Z" level=info msg="connecting to shim 85ff2eecbf858bb19ebb63f42e336553df78963374f0a7b2227a8fc61b28c7d9" address="unix:///run/containerd/s/d27c3aa2ce311a9cdbb9c6059769a46c85a76f950a97a21a46140c82001388b1" protocol=ttrpc version=3 Sep 16 04:54:37.342037 systemd[1]: Started cri-containerd-85ff2eecbf858bb19ebb63f42e336553df78963374f0a7b2227a8fc61b28c7d9.scope - libcontainer container 85ff2eecbf858bb19ebb63f42e336553df78963374f0a7b2227a8fc61b28c7d9. 
Sep 16 04:54:37.382384 containerd[1564]: time="2025-09-16T04:54:37.382355562Z" level=info msg="StartContainer for \"85ff2eecbf858bb19ebb63f42e336553df78963374f0a7b2227a8fc61b28c7d9\" returns successfully" Sep 16 04:54:40.586694 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount959728580.mount: Deactivated successfully. Sep 16 04:54:41.163044 containerd[1564]: time="2025-09-16T04:54:41.162612664Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:41.167875 containerd[1564]: time="2025-09-16T04:54:41.167837149Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=66357526" Sep 16 04:54:41.171546 containerd[1564]: time="2025-09-16T04:54:41.171491040Z" level=info msg="ImageCreate event name:\"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:41.182345 containerd[1564]: time="2025-09-16T04:54:41.182286681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:54:41.186535 containerd[1564]: time="2025-09-16T04:54:41.186427244Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"66357372\" in 3.972574974s" Sep 16 04:54:41.186535 containerd[1564]: time="2025-09-16T04:54:41.186458092Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:a7d029fd8f6be94c26af980675c1650818e1e6e19dbd2f8c13e6e61963f021e8\"" Sep 16 
04:54:41.188930 containerd[1564]: time="2025-09-16T04:54:41.188857812Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 16 04:54:41.189999 containerd[1564]: time="2025-09-16T04:54:41.189968213Z" level=info msg="CreateContainer within sandbox \"5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 16 04:54:41.247167 containerd[1564]: time="2025-09-16T04:54:41.247029739Z" level=info msg="Container 94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:54:41.295127 containerd[1564]: time="2025-09-16T04:54:41.295065933Z" level=info msg="CreateContainer within sandbox \"5a286f1ec5c747b6c4d5a2039539f7b6b2f428da246e6069ea02da4805798c2f\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\"" Sep 16 04:54:41.296923 containerd[1564]: time="2025-09-16T04:54:41.296494291Z" level=info msg="StartContainer for \"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\"" Sep 16 04:54:41.297964 containerd[1564]: time="2025-09-16T04:54:41.297942286Z" level=info msg="connecting to shim 94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250" address="unix:///run/containerd/s/0dc538fb743cc8c7b968aab78ea9ca72596bcfde27ab9fd12c9b370a112c7fdc" protocol=ttrpc version=3 Sep 16 04:54:41.411341 systemd[1]: Started cri-containerd-94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250.scope - libcontainer container 94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250. 
Sep 16 04:54:41.590072 containerd[1564]: time="2025-09-16T04:54:41.590031511Z" level=info msg="StartContainer for \"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" returns successfully"
Sep 16 04:54:41.932401 kubelet[2761]: I0916 04:54:41.932111 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-rrt7l" podStartSLOduration=26.740947446 podStartE2EDuration="35.922483152s" podCreationTimestamp="2025-09-16 04:54:06 +0000 UTC" firstStartedPulling="2025-09-16 04:54:32.006005046 +0000 UTC m=+43.530255676" lastFinishedPulling="2025-09-16 04:54:41.187540752 +0000 UTC m=+52.711791382" observedRunningTime="2025-09-16 04:54:41.922292514 +0000 UTC m=+53.446543165" watchObservedRunningTime="2025-09-16 04:54:41.922483152 +0000 UTC m=+53.446733782"
Sep 16 04:54:42.017716 containerd[1564]: time="2025-09-16T04:54:42.017684788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"671db98efb0af1f8b21abe52e8b4ddb209e72eb65fe4bab68a24a76c0e50d9c5\" pid:4943 exit_status:1 exited_at:{seconds:1757998482 nanos:15682213}"
Sep 16 04:54:43.091336 containerd[1564]: time="2025-09-16T04:54:43.091287533Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"32ccf6043440ae97c464c89b1c36272fff44b0c13eee9222ed0638679a7e0588\" pid:4967 exit_status:1 exited_at:{seconds:1757998483 nanos:90942025}"
Sep 16 04:54:44.012247 containerd[1564]: time="2025-09-16T04:54:44.012202227Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"0d7e0db3f08cbb90dc65085efa95b251b02593cec2f7db0369cbc45d179519dc\" pid:4991 exit_status:1 exited_at:{seconds:1757998484 nanos:11646444}"
Sep 16 04:54:45.071927 containerd[1564]: time="2025-09-16T04:54:45.071601092Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:45.165127 containerd[1564]: time="2025-09-16T04:54:45.081588788Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=51277746"
Sep 16 04:54:45.189421 containerd[1564]: time="2025-09-16T04:54:45.189242044Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"52770417\" in 4.000348766s"
Sep 16 04:54:45.189421 containerd[1564]: time="2025-09-16T04:54:45.189280175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\""
Sep 16 04:54:45.190613 containerd[1564]: time="2025-09-16T04:54:45.190590873Z" level=info msg="ImageCreate event name:\"sha256:df191a54fb79de3c693f8b1b864a1bd3bd14f63b3fff9d5fa4869c471ce3cd37\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:45.191983 containerd[1564]: time="2025-09-16T04:54:45.191519985Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:45.206125 containerd[1564]: time="2025-09-16T04:54:45.206090354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 16 04:54:45.327575 containerd[1564]: time="2025-09-16T04:54:45.327470267Z" level=info msg="CreateContainer within sandbox \"34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 16 04:54:45.360081 containerd[1564]: time="2025-09-16T04:54:45.360040996Z" level=info msg="Container baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:54:45.366460 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4200217814.mount: Deactivated successfully.
Sep 16 04:54:45.385081 containerd[1564]: time="2025-09-16T04:54:45.385018357Z" level=info msg="CreateContainer within sandbox \"34bc23527742715595892ece8948ca17efb350b7715ffc14f262aef500c5c1f5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\""
Sep 16 04:54:45.386583 containerd[1564]: time="2025-09-16T04:54:45.385882588Z" level=info msg="StartContainer for \"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\""
Sep 16 04:54:45.387080 containerd[1564]: time="2025-09-16T04:54:45.387032013Z" level=info msg="connecting to shim baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74" address="unix:///run/containerd/s/07f4936c72386ee7ad9b0347964fd3337a0b84242d55f99f0eb8acfef905d303" protocol=ttrpc version=3
Sep 16 04:54:45.426170 systemd[1]: Started cri-containerd-baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74.scope - libcontainer container baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74.
Sep 16 04:54:45.503342 containerd[1564]: time="2025-09-16T04:54:45.502999679Z" level=info msg="StartContainer for \"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" returns successfully"
Sep 16 04:54:45.975573 kubelet[2761]: I0916 04:54:45.975517 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-54dcbc64bf-rskqr" podStartSLOduration=25.72651519 podStartE2EDuration="37.975494069s" podCreationTimestamp="2025-09-16 04:54:08 +0000 UTC" firstStartedPulling="2025-09-16 04:54:32.951085265 +0000 UTC m=+44.475335896" lastFinishedPulling="2025-09-16 04:54:45.200064145 +0000 UTC m=+56.724314775" observedRunningTime="2025-09-16 04:54:45.967619419 +0000 UTC m=+57.491870049" watchObservedRunningTime="2025-09-16 04:54:45.975494069 +0000 UTC m=+57.499744699"
Sep 16 04:54:46.017459 containerd[1564]: time="2025-09-16T04:54:46.017413594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" id:\"71f763e7941a481a86ecb060d554e7c7224e0cd6a7cd173fb9aca5b964e44ed4\" pid:5059 exited_at:{seconds:1757998486 nanos:16617279}"
Sep 16 04:54:47.710105 containerd[1564]: time="2025-09-16T04:54:47.709571984Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:47.715956 containerd[1564]: time="2025-09-16T04:54:47.715919789Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=47333864"
Sep 16 04:54:47.731336 containerd[1564]: time="2025-09-16T04:54:47.731292046Z" level=info msg="ImageCreate event name:\"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:47.737395 containerd[1564]: time="2025-09-16T04:54:47.737365107Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:47.737988 containerd[1564]: time="2025-09-16T04:54:47.737941661Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 2.531552566s"
Sep 16 04:54:47.738038 containerd[1564]: time="2025-09-16T04:54:47.737994482Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 16 04:54:47.741721 containerd[1564]: time="2025-09-16T04:54:47.741690389Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 16 04:54:47.744294 containerd[1564]: time="2025-09-16T04:54:47.743931070Z" level=info msg="CreateContainer within sandbox \"d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 16 04:54:47.777024 containerd[1564]: time="2025-09-16T04:54:47.776983004Z" level=info msg="Container e4e39b7efc0267b09ce3e0e1bdb354f43a3c7b64bb599be5e78412dabb7a9e58: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:54:47.801308 containerd[1564]: time="2025-09-16T04:54:47.801267377Z" level=info msg="CreateContainer within sandbox \"d3130a64d2196ee32a7e3105673e7bda8787edf20e92c60da149f870062c4872\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e4e39b7efc0267b09ce3e0e1bdb354f43a3c7b64bb599be5e78412dabb7a9e58\""
Sep 16 04:54:47.802516 containerd[1564]: time="2025-09-16T04:54:47.802490449Z" level=info msg="StartContainer for \"e4e39b7efc0267b09ce3e0e1bdb354f43a3c7b64bb599be5e78412dabb7a9e58\""
Sep 16 04:54:47.804145 containerd[1564]: time="2025-09-16T04:54:47.804118296Z" level=info msg="connecting to shim e4e39b7efc0267b09ce3e0e1bdb354f43a3c7b64bb599be5e78412dabb7a9e58" address="unix:///run/containerd/s/dce727bd3e285e8d583858973bcb15cfd6b85b794ddeb920f146579d8e121e91" protocol=ttrpc version=3
Sep 16 04:54:47.830784 systemd[1]: Started cri-containerd-e4e39b7efc0267b09ce3e0e1bdb354f43a3c7b64bb599be5e78412dabb7a9e58.scope - libcontainer container e4e39b7efc0267b09ce3e0e1bdb354f43a3c7b64bb599be5e78412dabb7a9e58.
Sep 16 04:54:47.910093 containerd[1564]: time="2025-09-16T04:54:47.910052746Z" level=info msg="StartContainer for \"e4e39b7efc0267b09ce3e0e1bdb354f43a3c7b64bb599be5e78412dabb7a9e58\" returns successfully"
Sep 16 04:54:47.983584 kubelet[2761]: I0916 04:54:47.983526 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-58cb7b665-w4dc5" podStartSLOduration=29.352559115 podStartE2EDuration="43.983511456s" podCreationTimestamp="2025-09-16 04:54:04 +0000 UTC" firstStartedPulling="2025-09-16 04:54:33.110110997 +0000 UTC m=+44.634361617" lastFinishedPulling="2025-09-16 04:54:47.741063329 +0000 UTC m=+59.265313958" observedRunningTime="2025-09-16 04:54:47.962724811 +0000 UTC m=+59.486975451" watchObservedRunningTime="2025-09-16 04:54:47.983511456 +0000 UTC m=+59.507762086"
Sep 16 04:54:48.443564 containerd[1564]: time="2025-09-16T04:54:48.443159445Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:48.450435 containerd[1564]: time="2025-09-16T04:54:48.449990588Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 16 04:54:48.452055 containerd[1564]: time="2025-09-16T04:54:48.452023419Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"48826583\" in 710.300055ms"
Sep 16 04:54:48.452123 containerd[1564]: time="2025-09-16T04:54:48.452073425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:879f2443aed0573271114108bfec35d3e76419f98282ef796c646d0986c5ba6a\""
Sep 16 04:54:48.453399 containerd[1564]: time="2025-09-16T04:54:48.453359686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 16 04:54:48.455621 containerd[1564]: time="2025-09-16T04:54:48.455589903Z" level=info msg="CreateContainer within sandbox \"e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 16 04:54:48.499496 containerd[1564]: time="2025-09-16T04:54:48.498718025Z" level=info msg="Container 80fc4504bf902df67c33f2d3ebd02cfb26ab632cb340f2eea4f1892f1230bdd2: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:54:48.508213 containerd[1564]: time="2025-09-16T04:54:48.508161609Z" level=info msg="CreateContainer within sandbox \"e08418afcf04c02af240f86c994cc7723788a34980e8fc2dd4354c2376f2f79f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"80fc4504bf902df67c33f2d3ebd02cfb26ab632cb340f2eea4f1892f1230bdd2\""
Sep 16 04:54:48.509074 containerd[1564]: time="2025-09-16T04:54:48.509046021Z" level=info msg="StartContainer for \"80fc4504bf902df67c33f2d3ebd02cfb26ab632cb340f2eea4f1892f1230bdd2\""
Sep 16 04:54:48.509782 containerd[1564]: time="2025-09-16T04:54:48.509751902Z" level=info msg="connecting to shim 80fc4504bf902df67c33f2d3ebd02cfb26ab632cb340f2eea4f1892f1230bdd2" address="unix:///run/containerd/s/af5eddbc0534fffad6006bca0b8e13fd991d404b4aa05e62dd3703edfda6aec5" protocol=ttrpc version=3
Sep 16 04:54:48.542118 systemd[1]: Started cri-containerd-80fc4504bf902df67c33f2d3ebd02cfb26ab632cb340f2eea4f1892f1230bdd2.scope - libcontainer container 80fc4504bf902df67c33f2d3ebd02cfb26ab632cb340f2eea4f1892f1230bdd2.
Sep 16 04:54:48.766197 containerd[1564]: time="2025-09-16T04:54:48.766097399Z" level=info msg="StartContainer for \"80fc4504bf902df67c33f2d3ebd02cfb26ab632cb340f2eea4f1892f1230bdd2\" returns successfully"
Sep 16 04:54:48.988178 kubelet[2761]: I0916 04:54:48.988134 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 16 04:54:49.006534 kubelet[2761]: I0916 04:54:49.006465 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-58cb7b665-bgpf8" podStartSLOduration=29.730260132 podStartE2EDuration="45.006453039s" podCreationTimestamp="2025-09-16 04:54:04 +0000 UTC" firstStartedPulling="2025-09-16 04:54:33.176990632 +0000 UTC m=+44.701241252" lastFinishedPulling="2025-09-16 04:54:48.453183519 +0000 UTC m=+59.977434159" observedRunningTime="2025-09-16 04:54:49.005947773 +0000 UTC m=+60.530198402" watchObservedRunningTime="2025-09-16 04:54:49.006453039 +0000 UTC m=+60.530703669"
Sep 16 04:54:49.401386 containerd[1564]: time="2025-09-16T04:54:49.401347326Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" id:\"f6ab0303ca91fe880231b131bae263acd953093d441549bc8afc3ddb4ae916e6\" pid:5189 exited_at:{seconds:1757998489 nanos:401021032}"
Sep 16 04:54:49.602672 containerd[1564]: time="2025-09-16T04:54:49.602631399Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"d0c6193e6d439cbc464dc1ecb129437a10d4b85b07d215e60c1982be961105f9\" pid:5183 exited_at:{seconds:1757998489 nanos:602195766}"
Sep 16 04:54:50.367716 containerd[1564]: time="2025-09-16T04:54:50.367644134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:50.370215 containerd[1564]: time="2025-09-16T04:54:50.370189319Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=14698542"
Sep 16 04:54:50.372741 containerd[1564]: time="2025-09-16T04:54:50.372132935Z" level=info msg="ImageCreate event name:\"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:50.377863 containerd[1564]: time="2025-09-16T04:54:50.377838904Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:54:50.379006 containerd[1564]: time="2025-09-16T04:54:50.378985586Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"16191197\" in 1.925592378s"
Sep 16 04:54:50.379092 containerd[1564]: time="2025-09-16T04:54:50.379080117Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:b8f31c4fdaed3fa08af64de3d37d65a4c2ea0d9f6f522cb60d2e0cb424f8dd8a\""
Sep 16 04:54:50.439451 containerd[1564]: time="2025-09-16T04:54:50.439417998Z" level=info msg="CreateContainer within sandbox \"618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 16 04:54:50.468850 containerd[1564]: time="2025-09-16T04:54:50.468818243Z" level=info msg="Container 1c7064649b696af8a3993f069f18379e3ec1fecb7a9e078dba328083173c68a7: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:54:50.488922 containerd[1564]: time="2025-09-16T04:54:50.487546047Z" level=info msg="CreateContainer within sandbox \"618cab3592b1cf59695fe693a4a41dfb3f0412a346641eb964f678a0b7ec2080\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"1c7064649b696af8a3993f069f18379e3ec1fecb7a9e078dba328083173c68a7\""
Sep 16 04:54:50.490321 containerd[1564]: time="2025-09-16T04:54:50.490286095Z" level=info msg="StartContainer for \"1c7064649b696af8a3993f069f18379e3ec1fecb7a9e078dba328083173c68a7\""
Sep 16 04:54:50.492332 containerd[1564]: time="2025-09-16T04:54:50.492301418Z" level=info msg="connecting to shim 1c7064649b696af8a3993f069f18379e3ec1fecb7a9e078dba328083173c68a7" address="unix:///run/containerd/s/d27c3aa2ce311a9cdbb9c6059769a46c85a76f950a97a21a46140c82001388b1" protocol=ttrpc version=3
Sep 16 04:54:50.514132 systemd[1]: Started cri-containerd-1c7064649b696af8a3993f069f18379e3ec1fecb7a9e078dba328083173c68a7.scope - libcontainer container 1c7064649b696af8a3993f069f18379e3ec1fecb7a9e078dba328083173c68a7.
Sep 16 04:54:50.570760 containerd[1564]: time="2025-09-16T04:54:50.570718641Z" level=info msg="StartContainer for \"1c7064649b696af8a3993f069f18379e3ec1fecb7a9e078dba328083173c68a7\" returns successfully"
Sep 16 04:54:50.938784 kubelet[2761]: I0916 04:54:50.935537 2761 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 16 04:54:50.940479 kubelet[2761]: I0916 04:54:50.940453 2761 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 16 04:54:51.741997 kubelet[2761]: I0916 04:54:51.741941 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 16 04:54:51.748910 kubelet[2761]: I0916 04:54:51.745395 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zbfw2" podStartSLOduration=25.197617924 podStartE2EDuration="44.73883142s" podCreationTimestamp="2025-09-16 04:54:07 +0000 UTC" firstStartedPulling="2025-09-16 04:54:30.881094562 +0000 UTC m=+42.405345192" lastFinishedPulling="2025-09-16 04:54:50.422308058 +0000 UTC m=+61.946558688" observedRunningTime="2025-09-16 04:54:51.044724497 +0000 UTC m=+62.568975148" watchObservedRunningTime="2025-09-16 04:54:51.73883142 +0000 UTC m=+63.263082051"
Sep 16 04:55:05.712274 systemd[1]: Started sshd@10-157.180.68.84:22-77.222.54.152:34854.service - OpenSSH per-connection server daemon (77.222.54.152:34854).
Sep 16 04:55:07.133542 containerd[1564]: time="2025-09-16T04:55:07.133016147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\" id:\"86ed902a6ac52bee808cf5b698e8aadfda4d36dacde74af0bf7287c7e135d859\" pid:5270 exited_at:{seconds:1757998507 nanos:97240061}"
Sep 16 04:55:07.383106 sshd[5255]: Received disconnect from 77.222.54.152 port 34854:11: Bye Bye [preauth]
Sep 16 04:55:07.383106 sshd[5255]: Disconnected from authenticating user root 77.222.54.152 port 34854 [preauth]
Sep 16 04:55:07.387824 systemd[1]: sshd@10-157.180.68.84:22-77.222.54.152:34854.service: Deactivated successfully.
Sep 16 04:55:19.355540 containerd[1564]: time="2025-09-16T04:55:19.354806071Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" id:\"8d6c9fce376660db57cae6e32564fd37c20ff87723d2ba00ed98c92f12408d7a\" pid:5318 exited_at:{seconds:1757998519 nanos:353940883}"
Sep 16 04:55:19.597117 containerd[1564]: time="2025-09-16T04:55:19.595622831Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"675f24d3e959ef7c787a487bce0ec132a6ee687ef8c1e26b655578afa871432f\" pid:5317 exited_at:{seconds:1757998519 nanos:594809883}"
Sep 16 04:55:19.715343 containerd[1564]: time="2025-09-16T04:55:19.708417308Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"5b360617a14df451155c552ccc8811b2e14a3fc54893dbcb1ef2b929f032f46b\" pid:5348 exited_at:{seconds:1757998519 nanos:707970462}"
Sep 16 04:55:20.771041 containerd[1564]: time="2025-09-16T04:55:20.771001775Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" id:\"730837480c57d9889dd1cde170a8d03c2196ec3637f3f48dcbead77285e8843b\" pid:5371 exited_at:{seconds:1757998520 nanos:770799952}"
Sep 16 04:55:25.128814 systemd[1]: Started sshd@11-157.180.68.84:22-139.178.89.65:45998.service - OpenSSH per-connection server daemon (139.178.89.65:45998).
Sep 16 04:55:26.144932 sshd[5384]: Accepted publickey for core from 139.178.89.65 port 45998 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:26.148215 sshd-session[5384]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:26.155534 systemd-logind[1530]: New session 8 of user core.
Sep 16 04:55:26.160030 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 16 04:55:27.367582 sshd[5389]: Connection closed by 139.178.89.65 port 45998
Sep 16 04:55:27.367932 sshd-session[5384]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:27.378751 systemd[1]: sshd@11-157.180.68.84:22-139.178.89.65:45998.service: Deactivated successfully.
Sep 16 04:55:27.379086 systemd-logind[1530]: Session 8 logged out. Waiting for processes to exit.
Sep 16 04:55:27.382645 systemd[1]: session-8.scope: Deactivated successfully.
Sep 16 04:55:27.386268 systemd-logind[1530]: Removed session 8.
Sep 16 04:55:29.443674 systemd[1]: Started sshd@12-157.180.68.84:22-67.10.185.103:33658.service - OpenSSH per-connection server daemon (67.10.185.103:33658).
Sep 16 04:55:30.446536 sshd[5404]: Received disconnect from 67.10.185.103 port 33658:11: Bye Bye [preauth]
Sep 16 04:55:30.446536 sshd[5404]: Disconnected from authenticating user root 67.10.185.103 port 33658 [preauth]
Sep 16 04:55:30.450369 systemd[1]: sshd@12-157.180.68.84:22-67.10.185.103:33658.service: Deactivated successfully.
Sep 16 04:55:32.578338 systemd[1]: Started sshd@13-157.180.68.84:22-139.178.89.65:50416.service - OpenSSH per-connection server daemon (139.178.89.65:50416).
Sep 16 04:55:33.714722 sshd[5410]: Accepted publickey for core from 139.178.89.65 port 50416 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:33.717435 sshd-session[5410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:33.728167 systemd-logind[1530]: New session 9 of user core.
Sep 16 04:55:33.733140 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 16 04:55:34.722728 sshd[5413]: Connection closed by 139.178.89.65 port 50416
Sep 16 04:55:34.724232 sshd-session[5410]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:34.731444 systemd-logind[1530]: Session 9 logged out. Waiting for processes to exit.
Sep 16 04:55:34.732295 systemd[1]: sshd@13-157.180.68.84:22-139.178.89.65:50416.service: Deactivated successfully.
Sep 16 04:55:34.735752 systemd[1]: session-9.scope: Deactivated successfully.
Sep 16 04:55:34.739034 systemd-logind[1530]: Removed session 9.
Sep 16 04:55:36.959066 containerd[1564]: time="2025-09-16T04:55:36.958976426Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\" id:\"4b25c3adc4e22ce5ab0f9163870c89d1f640eb6bc858ae8143c698d8d222e13d\" pid:5437 exited_at:{seconds:1757998536 nanos:958543600}"
Sep 16 04:55:39.912259 systemd[1]: Started sshd@14-157.180.68.84:22-139.178.89.65:50422.service - OpenSSH per-connection server daemon (139.178.89.65:50422).
Sep 16 04:55:40.064260 systemd[1]: Started sshd@15-157.180.68.84:22-201.249.87.203:44788.service - OpenSSH per-connection server daemon (201.249.87.203:44788).
Sep 16 04:55:41.052373 sshd[5449]: Accepted publickey for core from 139.178.89.65 port 50422 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:41.056073 sshd-session[5449]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:41.069384 systemd-logind[1530]: New session 10 of user core.
Sep 16 04:55:41.075125 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 16 04:55:41.261811 sshd[5454]: Received disconnect from 201.249.87.203 port 44788:11: Bye Bye [preauth]
Sep 16 04:55:41.261811 sshd[5454]: Disconnected from authenticating user root 201.249.87.203 port 44788 [preauth]
Sep 16 04:55:41.265166 systemd[1]: sshd@15-157.180.68.84:22-201.249.87.203:44788.service: Deactivated successfully.
Sep 16 04:55:42.024668 sshd[5457]: Connection closed by 139.178.89.65 port 50422
Sep 16 04:55:42.033188 sshd-session[5449]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:42.041174 systemd[1]: sshd@14-157.180.68.84:22-139.178.89.65:50422.service: Deactivated successfully.
Sep 16 04:55:42.044418 systemd[1]: session-10.scope: Deactivated successfully.
Sep 16 04:55:42.047988 systemd-logind[1530]: Session 10 logged out. Waiting for processes to exit.
Sep 16 04:55:42.049969 systemd-logind[1530]: Removed session 10.
Sep 16 04:55:42.176706 systemd[1]: Started sshd@16-157.180.68.84:22-139.178.89.65:53648.service - OpenSSH per-connection server daemon (139.178.89.65:53648).
Sep 16 04:55:43.196126 sshd[5472]: Accepted publickey for core from 139.178.89.65 port 53648 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:43.199529 sshd-session[5472]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:43.217982 systemd-logind[1530]: New session 11 of user core.
Sep 16 04:55:43.223193 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 16 04:55:44.165304 sshd[5479]: Connection closed by 139.178.89.65 port 53648
Sep 16 04:55:44.168287 sshd-session[5472]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:44.183265 systemd[1]: sshd@16-157.180.68.84:22-139.178.89.65:53648.service: Deactivated successfully.
Sep 16 04:55:44.189370 systemd[1]: session-11.scope: Deactivated successfully.
Sep 16 04:55:44.192670 systemd-logind[1530]: Session 11 logged out. Waiting for processes to exit.
Sep 16 04:55:44.197511 systemd-logind[1530]: Removed session 11.
Sep 16 04:55:44.336063 systemd[1]: Started sshd@17-157.180.68.84:22-139.178.89.65:53660.service - OpenSSH per-connection server daemon (139.178.89.65:53660).
Sep 16 04:55:45.350037 sshd[5489]: Accepted publickey for core from 139.178.89.65 port 53660 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:45.352456 sshd-session[5489]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:45.359743 systemd-logind[1530]: New session 12 of user core.
Sep 16 04:55:45.367120 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 16 04:55:46.170214 sshd[5492]: Connection closed by 139.178.89.65 port 53660
Sep 16 04:55:46.173537 sshd-session[5489]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:46.178740 systemd[1]: sshd@17-157.180.68.84:22-139.178.89.65:53660.service: Deactivated successfully.
Sep 16 04:55:46.178816 systemd-logind[1530]: Session 12 logged out. Waiting for processes to exit.
Sep 16 04:55:46.180832 systemd[1]: session-12.scope: Deactivated successfully.
Sep 16 04:55:46.182834 systemd-logind[1530]: Removed session 12.
Sep 16 04:55:49.357492 containerd[1564]: time="2025-09-16T04:55:49.357434245Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" id:\"3398f3527ea32286d3a35e75191d0af0355d062a2e3d87620d6bc9315f0a1a1a\" pid:5523 exited_at:{seconds:1757998549 nanos:353009666}"
Sep 16 04:55:49.464944 containerd[1564]: time="2025-09-16T04:55:49.464867852Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"be0b1403764dc398994df9deff6676ec170672ca9185b07d190c3bfdb93a06f2\" pid:5539 exited_at:{seconds:1757998549 nanos:464128499}"
Sep 16 04:55:51.339500 systemd[1]: Started sshd@18-157.180.68.84:22-139.178.89.65:46170.service - OpenSSH per-connection server daemon (139.178.89.65:46170).
Sep 16 04:55:52.377859 sshd[5554]: Accepted publickey for core from 139.178.89.65 port 46170 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:52.380625 sshd-session[5554]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:52.385950 systemd-logind[1530]: New session 13 of user core.
Sep 16 04:55:52.392054 systemd[1]: Started session-13.scope - Session 13 of User core.
Sep 16 04:55:53.210173 sshd[5557]: Connection closed by 139.178.89.65 port 46170
Sep 16 04:55:53.210514 sshd-session[5554]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:53.215838 systemd[1]: sshd@18-157.180.68.84:22-139.178.89.65:46170.service: Deactivated successfully.
Sep 16 04:55:53.218291 systemd[1]: session-13.scope: Deactivated successfully.
Sep 16 04:55:53.220027 systemd-logind[1530]: Session 13 logged out. Waiting for processes to exit.
Sep 16 04:55:53.221993 systemd-logind[1530]: Removed session 13.
Sep 16 04:55:53.412317 systemd[1]: Started sshd@19-157.180.68.84:22-139.178.89.65:46180.service - OpenSSH per-connection server daemon (139.178.89.65:46180).
Sep 16 04:55:54.498793 sshd[5569]: Accepted publickey for core from 139.178.89.65 port 46180 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:54.500175 sshd-session[5569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:54.504948 systemd-logind[1530]: New session 14 of user core.
Sep 16 04:55:54.510036 systemd[1]: Started session-14.scope - Session 14 of User core.
Sep 16 04:55:55.508722 sshd[5574]: Connection closed by 139.178.89.65 port 46180
Sep 16 04:55:55.515549 sshd-session[5569]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:55.524017 systemd[1]: sshd@19-157.180.68.84:22-139.178.89.65:46180.service: Deactivated successfully.
Sep 16 04:55:55.526698 systemd[1]: session-14.scope: Deactivated successfully.
Sep 16 04:55:55.533371 systemd-logind[1530]: Session 14 logged out. Waiting for processes to exit.
Sep 16 04:55:55.535676 systemd-logind[1530]: Removed session 14.
Sep 16 04:55:55.664162 systemd[1]: Started sshd@20-157.180.68.84:22-139.178.89.65:46184.service - OpenSSH per-connection server daemon (139.178.89.65:46184).
Sep 16 04:55:56.665865 sshd[5584]: Accepted publickey for core from 139.178.89.65 port 46184 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:55:56.667720 sshd-session[5584]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:55:56.672486 systemd-logind[1530]: New session 15 of user core.
Sep 16 04:55:56.680022 systemd[1]: Started session-15.scope - Session 15 of User core.
Sep 16 04:55:59.750563 sshd[5589]: Connection closed by 139.178.89.65 port 46184
Sep 16 04:55:59.782086 sshd-session[5584]: pam_unix(sshd:session): session closed for user core
Sep 16 04:55:59.837480 systemd[1]: sshd@20-157.180.68.84:22-139.178.89.65:46184.service: Deactivated successfully.
Sep 16 04:55:59.844610 systemd[1]: session-15.scope: Deactivated successfully.
Sep 16 04:55:59.844794 systemd[1]: session-15.scope: Consumed 491ms CPU time, 83.7M memory peak.
Sep 16 04:55:59.851385 systemd-logind[1530]: Session 15 logged out. Waiting for processes to exit.
Sep 16 04:55:59.863313 systemd-logind[1530]: Removed session 15.
Sep 16 04:55:59.943147 systemd[1]: Started sshd@21-157.180.68.84:22-139.178.89.65:46200.service - OpenSSH per-connection server daemon (139.178.89.65:46200).
Sep 16 04:56:01.074279 sshd[5614]: Accepted publickey for core from 139.178.89.65 port 46200 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:56:01.078214 sshd-session[5614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:56:01.096567 systemd-logind[1530]: New session 16 of user core.
Sep 16 04:56:01.101090 systemd[1]: Started session-16.scope - Session 16 of User core.
Sep 16 04:56:02.431186 sshd[5621]: Connection closed by 139.178.89.65 port 46200
Sep 16 04:56:02.432983 sshd-session[5614]: pam_unix(sshd:session): session closed for user core
Sep 16 04:56:02.436230 systemd[1]: sshd@21-157.180.68.84:22-139.178.89.65:46200.service: Deactivated successfully.
Sep 16 04:56:02.439081 systemd[1]: session-16.scope: Deactivated successfully.
Sep 16 04:56:02.439967 systemd[1]: session-16.scope: Consumed 345ms CPU time, 73M memory peak.
Sep 16 04:56:02.440645 systemd-logind[1530]: Session 16 logged out. Waiting for processes to exit.
Sep 16 04:56:02.445677 systemd-logind[1530]: Removed session 16.
Sep 16 04:56:02.598191 systemd[1]: Started sshd@22-157.180.68.84:22-139.178.89.65:41782.service - OpenSSH per-connection server daemon (139.178.89.65:41782).
Sep 16 04:56:03.607085 sshd[5647]: Accepted publickey for core from 139.178.89.65 port 41782 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:56:03.610582 sshd-session[5647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:56:03.620721 systemd-logind[1530]: New session 17 of user core.
Sep 16 04:56:03.628129 systemd[1]: Started session-17.scope - Session 17 of User core.
Sep 16 04:56:04.388554 systemd[1]: Started sshd@23-157.180.68.84:22-43.157.92.77:39208.service - OpenSSH per-connection server daemon (43.157.92.77:39208).
Sep 16 04:56:04.433701 sshd[5656]: Connection closed by 139.178.89.65 port 41782
Sep 16 04:56:04.434417 sshd-session[5647]: pam_unix(sshd:session): session closed for user core
Sep 16 04:56:04.439264 systemd[1]: sshd@22-157.180.68.84:22-139.178.89.65:41782.service: Deactivated successfully.
Sep 16 04:56:04.444201 systemd[1]: session-17.scope: Deactivated successfully.
Sep 16 04:56:04.446764 systemd-logind[1530]: Session 17 logged out. Waiting for processes to exit.
Sep 16 04:56:04.449647 systemd-logind[1530]: Removed session 17.
Sep 16 04:56:04.632845 sshd[5668]: Received disconnect from 43.157.92.77 port 39208:11: Bye Bye [preauth]
Sep 16 04:56:04.632845 sshd[5668]: Disconnected from authenticating user root 43.157.92.77 port 39208 [preauth]
Sep 16 04:56:04.635877 systemd[1]: sshd@23-157.180.68.84:22-43.157.92.77:39208.service: Deactivated successfully.
Sep 16 04:56:07.471934 containerd[1564]: time="2025-09-16T04:56:07.470927077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ff710dd04c6e50b5cda512290f1c8bec26be3bb61eed889cac5df829e8664e91\" id:\"f1611d2d639eb6174b4abf9c85f3fab2ea4fe68f8f53387cad8a7f87622addae\" pid:5695 exited_at:{seconds:1757998567 nanos:361666170}"
Sep 16 04:56:09.635726 systemd[1]: Started sshd@24-157.180.68.84:22-139.178.89.65:41788.service - OpenSSH per-connection server daemon (139.178.89.65:41788).
Sep 16 04:56:10.691947 sshd[5708]: Accepted publickey for core from 139.178.89.65 port 41788 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:56:10.696381 sshd-session[5708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:56:10.709999 systemd-logind[1530]: New session 18 of user core.
Sep 16 04:56:10.715189 systemd[1]: Started session-18.scope - Session 18 of User core.
Sep 16 04:56:11.688091 sshd[5711]: Connection closed by 139.178.89.65 port 41788
Sep 16 04:56:11.688517 sshd-session[5708]: pam_unix(sshd:session): session closed for user core
Sep 16 04:56:11.696688 systemd[1]: sshd@24-157.180.68.84:22-139.178.89.65:41788.service: Deactivated successfully.
Sep 16 04:56:11.699691 systemd[1]: session-18.scope: Deactivated successfully.
Sep 16 04:56:11.703745 systemd-logind[1530]: Session 18 logged out. Waiting for processes to exit.
Sep 16 04:56:11.705786 systemd-logind[1530]: Removed session 18.
Sep 16 04:56:16.901797 systemd[1]: Started sshd@25-157.180.68.84:22-139.178.89.65:47842.service - OpenSSH per-connection server daemon (139.178.89.65:47842).
Sep 16 04:56:18.017600 sshd[5723]: Accepted publickey for core from 139.178.89.65 port 47842 ssh2: RSA SHA256:ukQ34xonoknF08dP0xLAU5hfihSV0h8HVu+YH+vjyGk
Sep 16 04:56:18.019031 sshd-session[5723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:56:18.025107 systemd-logind[1530]: New session 19 of user core.
Sep 16 04:56:18.031065 systemd[1]: Started session-19.scope - Session 19 of User core.
Sep 16 04:56:18.874515 sshd[5726]: Connection closed by 139.178.89.65 port 47842
Sep 16 04:56:18.875588 sshd-session[5723]: pam_unix(sshd:session): session closed for user core
Sep 16 04:56:18.882790 systemd-logind[1530]: Session 19 logged out. Waiting for processes to exit.
Sep 16 04:56:18.882920 systemd[1]: sshd@25-157.180.68.84:22-139.178.89.65:47842.service: Deactivated successfully.
Sep 16 04:56:18.885100 systemd[1]: session-19.scope: Deactivated successfully.
Sep 16 04:56:18.887485 systemd-logind[1530]: Removed session 19.
Sep 16 04:56:19.396603 containerd[1564]: time="2025-09-16T04:56:19.396533756Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" id:\"ea6a3b9e4cf5799326d6efdf39481723e38f50d6f629a9f899e35ba89f2994c5\" pid:5760 exited_at:{seconds:1757998579 nanos:395833789}"
Sep 16 04:56:19.643652 containerd[1564]: time="2025-09-16T04:56:19.643594746Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"6a3b44dc5ac5bcd095ecf7bf9d9ed73138e493a56eca3b38e7811f9b641abe05\" pid:5767 exited_at:{seconds:1757998579 nanos:637594236}"
Sep 16 04:56:19.696876 containerd[1564]: time="2025-09-16T04:56:19.696696951Z" level=info msg="TaskExit event in podsandbox handler container_id:\"94082ec849dc31547a8caa28b8766f3920345b186157a50c4d1485f38afba250\" id:\"1b385ac108cdb1621774f80fd1344627af562009f546fd6a8570ae42566afe95\" pid:5792 exited_at:{seconds:1757998579 nanos:696251643}"
Sep 16 04:56:20.765149 containerd[1564]: time="2025-09-16T04:56:20.765115297Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baec3f1ce3ef250911bc3c1516e23df2dca673f22d93dc20af73fca69e39fc74\" id:\"bde7ec865377a85d5a045207072397846da1ee08335311b82f8424aca3b55506\" pid:5814 exited_at:{seconds:1757998580 nanos:764957290}"