Aug 13 00:32:15.788476 kernel: Linux version 6.12.40-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Aug 12 21:42:48 -00 2025
Aug 13 00:32:15.788497 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 00:32:15.788505 kernel: BIOS-provided physical RAM map:
Aug 13 00:32:15.788511 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Aug 13 00:32:15.788515 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Aug 13 00:32:15.788520 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Aug 13 00:32:15.788527 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Aug 13 00:32:15.788532 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Aug 13 00:32:15.788537 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Aug 13 00:32:15.788542 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Aug 13 00:32:15.788547 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Aug 13 00:32:15.788551 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Aug 13 00:32:15.788556 kernel: NX (Execute Disable) protection: active
Aug 13 00:32:15.788561 kernel: APIC: Static calls initialized
Aug 13 00:32:15.788568 kernel: SMBIOS 2.8 present.
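The two "usable" ranges in the BIOS-e820 map above can be summed to check how much RAM the firmware actually exposes to the guest. A minimal sketch (the ranges are copied from the log; `usable_bytes` is a name chosen here, and e820 ranges are inclusive of their end address):

```python
# Sum the "usable" ranges from the BIOS-e820 map printed above.
# Each [mem start-end] range is inclusive, so its size is end - start + 1.
import re

E820_LINES = """\
BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
"""

pattern = re.compile(r"\[mem 0x([0-9a-f]+)-0x([0-9a-f]+)\] usable")
usable_bytes = sum(
    int(end, 16) - int(start, 16) + 1
    for start, end in pattern.findall(E820_LINES)
)
print(usable_bytes // 1024, "KiB usable")  # prints: 2047471 KiB usable
```

That total is close to the `Memory: 1917776K/2047464K available` line the kernel prints later; the small difference is memory the kernel itself reserves before that summary (e.g. the first page, which the `e820: update` entry marks reserved).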
Aug 13 00:32:15.788573 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Aug 13 00:32:15.788579 kernel: DMI: Memory slots populated: 1/1
Aug 13 00:32:15.788584 kernel: Hypervisor detected: KVM
Aug 13 00:32:15.788589 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Aug 13 00:32:15.788594 kernel: kvm-clock: using sched offset of 3992701933 cycles
Aug 13 00:32:15.788600 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Aug 13 00:32:15.788605 kernel: tsc: Detected 2445.406 MHz processor
Aug 13 00:32:15.788612 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Aug 13 00:32:15.788618 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Aug 13 00:32:15.788634 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Aug 13 00:32:15.788640 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Aug 13 00:32:15.788646 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Aug 13 00:32:15.788651 kernel: Using GB pages for direct mapping
Aug 13 00:32:15.788657 kernel: ACPI: Early table checksum verification disabled
Aug 13 00:32:15.788662 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Aug 13 00:32:15.788667 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:32:15.788674 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:32:15.788679 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:32:15.788685 kernel: ACPI: FACS 0x000000007CFE0000 000040
Aug 13 00:32:15.788690 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:32:15.788696 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:32:15.788747 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:32:15.788754 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:32:15.788759 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Aug 13 00:32:15.788767 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Aug 13 00:32:15.788775 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Aug 13 00:32:15.788781 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Aug 13 00:32:15.788786 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Aug 13 00:32:15.788792 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Aug 13 00:32:15.788798 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Aug 13 00:32:15.788804 kernel: No NUMA configuration found
Aug 13 00:32:15.788810 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Aug 13 00:32:15.788816 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Aug 13 00:32:15.788821 kernel: Zone ranges:
Aug 13 00:32:15.788827 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Aug 13 00:32:15.788832 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Aug 13 00:32:15.788838 kernel: Normal empty
Aug 13 00:32:15.788843 kernel: Device empty
Aug 13 00:32:15.788849 kernel: Movable zone start for each node
Aug 13 00:32:15.788856 kernel: Early memory node ranges
Aug 13 00:32:15.788861 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Aug 13 00:32:15.788867 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Aug 13 00:32:15.788873 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Aug 13 00:32:15.788878 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Aug 13 00:32:15.788884 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Aug 13 00:32:15.788889 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Aug 13 00:32:15.788895 kernel: ACPI: PM-Timer IO Port: 0x608
Aug 13 00:32:15.788901 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Aug 13 00:32:15.788907 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Aug 13 00:32:15.788913 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Aug 13 00:32:15.788919 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Aug 13 00:32:15.788924 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Aug 13 00:32:15.788930 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Aug 13 00:32:15.788940 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Aug 13 00:32:15.788951 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Aug 13 00:32:15.788961 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Aug 13 00:32:15.788973 kernel: CPU topo: Max. logical packages: 1
Aug 13 00:32:15.788987 kernel: CPU topo: Max. logical dies: 1
Aug 13 00:32:15.788996 kernel: CPU topo: Max. dies per package: 1
Aug 13 00:32:15.789002 kernel: CPU topo: Max. threads per core: 1
Aug 13 00:32:15.789007 kernel: CPU topo: Num. cores per package: 2
Aug 13 00:32:15.789013 kernel: CPU topo: Num. threads per package: 2
Aug 13 00:32:15.789019 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Aug 13 00:32:15.789024 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Aug 13 00:32:15.789030 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Aug 13 00:32:15.789035 kernel: Booting paravirtualized kernel on KVM
Aug 13 00:32:15.789041 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Aug 13 00:32:15.789048 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Aug 13 00:32:15.789054 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Aug 13 00:32:15.789060 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Aug 13 00:32:15.789065 kernel: pcpu-alloc: [0] 0 1
Aug 13 00:32:15.789071 kernel: kvm-guest: PV spinlocks disabled, no host support
Aug 13 00:32:15.789078 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 00:32:15.789084 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 13 00:32:15.789089 kernel: random: crng init done
Aug 13 00:32:15.789096 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 13 00:32:15.789102 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Aug 13 00:32:15.789107 kernel: Fallback order for Node 0: 0
Aug 13 00:32:15.789113 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Aug 13 00:32:15.789119 kernel: Policy zone: DMA32
Aug 13 00:32:15.789125 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 13 00:32:15.789130 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 13 00:32:15.789136 kernel: ftrace: allocating 40098 entries in 157 pages
Aug 13 00:32:15.789141 kernel: ftrace: allocated 157 pages with 5 groups
Aug 13 00:32:15.789148 kernel: Dynamic Preempt: voluntary
Aug 13 00:32:15.789154 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 13 00:32:15.789160 kernel: rcu: RCU event tracing is enabled.
Aug 13 00:32:15.789166 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 13 00:32:15.789172 kernel: Trampoline variant of Tasks RCU enabled.
Aug 13 00:32:15.789178 kernel: Rude variant of Tasks RCU enabled.
Aug 13 00:32:15.789183 kernel: Tracing variant of Tasks RCU enabled.
Aug 13 00:32:15.789189 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 13 00:32:15.789194 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 13 00:32:15.789200 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 13 00:32:15.789207 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 13 00:32:15.789212 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 13 00:32:15.789218 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Aug 13 00:32:15.789224 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
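The hash-table lines above (Dentry cache, Inode-cache, and so on) all follow the same pattern: the reported byte count is 2^order pages of 4 KiB, and dividing it by the entry count gives the per-bucket size. A quick arithmetic check against the dentry-cache line (the 8-bytes-per-bucket figure is inferred from the reported numbers, not stated in the log):

```python
# Dentry cache line: 262144 entries (order: 9, 2097152 bytes)
# order 9 means 2^9 pages; with 4 KiB pages that is 2 MiB total.
PAGE_SIZE = 4096
entries, order, reported_bytes = 262144, 9, 2097152

table_bytes = (2 ** order) * PAGE_SIZE
assert table_bytes == reported_bytes       # 2 MiB, matching the log
assert reported_bytes // entries == 8      # implies 8 bytes per hash bucket
```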
Aug 13 00:32:15.789229 kernel: Console: colour VGA+ 80x25
Aug 13 00:32:15.789235 kernel: printk: legacy console [tty0] enabled
Aug 13 00:32:15.789241 kernel: printk: legacy console [ttyS0] enabled
Aug 13 00:32:15.789246 kernel: ACPI: Core revision 20240827
Aug 13 00:32:15.789252 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Aug 13 00:32:15.789262 kernel: APIC: Switch to symmetric I/O mode setup
Aug 13 00:32:15.789268 kernel: x2apic enabled
Aug 13 00:32:15.789274 kernel: APIC: Switched APIC routing to: physical x2apic
Aug 13 00:32:15.789281 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Aug 13 00:32:15.789287 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Aug 13 00:32:15.789293 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Aug 13 00:32:15.789299 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Aug 13 00:32:15.789305 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Aug 13 00:32:15.789311 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Aug 13 00:32:15.789318 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Aug 13 00:32:15.789324 kernel: Spectre V2 : Mitigation: Retpolines
Aug 13 00:32:15.789330 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Aug 13 00:32:15.789336 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Aug 13 00:32:15.789342 kernel: RETBleed: Mitigation: untrained return thunk
Aug 13 00:32:15.789348 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Aug 13 00:32:15.789354 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Aug 13 00:32:15.789361 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Aug 13 00:32:15.789366 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Aug 13 00:32:15.789372 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Aug 13 00:32:15.789378 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Aug 13 00:32:15.789384 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Aug 13 00:32:15.789390 kernel: Freeing SMP alternatives memory: 32K
Aug 13 00:32:15.789396 kernel: pid_max: default: 32768 minimum: 301
Aug 13 00:32:15.789873 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 13 00:32:15.789885 kernel: landlock: Up and running.
Aug 13 00:32:15.789895 kernel: SELinux: Initializing.
Aug 13 00:32:15.789901 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 13 00:32:15.789907 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 13 00:32:15.789913 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Aug 13 00:32:15.789919 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Aug 13 00:32:15.789925 kernel: ... version: 0
Aug 13 00:32:15.789931 kernel: ... bit width: 48
Aug 13 00:32:15.789937 kernel: ... generic registers: 6
Aug 13 00:32:15.789943 kernel: ... value mask: 0000ffffffffffff
Aug 13 00:32:15.789950 kernel: ... max period: 00007fffffffffff
Aug 13 00:32:15.789956 kernel: ... fixed-purpose events: 0
Aug 13 00:32:15.789962 kernel: ... event mask: 000000000000003f
Aug 13 00:32:15.789967 kernel: signal: max sigframe size: 1776
Aug 13 00:32:15.789973 kernel: rcu: Hierarchical SRCU implementation.
Aug 13 00:32:15.789980 kernel: rcu: Max phase no-delay instances is 400.
Aug 13 00:32:15.789986 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 13 00:32:15.789992 kernel: smp: Bringing up secondary CPUs ...
Aug 13 00:32:15.789998 kernel: smpboot: x86: Booting SMP configuration:
Aug 13 00:32:15.790005 kernel: .... node #0, CPUs: #1
Aug 13 00:32:15.790011 kernel: smp: Brought up 1 node, 2 CPUs
Aug 13 00:32:15.790017 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Aug 13 00:32:15.790023 kernel: Memory: 1917776K/2047464K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54444K init, 2524K bss, 125144K reserved, 0K cma-reserved)
Aug 13 00:32:15.790032 kernel: devtmpfs: initialized
Aug 13 00:32:15.790042 kernel: x86/mm: Memory block size: 128MB
Aug 13 00:32:15.790054 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 13 00:32:15.790066 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 13 00:32:15.790078 kernel: pinctrl core: initialized pinctrl subsystem
Aug 13 00:32:15.790092 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 13 00:32:15.790104 kernel: audit: initializing netlink subsys (disabled)
Aug 13 00:32:15.790115 kernel: audit: type=2000 audit(1755045133.102:1): state=initialized audit_enabled=0 res=1
Aug 13 00:32:15.790126 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 13 00:32:15.790134 kernel: thermal_sys: Registered thermal governor 'user_space'
Aug 13 00:32:15.790140 kernel: cpuidle: using governor menu
Aug 13 00:32:15.790146 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 13 00:32:15.790152 kernel: dca service started, version 1.12.1
Aug 13 00:32:15.790158 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Aug 13 00:32:15.790166 kernel: PCI: Using configuration type 1 for base access
Aug 13 00:32:15.790172 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Aug 13 00:32:15.790178 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 13 00:32:15.790184 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Aug 13 00:32:15.790189 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 13 00:32:15.790195 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Aug 13 00:32:15.790201 kernel: ACPI: Added _OSI(Module Device)
Aug 13 00:32:15.790207 kernel: ACPI: Added _OSI(Processor Device)
Aug 13 00:32:15.790213 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 13 00:32:15.790220 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Aug 13 00:32:15.790226 kernel: ACPI: Interpreter enabled
Aug 13 00:32:15.790232 kernel: ACPI: PM: (supports S0 S5)
Aug 13 00:32:15.790238 kernel: ACPI: Using IOAPIC for interrupt routing
Aug 13 00:32:15.790244 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Aug 13 00:32:15.790250 kernel: PCI: Using E820 reservations for host bridge windows
Aug 13 00:32:15.790267 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Aug 13 00:32:15.790273 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Aug 13 00:32:15.790380 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Aug 13 00:32:15.790455 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Aug 13 00:32:15.790538 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Aug 13 00:32:15.790547 kernel: PCI host bridge to bus 0000:00
Aug 13 00:32:15.792517 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Aug 13 00:32:15.792583 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Aug 13 00:32:15.792649 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Aug 13 00:32:15.792750 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Aug 13 00:32:15.792809 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Aug 13 00:32:15.792860 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Aug 13 00:32:15.792910 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Aug 13 00:32:15.792980 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Aug 13 00:32:15.793055 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Aug 13 00:32:15.793121 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Aug 13 00:32:15.793180 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Aug 13 00:32:15.793238 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Aug 13 00:32:15.793295 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Aug 13 00:32:15.793353 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Aug 13 00:32:15.793484 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.793554 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Aug 13 00:32:15.793617 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 13 00:32:15.793693 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Aug 13 00:32:15.793812 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Aug 13 00:32:15.793882 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.793943 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Aug 13 00:32:15.794001 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 13 00:32:15.794058 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Aug 13 00:32:15.794119 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Aug 13 00:32:15.794182 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.794241 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Aug 13 00:32:15.794298 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 13 00:32:15.794356 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Aug 13 00:32:15.794412 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Aug 13 00:32:15.794477 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.794584 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Aug 13 00:32:15.794691 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 13 00:32:15.794792 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Aug 13 00:32:15.794851 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Aug 13 00:32:15.794916 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.794974 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Aug 13 00:32:15.795031 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 13 00:32:15.795141 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Aug 13 00:32:15.795207 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Aug 13 00:32:15.795272 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.795330 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Aug 13 00:32:15.795387 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 13 00:32:15.795443 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Aug 13 00:32:15.795499 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Aug 13 00:32:15.795562 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.795636 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Aug 13 00:32:15.797845 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 13 00:32:15.797973 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Aug 13 00:32:15.798043 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Aug 13 00:32:15.798110 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.798170 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Aug 13 00:32:15.798232 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 13 00:32:15.798289 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Aug 13 00:32:15.798345 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Aug 13 00:32:15.798409 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:32:15.798467 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Aug 13 00:32:15.798523 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 13 00:32:15.798579 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Aug 13 00:32:15.798659 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Aug 13 00:32:15.798752 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Aug 13 00:32:15.798815 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Aug 13 00:32:15.798879 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Aug 13 00:32:15.798936 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Aug 13 00:32:15.798992 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Aug 13 00:32:15.799104 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Aug 13 00:32:15.799188 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Aug 13 00:32:15.799256 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Aug 13 00:32:15.799318 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Aug 13 00:32:15.799378 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Aug 13 00:32:15.799436 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Aug 13 00:32:15.799492 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 13 00:32:15.799561 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Aug 13 00:32:15.799620 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Aug 13 00:32:15.799746 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 13 00:32:15.802976 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Aug 13 00:32:15.803045 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Aug 13 00:32:15.803107 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Aug 13 00:32:15.803167 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 13 00:32:15.803240 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Aug 13 00:32:15.803303 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Aug 13 00:32:15.803362 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 13 00:32:15.803429 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Aug 13 00:32:15.803535 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Aug 13 00:32:15.803609 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 13 00:32:15.803716 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Aug 13 00:32:15.803785 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Aug 13 00:32:15.803847 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Aug 13 00:32:15.803906 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 13 00:32:15.803915 kernel: acpiphp: Slot [0] registered
Aug 13 00:32:15.803985 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Aug 13 00:32:15.804048 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Aug 13 00:32:15.804113 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Aug 13 00:32:15.804173 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Aug 13 00:32:15.804230 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 13 00:32:15.804239 kernel: acpiphp: Slot [0-2] registered
Aug 13 00:32:15.804296 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 13 00:32:15.804304 kernel: acpiphp: Slot [0-3] registered
Aug 13 00:32:15.804360 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 13 00:32:15.804369 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Aug 13 00:32:15.804377 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Aug 13 00:32:15.804383 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Aug 13 00:32:15.804389 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Aug 13 00:32:15.804395 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Aug 13 00:32:15.804401 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Aug 13 00:32:15.804407 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Aug 13 00:32:15.804413 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Aug 13 00:32:15.804419 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Aug 13 00:32:15.804425 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Aug 13 00:32:15.804432 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Aug 13 00:32:15.804438 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Aug 13 00:32:15.804444 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Aug 13 00:32:15.804450 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Aug 13 00:32:15.804456 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Aug 13 00:32:15.804462 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Aug 13 00:32:15.804468 kernel: iommu: Default domain type: Translated
Aug 13 00:32:15.804474 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Aug 13 00:32:15.804480 kernel: PCI: Using ACPI for IRQ routing
Aug 13 00:32:15.804487 kernel: PCI: pci_cache_line_size set to 64 bytes
Aug 13 00:32:15.804493 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Aug 13 00:32:15.804499 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Aug 13 00:32:15.804557 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Aug 13 00:32:15.804653 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Aug 13 00:32:15.805778 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Aug 13 00:32:15.805798 kernel: vgaarb: loaded
Aug 13 00:32:15.805811 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Aug 13 00:32:15.805828 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Aug 13 00:32:15.805840 kernel: clocksource: Switched to clocksource kvm-clock
Aug 13 00:32:15.805851 kernel: VFS: Disk quotas dquot_6.6.0
Aug 13 00:32:15.805863 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 13 00:32:15.805870 kernel: pnp: PnP ACPI init
Aug 13 00:32:15.805948 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Aug 13 00:32:15.805959 kernel: pnp: PnP ACPI: found 5 devices
Aug 13 00:32:15.805965 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Aug 13 00:32:15.805972 kernel: NET: Registered PF_INET protocol family
Aug 13 00:32:15.805980 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Aug 13 00:32:15.805986 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Aug 13 00:32:15.805992 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 13 00:32:15.805998 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Aug 13 00:32:15.806004 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Aug 13 00:32:15.806010 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Aug 13 00:32:15.806016 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 13 00:32:15.806023 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 13 00:32:15.806030 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 13 00:32:15.806036 kernel: NET: Registered PF_XDP protocol family
Aug 13 00:32:15.806098 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Aug 13 00:32:15.806159 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Aug 13 00:32:15.806216 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Aug 13 00:32:15.806274 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Aug 13 00:32:15.806331 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Aug 13 00:32:15.806387 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Aug 13 00:32:15.806447 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 13 00:32:15.806506 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Aug 13 00:32:15.806574 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Aug 13 00:32:15.806648 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 13 00:32:15.806729 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Aug 13 00:32:15.806792 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Aug 13 00:32:15.806851 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 13 00:32:15.806908 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Aug 13 00:32:15.807007 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Aug 13 00:32:15.807072 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 13 00:32:15.807134 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Aug 13 00:32:15.807192 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Aug 13 00:32:15.807248 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 13 00:32:15.807304 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Aug 13 00:32:15.807360 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Aug 13 00:32:15.807421 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 13 00:32:15.807477 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Aug 13 00:32:15.807536 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Aug 13 00:32:15.807592 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 13 00:32:15.807668 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Aug 13 00:32:15.809282 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Aug 13 00:32:15.809371 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Aug 13 00:32:15.809439 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 13 00:32:15.809499 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Aug 13 00:32:15.809557 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Aug 13 00:32:15.809614 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Aug 13 00:32:15.809687 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 13 00:32:15.809765 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Aug 13 00:32:15.809826 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Aug 13 00:32:15.809884 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Aug 13 00:32:15.809943 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Aug 13 00:32:15.809998 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Aug 13 00:32:15.810049 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Aug 13 00:32:15.810098 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Aug 13 00:32:15.810147 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Aug 13 00:32:15.810197 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Aug 13 00:32:15.810259 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Aug 13 00:32:15.810314 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Aug 13 00:32:15.810399 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Aug 13 00:32:15.810476 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Aug 13 00:32:15.810537 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Aug 13 00:32:15.810591 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Aug 13 00:32:15.810662 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Aug 13 00:32:15.810872 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Aug 13 00:32:15.810944 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Aug 13 00:32:15.811000 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Aug 13 00:32:15.811063 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Aug 13 00:32:15.811121 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Aug 13 00:32:15.811181 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Aug 13 00:32:15.811234 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Aug 13 00:32:15.811287 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Aug 13 00:32:15.811350 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Aug 13 00:32:15.811403 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Aug 13 00:32:15.811456 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Aug 13 00:32:15.811514 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Aug 13 00:32:15.811603 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Aug 13 00:32:15.811675 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Aug 13 00:32:15.811689 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Aug 13 00:32:15.811696 kernel: PCI: CLS
0 bytes, default 64 Aug 13 00:32:15.811718 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns Aug 13 00:32:15.811725 kernel: Initialise system trusted keyrings Aug 13 00:32:15.811731 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Aug 13 00:32:15.811737 kernel: Key type asymmetric registered Aug 13 00:32:15.811743 kernel: Asymmetric key parser 'x509' registered Aug 13 00:32:15.811750 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 13 00:32:15.811756 kernel: io scheduler mq-deadline registered Aug 13 00:32:15.811764 kernel: io scheduler kyber registered Aug 13 00:32:15.811770 kernel: io scheduler bfq registered Aug 13 00:32:15.811838 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Aug 13 00:32:15.811898 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Aug 13 00:32:15.811957 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Aug 13 00:32:15.812014 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Aug 13 00:32:15.812072 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Aug 13 00:32:15.812129 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Aug 13 00:32:15.812186 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Aug 13 00:32:15.812247 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Aug 13 00:32:15.812304 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Aug 13 00:32:15.812362 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Aug 13 00:32:15.812420 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Aug 13 00:32:15.812478 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Aug 13 00:32:15.812535 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Aug 13 00:32:15.812593 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Aug 13 00:32:15.812709 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Aug 13 00:32:15.812785 kernel: pcieport 0000:00:02.7: AER: enabled with 
IRQ 31 Aug 13 00:32:15.812795 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Aug 13 00:32:15.812851 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Aug 13 00:32:15.812908 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Aug 13 00:32:15.812917 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Aug 13 00:32:15.812923 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Aug 13 00:32:15.812933 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 13 00:32:15.812940 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Aug 13 00:32:15.812946 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Aug 13 00:32:15.812952 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Aug 13 00:32:15.812958 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Aug 13 00:32:15.813021 kernel: rtc_cmos 00:03: RTC can wake from S4 Aug 13 00:32:15.813032 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 Aug 13 00:32:15.813083 kernel: rtc_cmos 00:03: registered as rtc0 Aug 13 00:32:15.813138 kernel: rtc_cmos 00:03: setting system clock to 2025-08-13T00:32:15 UTC (1755045135) Aug 13 00:32:15.813191 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Aug 13 00:32:15.813199 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Aug 13 00:32:15.813206 kernel: NET: Registered PF_INET6 protocol family Aug 13 00:32:15.813212 kernel: Segment Routing with IPv6 Aug 13 00:32:15.813218 kernel: In-situ OAM (IOAM) with IPv6 Aug 13 00:32:15.813225 kernel: NET: Registered PF_PACKET protocol family Aug 13 00:32:15.813231 kernel: Key type dns_resolver registered Aug 13 00:32:15.813239 kernel: IPI shorthand broadcast: enabled Aug 13 00:32:15.813245 kernel: sched_clock: Marking stable (3012006741, 154814609)->(3189334838, -22513488) Aug 13 00:32:15.813251 kernel: registered taskstats version 1 Aug 13 00:32:15.813257 kernel: Loading compiled-in X.509 
certificates Aug 13 00:32:15.813264 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.40-flatcar: dee0b464d3f7f8d09744a2392f69dde258bc95c0' Aug 13 00:32:15.813270 kernel: Demotion targets for Node 0: null Aug 13 00:32:15.813276 kernel: Key type .fscrypt registered Aug 13 00:32:15.813282 kernel: Key type fscrypt-provisioning registered Aug 13 00:32:15.813289 kernel: ima: No TPM chip found, activating TPM-bypass! Aug 13 00:32:15.813296 kernel: ima: Allocated hash algorithm: sha1 Aug 13 00:32:15.813302 kernel: ima: No architecture policies found Aug 13 00:32:15.813309 kernel: clk: Disabling unused clocks Aug 13 00:32:15.813315 kernel: Warning: unable to open an initial console. Aug 13 00:32:15.813321 kernel: Freeing unused kernel image (initmem) memory: 54444K Aug 13 00:32:15.813327 kernel: Write protecting the kernel read-only data: 24576k Aug 13 00:32:15.813334 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Aug 13 00:32:15.813340 kernel: Run /init as init process Aug 13 00:32:15.813346 kernel: with arguments: Aug 13 00:32:15.813354 kernel: /init Aug 13 00:32:15.813360 kernel: with environment: Aug 13 00:32:15.813366 kernel: HOME=/ Aug 13 00:32:15.813372 kernel: TERM=linux Aug 13 00:32:15.813378 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 13 00:32:15.813385 systemd[1]: Successfully made /usr/ read-only. Aug 13 00:32:15.813395 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 13 00:32:15.813402 systemd[1]: Detected virtualization kvm. Aug 13 00:32:15.813410 systemd[1]: Detected architecture x86-64. Aug 13 00:32:15.813416 systemd[1]: Running in initrd. 
Aug 13 00:32:15.813423 systemd[1]: No hostname configured, using default hostname. Aug 13 00:32:15.813429 systemd[1]: Hostname set to <localhost>. Aug 13 00:32:15.813436 systemd[1]: Initializing machine ID from VM UUID. Aug 13 00:32:15.813442 systemd[1]: Queued start job for default target initrd.target. Aug 13 00:32:15.813449 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 13 00:32:15.813456 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 13 00:32:15.813464 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Aug 13 00:32:15.813471 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 13 00:32:15.813477 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Aug 13 00:32:15.813484 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Aug 13 00:32:15.813492 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Aug 13 00:32:15.813498 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Aug 13 00:32:15.813505 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 13 00:32:15.813513 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 13 00:32:15.813520 systemd[1]: Reached target paths.target - Path Units. Aug 13 00:32:15.813526 systemd[1]: Reached target slices.target - Slice Units. Aug 13 00:32:15.813533 systemd[1]: Reached target swap.target - Swaps. Aug 13 00:32:15.813539 systemd[1]: Reached target timers.target - Timer Units. Aug 13 00:32:15.813546 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Aug 13 00:32:15.813552 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 13 00:32:15.813559 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Aug 13 00:32:15.813567 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Aug 13 00:32:15.813573 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 13 00:32:15.813581 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 13 00:32:15.813587 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 13 00:32:15.813594 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 00:32:15.813601 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Aug 13 00:32:15.813607 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 13 00:32:15.813614 systemd[1]: Finished network-cleanup.service - Network Cleanup. Aug 13 00:32:15.813621 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Aug 13 00:32:15.813640 systemd[1]: Starting systemd-fsck-usr.service... Aug 13 00:32:15.813646 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 13 00:32:15.813653 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 13 00:32:15.813659 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:32:15.813666 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Aug 13 00:32:15.813687 systemd-journald[216]: Collecting audit messages is disabled. Aug 13 00:32:15.813719 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 13 00:32:15.813726 systemd[1]: Finished systemd-fsck-usr.service. 
Aug 13 00:32:15.813735 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Aug 13 00:32:15.813743 systemd-journald[216]: Journal started Aug 13 00:32:15.813759 systemd-journald[216]: Runtime Journal (/run/log/journal/67731b15ba2a49d7a35789134bd89b71) is 4.8M, max 38.6M, 33.7M free. Aug 13 00:32:15.790795 systemd-modules-load[218]: Inserted module 'overlay' Aug 13 00:32:15.855900 systemd[1]: Started systemd-journald.service - Journal Service. Aug 13 00:32:15.855922 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Aug 13 00:32:15.855939 kernel: Bridge firewalling registered Aug 13 00:32:15.824771 systemd-modules-load[218]: Inserted module 'br_netfilter' Aug 13 00:32:15.855991 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 13 00:32:15.857060 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:32:15.858000 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 13 00:32:15.860148 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 13 00:32:15.862776 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 13 00:32:15.871125 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 13 00:32:15.878058 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 13 00:32:15.879035 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 13 00:32:15.885889 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 13 00:32:15.886462 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Aug 13 00:32:15.887277 systemd-tmpfiles[234]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Aug 13 00:32:15.890785 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 13 00:32:15.891327 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 00:32:15.894086 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 13 00:32:15.907526 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21 Aug 13 00:32:15.923529 systemd-resolved[254]: Positive Trust Anchors: Aug 13 00:32:15.923539 systemd-resolved[254]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 00:32:15.923563 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 00:32:15.928763 systemd-resolved[254]: Defaulting to hostname 'linux'. Aug 13 00:32:15.929497 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 13 00:32:15.931342 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Aug 13 00:32:15.962749 kernel: SCSI subsystem initialized Aug 13 00:32:15.969725 kernel: Loading iSCSI transport class v2.0-870. Aug 13 00:32:15.978743 kernel: iscsi: registered transport (tcp) Aug 13 00:32:15.994731 kernel: iscsi: registered transport (qla4xxx) Aug 13 00:32:15.994764 kernel: QLogic iSCSI HBA Driver Aug 13 00:32:16.007997 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 13 00:32:16.020176 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 13 00:32:16.022020 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 13 00:32:16.049422 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 13 00:32:16.051144 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Aug 13 00:32:16.089741 kernel: raid6: avx2x4 gen() 34665 MB/s Aug 13 00:32:16.106726 kernel: raid6: avx2x2 gen() 34576 MB/s Aug 13 00:32:16.123864 kernel: raid6: avx2x1 gen() 24153 MB/s Aug 13 00:32:16.123912 kernel: raid6: using algorithm avx2x4 gen() 34665 MB/s Aug 13 00:32:16.141926 kernel: raid6: .... xor() 4979 MB/s, rmw enabled Aug 13 00:32:16.141964 kernel: raid6: using avx2x2 recovery algorithm Aug 13 00:32:16.159748 kernel: xor: automatically using best checksumming function avx Aug 13 00:32:16.301756 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 13 00:32:16.309490 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 13 00:32:16.312085 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 13 00:32:16.346845 systemd-udevd[465]: Using default interface naming scheme 'v255'. Aug 13 00:32:16.352691 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 00:32:16.358766 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Aug 13 00:32:16.404961 dracut-pre-trigger[472]: rd.md=0: removing MD RAID activation Aug 13 00:32:16.437652 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Aug 13 00:32:16.441382 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 13 00:32:16.516180 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 13 00:32:16.520749 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 13 00:32:16.574746 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Aug 13 00:32:16.581955 kernel: scsi host0: Virtio SCSI HBA Aug 13 00:32:16.587793 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Aug 13 00:32:16.601741 kernel: cryptd: max_cpu_qlen set to 1000 Aug 13 00:32:16.639724 kernel: AES CTR mode by8 optimization enabled Aug 13 00:32:16.667831 kernel: sd 0:0:0:0: Power-on or device reset occurred Aug 13 00:32:16.670673 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Aug 13 00:32:16.670844 kernel: sd 0:0:0:0: [sda] Write Protect is off Aug 13 00:32:16.670933 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Aug 13 00:32:16.669963 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 13 00:32:16.670075 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:32:16.675584 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:32:16.678146 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Aug 13 00:32:16.678263 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Aug 13 00:32:16.681279 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Aug 13 00:32:16.686282 kernel: ACPI: bus type USB registered Aug 13 00:32:16.686366 kernel: usbcore: registered new interface driver usbfs Aug 13 00:32:16.690985 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 13 00:32:16.691011 kernel: usbcore: registered new interface driver hub Aug 13 00:32:16.691020 kernel: GPT:17805311 != 80003071 Aug 13 00:32:16.691028 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 13 00:32:16.691035 kernel: GPT:17805311 != 80003071 Aug 13 00:32:16.691043 kernel: GPT: Use GNU Parted to correct GPT errors. Aug 13 00:32:16.691050 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:32:16.691061 kernel: libata version 3.00 loaded. Aug 13 00:32:16.692298 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Aug 13 00:32:16.692418 kernel: usbcore: registered new device driver usb Aug 13 00:32:16.709513 kernel: ahci 0000:00:1f.2: version 3.0 Aug 13 00:32:16.709745 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Aug 13 00:32:16.714737 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Aug 13 00:32:16.714863 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Aug 13 00:32:16.714946 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Aug 13 00:32:16.720736 kernel: scsi host1: ahci Aug 13 00:32:16.720885 kernel: scsi host2: ahci Aug 13 00:32:16.720963 kernel: scsi host3: ahci Aug 13 00:32:16.723795 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 13 00:32:16.723934 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Aug 13 00:32:16.724018 kernel: scsi host4: ahci Aug 13 00:32:16.724095 kernel: scsi host5: ahci Aug 13 00:32:16.724202 kernel: scsi host6: ahci Aug 13 00:32:16.724329 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46 lpm-pol 0 Aug 13 00:32:16.724344 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46 lpm-pol 0 Aug 13 00:32:16.724358 kernel: ata3: SATA max UDMA/133 abar 
m4096@0xfea1a000 port 0xfea1a200 irq 46 lpm-pol 0 Aug 13 00:32:16.724371 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46 lpm-pol 0 Aug 13 00:32:16.724384 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46 lpm-pol 0 Aug 13 00:32:16.724397 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46 lpm-pol 0 Aug 13 00:32:16.724413 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Aug 13 00:32:16.724538 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 13 00:32:16.724678 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Aug 13 00:32:16.724783 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Aug 13 00:32:16.725728 kernel: hub 1-0:1.0: USB hub found Aug 13 00:32:16.725861 kernel: hub 1-0:1.0: 4 ports detected Aug 13 00:32:16.727720 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Aug 13 00:32:16.727872 kernel: hub 2-0:1.0: USB hub found Aug 13 00:32:16.727962 kernel: hub 2-0:1.0: 4 ports detected Aug 13 00:32:16.768534 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Aug 13 00:32:16.803584 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:32:16.824450 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Aug 13 00:32:16.830487 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Aug 13 00:32:16.831142 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Aug 13 00:32:16.839090 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Aug 13 00:32:16.840731 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Aug 13 00:32:16.860001 disk-uuid[626]: Primary Header is updated. 
Aug 13 00:32:16.860001 disk-uuid[626]: Secondary Entries is updated. Aug 13 00:32:16.860001 disk-uuid[626]: Secondary Header is updated. Aug 13 00:32:16.874753 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:32:16.970265 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Aug 13 00:32:17.032328 kernel: ata3: SATA link down (SStatus 0 SControl 300) Aug 13 00:32:17.032397 kernel: ata5: SATA link down (SStatus 0 SControl 300) Aug 13 00:32:17.032412 kernel: ata4: SATA link down (SStatus 0 SControl 300) Aug 13 00:32:17.035556 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Aug 13 00:32:17.038519 kernel: ata6: SATA link down (SStatus 0 SControl 300) Aug 13 00:32:17.038546 kernel: ata2: SATA link down (SStatus 0 SControl 300) Aug 13 00:32:17.038566 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Aug 13 00:32:17.039861 kernel: ata1.00: applying bridge limits Aug 13 00:32:17.040981 kernel: ata1.00: configured for UDMA/100 Aug 13 00:32:17.044749 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Aug 13 00:32:17.099655 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Aug 13 00:32:17.099991 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Aug 13 00:32:17.109736 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Aug 13 00:32:17.121746 kernel: hid: raw HID events driver (C) Jiri Kosina Aug 13 00:32:17.130072 kernel: usbcore: registered new interface driver usbhid Aug 13 00:32:17.130118 kernel: usbhid: USB HID core driver Aug 13 00:32:17.140243 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4 Aug 13 00:32:17.140290 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Aug 13 00:32:17.419657 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Aug 13 00:32:17.421411 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Aug 13 00:32:17.422220 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 13 00:32:17.423692 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 13 00:32:17.426176 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 13 00:32:17.447460 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 13 00:32:17.893802 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:32:17.893877 disk-uuid[627]: The operation has completed successfully. Aug 13 00:32:17.952502 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 13 00:32:17.952601 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 13 00:32:17.976924 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 13 00:32:17.995341 sh[660]: Success Aug 13 00:32:18.015753 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 13 00:32:18.015816 kernel: device-mapper: uevent: version 1.0.3 Aug 13 00:32:18.019732 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 13 00:32:18.031819 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Aug 13 00:32:18.085439 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 13 00:32:18.087800 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 13 00:32:18.105998 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Aug 13 00:32:18.119088 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Aug 13 00:32:18.119140 kernel: BTRFS: device fsid 0c0338fb-9434-41c1-99a2-737cbe2351c4 devid 1 transid 44 /dev/mapper/usr (254:0) scanned by mount (672) Aug 13 00:32:18.124862 kernel: BTRFS info (device dm-0): first mount of filesystem 0c0338fb-9434-41c1-99a2-737cbe2351c4 Aug 13 00:32:18.124901 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:32:18.127666 kernel: BTRFS info (device dm-0): using free-space-tree Aug 13 00:32:18.138656 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Aug 13 00:32:18.139743 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Aug 13 00:32:18.140776 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 13 00:32:18.141585 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 13 00:32:18.144938 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Aug 13 00:32:18.178495 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (709) Aug 13 00:32:18.178557 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:32:18.182463 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:32:18.182505 kernel: BTRFS info (device sda6): using free-space-tree Aug 13 00:32:18.192734 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:32:18.193143 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 13 00:32:18.196837 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Aug 13 00:32:18.229792 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Aug 13 00:32:18.233824 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 13 00:32:18.282656 systemd-networkd[842]: lo: Link UP Aug 13 00:32:18.283279 systemd-networkd[842]: lo: Gained carrier Aug 13 00:32:18.285673 systemd-networkd[842]: Enumeration completed Aug 13 00:32:18.286371 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 00:32:18.288813 systemd[1]: Reached target network.target - Network. Aug 13 00:32:18.288939 systemd-networkd[842]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:32:18.288942 systemd-networkd[842]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:32:18.290919 systemd-networkd[842]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:32:18.290923 systemd-networkd[842]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:32:18.291572 systemd-networkd[842]: eth0: Link UP Aug 13 00:32:18.293736 systemd-networkd[842]: eth0: Gained carrier Aug 13 00:32:18.293745 systemd-networkd[842]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Aug 13 00:32:18.295097 ignition[790]: Ignition 2.21.0 Aug 13 00:32:18.295103 ignition[790]: Stage: fetch-offline Aug 13 00:32:18.295128 ignition[790]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:32:18.295134 ignition[790]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:32:18.296874 systemd-networkd[842]: eth1: Link UP Aug 13 00:32:18.295204 ignition[790]: parsed url from cmdline: "" Aug 13 00:32:18.297560 systemd-networkd[842]: eth1: Gained carrier Aug 13 00:32:18.295206 ignition[790]: no config URL provided Aug 13 00:32:18.297569 systemd-networkd[842]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:32:18.295212 ignition[790]: reading system config file "/usr/lib/ignition/user.ign" Aug 13 00:32:18.297734 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Aug 13 00:32:18.295217 ignition[790]: no config at "/usr/lib/ignition/user.ign" Aug 13 00:32:18.299799 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Aug 13 00:32:18.295221 ignition[790]: failed to fetch config: resource requires networking Aug 13 00:32:18.295352 ignition[790]: Ignition finished successfully Aug 13 00:32:18.318375 ignition[851]: Ignition 2.21.0 Aug 13 00:32:18.318387 ignition[851]: Stage: fetch Aug 13 00:32:18.318486 ignition[851]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:32:18.318493 ignition[851]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:32:18.318553 ignition[851]: parsed url from cmdline: "" Aug 13 00:32:18.318555 ignition[851]: no config URL provided Aug 13 00:32:18.318559 ignition[851]: reading system config file "/usr/lib/ignition/user.ign" Aug 13 00:32:18.318564 ignition[851]: no config at "/usr/lib/ignition/user.ign" Aug 13 00:32:18.318596 ignition[851]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Aug 13 00:32:18.318780 ignition[851]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Aug 13 00:32:18.334758 systemd-networkd[842]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Aug 13 00:32:18.346746 systemd-networkd[842]: eth0: DHCPv4 address 95.217.135.102/32, gateway 172.31.1.1 acquired from 172.31.1.1 Aug 13 00:32:18.519191 ignition[851]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Aug 13 00:32:18.522826 ignition[851]: GET result: OK Aug 13 00:32:18.522892 ignition[851]: parsing config with SHA512: 2173a9763cd7b85d12bc30fe4b5d0ed3dc3698007d4ea55c5063f9b5701639ec61530013fe8e118a30dbffdf90ed16ac7856308a55b0af2dce4a698d8d15a57e Aug 13 00:32:18.527216 unknown[851]: fetched base config from "system" Aug 13 00:32:18.528053 unknown[851]: fetched base config from "system" Aug 13 00:32:18.528063 unknown[851]: fetched user config from "hetzner" Aug 13 00:32:18.528661 ignition[851]: fetch: fetch complete Aug 13 00:32:18.528666 ignition[851]: fetch: fetch passed Aug 13 00:32:18.531493 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). 
Aug 13 00:32:18.528751 ignition[851]: Ignition finished successfully Aug 13 00:32:18.533813 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Aug 13 00:32:18.562372 ignition[859]: Ignition 2.21.0 Aug 13 00:32:18.562390 ignition[859]: Stage: kargs Aug 13 00:32:18.562582 ignition[859]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:32:18.562594 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:32:18.564422 ignition[859]: kargs: kargs passed Aug 13 00:32:18.564475 ignition[859]: Ignition finished successfully Aug 13 00:32:18.565844 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Aug 13 00:32:18.568193 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Aug 13 00:32:18.588534 ignition[866]: Ignition 2.21.0 Aug 13 00:32:18.588545 ignition[866]: Stage: disks Aug 13 00:32:18.588697 ignition[866]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:32:18.590479 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:32:18.592178 ignition[866]: disks: disks passed Aug 13 00:32:18.592229 ignition[866]: Ignition finished successfully Aug 13 00:32:18.593275 systemd[1]: Finished ignition-disks.service - Ignition (disks). Aug 13 00:32:18.594545 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Aug 13 00:32:18.595361 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Aug 13 00:32:18.596678 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 13 00:32:18.597972 systemd[1]: Reached target sysinit.target - System Initialization. Aug 13 00:32:18.599055 systemd[1]: Reached target basic.target - Basic System. Aug 13 00:32:18.601105 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Aug 13 00:32:18.628420 systemd-fsck[875]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Aug 13 00:32:18.630387 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Aug 13 00:32:18.632256 systemd[1]: Mounting sysroot.mount - /sysroot... Aug 13 00:32:18.746730 kernel: EXT4-fs (sda9): mounted filesystem 069caac6-7833-4acd-8940-01a7ff7d1281 r/w with ordered data mode. Quota mode: none. Aug 13 00:32:18.747144 systemd[1]: Mounted sysroot.mount - /sysroot. Aug 13 00:32:18.748315 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Aug 13 00:32:18.750605 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 13 00:32:18.753779 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Aug 13 00:32:18.766842 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Aug 13 00:32:18.768698 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Aug 13 00:32:18.768777 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Aug 13 00:32:18.773284 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Aug 13 00:32:18.781729 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (883) Aug 13 00:32:18.790222 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:32:18.790267 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:32:18.790279 kernel: BTRFS info (device sda6): using free-space-tree Aug 13 00:32:18.789883 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Aug 13 00:32:18.800732 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Aug 13 00:32:18.841788 initrd-setup-root[910]: cut: /sysroot/etc/passwd: No such file or directory Aug 13 00:32:18.846576 initrd-setup-root[917]: cut: /sysroot/etc/group: No such file or directory Aug 13 00:32:18.847859 coreos-metadata[885]: Aug 13 00:32:18.846 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Aug 13 00:32:18.847859 coreos-metadata[885]: Aug 13 00:32:18.847 INFO Fetch successful Aug 13 00:32:18.850017 coreos-metadata[885]: Aug 13 00:32:18.847 INFO wrote hostname ci-4372-1-0-4-15a6623c0c to /sysroot/etc/hostname Aug 13 00:32:18.850535 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 13 00:32:18.852777 initrd-setup-root[924]: cut: /sysroot/etc/shadow: No such file or directory Aug 13 00:32:18.856246 initrd-setup-root[932]: cut: /sysroot/etc/gshadow: No such file or directory Aug 13 00:32:18.928486 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Aug 13 00:32:18.930270 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Aug 13 00:32:18.931724 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Aug 13 00:32:18.944734 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:32:18.956720 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Aug 13 00:32:18.966406 ignition[1000]: INFO : Ignition 2.21.0 Aug 13 00:32:18.968710 ignition[1000]: INFO : Stage: mount Aug 13 00:32:18.968710 ignition[1000]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 13 00:32:18.968710 ignition[1000]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:32:18.968710 ignition[1000]: INFO : mount: mount passed Aug 13 00:32:18.971183 ignition[1000]: INFO : Ignition finished successfully Aug 13 00:32:18.970748 systemd[1]: Finished ignition-mount.service - Ignition (mount). Aug 13 00:32:18.973054 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Aug 13 00:32:19.118208 systemd[1]: sysroot-oem.mount: Deactivated successfully. Aug 13 00:32:19.120251 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 13 00:32:19.150210 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1011) Aug 13 00:32:19.150272 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:32:19.151919 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:32:19.154256 kernel: BTRFS info (device sda6): using free-space-tree Aug 13 00:32:19.159514 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 13 00:32:19.183997 ignition[1028]: INFO : Ignition 2.21.0 Aug 13 00:32:19.184873 ignition[1028]: INFO : Stage: files Aug 13 00:32:19.185371 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 13 00:32:19.185371 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:32:19.187839 ignition[1028]: DEBUG : files: compiled without relabeling support, skipping Aug 13 00:32:19.189388 ignition[1028]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Aug 13 00:32:19.189388 ignition[1028]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Aug 13 00:32:19.193284 ignition[1028]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Aug 13 00:32:19.194691 ignition[1028]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Aug 13 00:32:19.196050 unknown[1028]: wrote ssh authorized keys file for user: core Aug 13 00:32:19.196822 ignition[1028]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Aug 13 00:32:19.198904 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Aug 13 00:32:19.199974 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: 
op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Aug 13 00:32:19.380393 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Aug 13 00:32:19.534877 systemd-networkd[842]: eth1: Gained IPv6LL Aug 13 00:32:19.598859 systemd-networkd[842]: eth0: Gained IPv6LL Aug 13 00:32:20.027378 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Aug 13 00:32:20.027378 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/etc/flatcar/update.conf" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 13 00:32:20.034570 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 Aug 13 00:32:20.220327 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Aug 13 00:32:20.362982 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" Aug 13 00:32:20.362982 ignition[1028]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Aug 13 00:32:20.365286 ignition[1028]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 13 00:32:20.366760 ignition[1028]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 13 00:32:20.366760 ignition[1028]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Aug 13 00:32:20.366760 ignition[1028]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at 
"/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Aug 13 00:32:20.370004 ignition[1028]: INFO : files: files passed Aug 13 00:32:20.370004 ignition[1028]: INFO : Ignition finished successfully Aug 13 00:32:20.368481 systemd[1]: Finished ignition-files.service - Ignition (files). Aug 13 00:32:20.371839 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Aug 13 00:32:20.373915 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Aug 13 00:32:20.382789 systemd[1]: ignition-quench.service: Deactivated successfully. Aug 13 00:32:20.382858 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
Aug 13 00:32:20.387671 initrd-setup-root-after-ignition[1058]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 13 00:32:20.387671 initrd-setup-root-after-ignition[1058]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Aug 13 00:32:20.389553 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 13 00:32:20.390466 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 13 00:32:20.391334 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Aug 13 00:32:20.393265 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Aug 13 00:32:20.430094 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Aug 13 00:32:20.430193 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Aug 13 00:32:20.431570 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Aug 13 00:32:20.432896 systemd[1]: Reached target initrd.target - Initrd Default Target. Aug 13 00:32:20.434187 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Aug 13 00:32:20.434889 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Aug 13 00:32:20.471764 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 13 00:32:20.473456 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Aug 13 00:32:20.490161 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Aug 13 00:32:20.490891 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 13 00:32:20.492306 systemd[1]: Stopped target timers.target - Timer Units. Aug 13 00:32:20.493550 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. 
Aug 13 00:32:20.493746 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 13 00:32:20.495256 systemd[1]: Stopped target initrd.target - Initrd Default Target. Aug 13 00:32:20.496039 systemd[1]: Stopped target basic.target - Basic System. Aug 13 00:32:20.497463 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Aug 13 00:32:20.498657 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Aug 13 00:32:20.499938 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Aug 13 00:32:20.501309 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Aug 13 00:32:20.502749 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Aug 13 00:32:20.504035 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Aug 13 00:32:20.505542 systemd[1]: Stopped target sysinit.target - System Initialization. Aug 13 00:32:20.506945 systemd[1]: Stopped target local-fs.target - Local File Systems. Aug 13 00:32:20.508258 systemd[1]: Stopped target swap.target - Swaps. Aug 13 00:32:20.509474 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Aug 13 00:32:20.509576 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Aug 13 00:32:20.511058 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Aug 13 00:32:20.511903 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 13 00:32:20.513063 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Aug 13 00:32:20.513166 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 13 00:32:20.514422 systemd[1]: dracut-initqueue.service: Deactivated successfully. Aug 13 00:32:20.514557 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Aug 13 00:32:20.516257 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. 
Aug 13 00:32:20.516396 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 13 00:32:20.517251 systemd[1]: ignition-files.service: Deactivated successfully. Aug 13 00:32:20.517389 systemd[1]: Stopped ignition-files.service - Ignition (files). Aug 13 00:32:20.518339 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Aug 13 00:32:20.518468 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 13 00:32:20.521800 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Aug 13 00:32:20.528148 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Aug 13 00:32:20.528314 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Aug 13 00:32:20.534861 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Aug 13 00:32:20.536000 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Aug 13 00:32:20.536186 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Aug 13 00:32:20.538360 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Aug 13 00:32:20.538471 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Aug 13 00:32:20.545983 systemd[1]: initrd-cleanup.service: Deactivated successfully. Aug 13 00:32:20.548206 ignition[1082]: INFO : Ignition 2.21.0 Aug 13 00:32:20.548206 ignition[1082]: INFO : Stage: umount Aug 13 00:32:20.550863 ignition[1082]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 13 00:32:20.550863 ignition[1082]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:32:20.550863 ignition[1082]: INFO : umount: umount passed Aug 13 00:32:20.550863 ignition[1082]: INFO : Ignition finished successfully Aug 13 00:32:20.549248 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Aug 13 00:32:20.551909 systemd[1]: ignition-mount.service: Deactivated successfully. 
Aug 13 00:32:20.551996 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Aug 13 00:32:20.554181 systemd[1]: ignition-disks.service: Deactivated successfully. Aug 13 00:32:20.554247 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Aug 13 00:32:20.556306 systemd[1]: ignition-kargs.service: Deactivated successfully. Aug 13 00:32:20.556353 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Aug 13 00:32:20.557949 systemd[1]: ignition-fetch.service: Deactivated successfully. Aug 13 00:32:20.557988 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Aug 13 00:32:20.558847 systemd[1]: Stopped target network.target - Network. Aug 13 00:32:20.559476 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Aug 13 00:32:20.559518 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Aug 13 00:32:20.560272 systemd[1]: Stopped target paths.target - Path Units. Aug 13 00:32:20.562023 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Aug 13 00:32:20.567778 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 13 00:32:20.568775 systemd[1]: Stopped target slices.target - Slice Units. Aug 13 00:32:20.569919 systemd[1]: Stopped target sockets.target - Socket Units. Aug 13 00:32:20.571184 systemd[1]: iscsid.socket: Deactivated successfully. Aug 13 00:32:20.571221 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Aug 13 00:32:20.572193 systemd[1]: iscsiuio.socket: Deactivated successfully. Aug 13 00:32:20.572226 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 13 00:32:20.573169 systemd[1]: ignition-setup.service: Deactivated successfully. Aug 13 00:32:20.573219 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Aug 13 00:32:20.574145 systemd[1]: ignition-setup-pre.service: Deactivated successfully. 
Aug 13 00:32:20.574177 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Aug 13 00:32:20.575305 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Aug 13 00:32:20.576248 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Aug 13 00:32:20.578177 systemd[1]: sysroot-boot.mount: Deactivated successfully. Aug 13 00:32:20.578647 systemd[1]: sysroot-boot.service: Deactivated successfully. Aug 13 00:32:20.578786 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Aug 13 00:32:20.580060 systemd[1]: initrd-setup-root.service: Deactivated successfully. Aug 13 00:32:20.580121 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Aug 13 00:32:20.583034 systemd[1]: systemd-resolved.service: Deactivated successfully. Aug 13 00:32:20.583144 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Aug 13 00:32:20.585585 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Aug 13 00:32:20.586017 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Aug 13 00:32:20.586052 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 00:32:20.587965 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Aug 13 00:32:20.590849 systemd[1]: systemd-networkd.service: Deactivated successfully. Aug 13 00:32:20.590918 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Aug 13 00:32:20.592236 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Aug 13 00:32:20.592467 systemd[1]: Stopped target network-pre.target - Preparation for Network. Aug 13 00:32:20.593202 systemd[1]: systemd-networkd.socket: Deactivated successfully. Aug 13 00:32:20.593228 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Aug 13 00:32:20.595756 systemd[1]: Stopping network-cleanup.service - Network Cleanup... 
Aug 13 00:32:20.596302 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Aug 13 00:32:20.596338 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 13 00:32:20.597343 systemd[1]: systemd-sysctl.service: Deactivated successfully. Aug 13 00:32:20.597376 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Aug 13 00:32:20.599608 systemd[1]: systemd-modules-load.service: Deactivated successfully. Aug 13 00:32:20.599656 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Aug 13 00:32:20.600124 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 13 00:32:20.601216 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Aug 13 00:32:20.609100 systemd[1]: systemd-udevd.service: Deactivated successfully. Aug 13 00:32:20.614810 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 00:32:20.616290 systemd[1]: network-cleanup.service: Deactivated successfully. Aug 13 00:32:20.616358 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Aug 13 00:32:20.617525 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Aug 13 00:32:20.617565 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Aug 13 00:32:20.618230 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Aug 13 00:32:20.618262 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Aug 13 00:32:20.619184 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Aug 13 00:32:20.619218 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Aug 13 00:32:20.620616 systemd[1]: dracut-cmdline.service: Deactivated successfully. Aug 13 00:32:20.620660 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Aug 13 00:32:20.621661 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. 
Aug 13 00:32:20.621696 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 13 00:32:20.624785 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Aug 13 00:32:20.625835 systemd[1]: systemd-network-generator.service: Deactivated successfully. Aug 13 00:32:20.625874 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Aug 13 00:32:20.626991 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Aug 13 00:32:20.627023 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 13 00:32:20.628509 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 13 00:32:20.628541 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:32:20.635733 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Aug 13 00:32:20.635801 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Aug 13 00:32:20.637019 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Aug 13 00:32:20.638866 systemd[1]: Starting initrd-switch-root.service - Switch Root... Aug 13 00:32:20.671902 systemd[1]: Switching root. Aug 13 00:32:20.705606 systemd-journald[216]: Journal stopped Aug 13 00:32:21.522685 systemd-journald[216]: Received SIGTERM from PID 1 (systemd). 
Aug 13 00:32:21.522788 kernel: SELinux: policy capability network_peer_controls=1 Aug 13 00:32:21.522802 kernel: SELinux: policy capability open_perms=1 Aug 13 00:32:21.522811 kernel: SELinux: policy capability extended_socket_class=1 Aug 13 00:32:21.522818 kernel: SELinux: policy capability always_check_network=0 Aug 13 00:32:21.522826 kernel: SELinux: policy capability cgroup_seclabel=1 Aug 13 00:32:21.522833 kernel: SELinux: policy capability nnp_nosuid_transition=1 Aug 13 00:32:21.522847 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Aug 13 00:32:21.522855 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Aug 13 00:32:21.522862 kernel: SELinux: policy capability userspace_initial_context=0 Aug 13 00:32:21.522870 kernel: audit: type=1403 audit(1755045140.840:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Aug 13 00:32:21.522879 systemd[1]: Successfully loaded SELinux policy in 47.407ms. Aug 13 00:32:21.522897 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.032ms. Aug 13 00:32:21.522908 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 13 00:32:21.522916 systemd[1]: Detected virtualization kvm. Aug 13 00:32:21.522926 systemd[1]: Detected architecture x86-64. Aug 13 00:32:21.522934 systemd[1]: Detected first boot. Aug 13 00:32:21.522943 systemd[1]: Hostname set to . Aug 13 00:32:21.522951 systemd[1]: Initializing machine ID from VM UUID. Aug 13 00:32:21.522959 zram_generator::config[1126]: No configuration found. 
Aug 13 00:32:21.522968 kernel: Guest personality initialized and is inactive Aug 13 00:32:21.522976 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Aug 13 00:32:21.522983 kernel: Initialized host personality Aug 13 00:32:21.522991 kernel: NET: Registered PF_VSOCK protocol family Aug 13 00:32:21.523000 systemd[1]: Populated /etc with preset unit settings. Aug 13 00:32:21.523009 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 13 00:32:21.523017 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 13 00:32:21.523025 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 13 00:32:21.523033 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Aug 13 00:32:21.523042 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 13 00:32:21.523051 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Aug 13 00:32:21.523060 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 13 00:32:21.523069 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 13 00:32:21.523077 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 13 00:32:21.523085 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Aug 13 00:32:21.523095 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 13 00:32:21.523103 systemd[1]: Created slice user.slice - User and Session Slice. Aug 13 00:32:21.523111 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 13 00:32:21.523119 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 13 00:32:21.523127 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Aug 13 00:32:21.523137 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 13 00:32:21.523145 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 13 00:32:21.523153 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 13 00:32:21.523162 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Aug 13 00:32:21.523170 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:32:21.523182 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:32:21.523190 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 13 00:32:21.523198 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 13 00:32:21.523207 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:32:21.523214 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 13 00:32:21.523222 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:32:21.523234 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 13 00:32:21.523242 systemd[1]: Reached target slices.target - Slice Units.
Aug 13 00:32:21.523252 systemd[1]: Reached target swap.target - Swaps.
Aug 13 00:32:21.523260 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 13 00:32:21.523268 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 13 00:32:21.523277 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Aug 13 00:32:21.523285 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:32:21.523294 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:32:21.523302 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:32:21.523310 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 13 00:32:21.523317 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 13 00:32:21.523326 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 13 00:32:21.523334 systemd[1]: Mounting media.mount - External Media Directory...
Aug 13 00:32:21.523343 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:32:21.523351 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 13 00:32:21.523359 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 13 00:32:21.523367 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 13 00:32:21.523376 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 13 00:32:21.523384 systemd[1]: Reached target machines.target - Containers.
Aug 13 00:32:21.523392 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 13 00:32:21.523401 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:32:21.523409 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 13 00:32:21.523418 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 13 00:32:21.523426 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:32:21.523435 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:32:21.523443 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:32:21.523452 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 13 00:32:21.523460 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:32:21.523470 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 13 00:32:21.523478 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 13 00:32:21.523487 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 13 00:32:21.523498 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 13 00:32:21.523506 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 13 00:32:21.523515 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:32:21.523523 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 13 00:32:21.523531 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 13 00:32:21.523539 kernel: fuse: init (API version 7.41)
Aug 13 00:32:21.523548 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 13 00:32:21.523557 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 13 00:32:21.523565 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Aug 13 00:32:21.523573 kernel: loop: module loaded
Aug 13 00:32:21.523582 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 13 00:32:21.523591 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 13 00:32:21.523599 systemd[1]: Stopped verity-setup.service.
Aug 13 00:32:21.523607 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:32:21.523616 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 13 00:32:21.523625 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 13 00:32:21.523647 systemd[1]: Mounted media.mount - External Media Directory.
Aug 13 00:32:21.523657 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 13 00:32:21.523683 systemd-journald[1217]: Collecting audit messages is disabled.
Aug 13 00:32:21.524825 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 13 00:32:21.524842 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 13 00:32:21.524851 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 13 00:32:21.524861 systemd-journald[1217]: Journal started
Aug 13 00:32:21.524883 systemd-journald[1217]: Runtime Journal (/run/log/journal/67731b15ba2a49d7a35789134bd89b71) is 4.8M, max 38.6M, 33.7M free.
Aug 13 00:32:21.291813 systemd[1]: Queued start job for default target multi-user.target.
Aug 13 00:32:21.296859 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Aug 13 00:32:21.297189 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 13 00:32:21.531571 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 13 00:32:21.530951 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:32:21.531900 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 13 00:32:21.532749 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 13 00:32:21.533365 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:32:21.533493 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:32:21.534118 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:32:21.534222 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:32:21.538006 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 13 00:32:21.538124 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 13 00:32:21.538747 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:32:21.538861 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:32:21.539456 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:32:21.540093 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 13 00:32:21.548410 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Aug 13 00:32:21.567136 kernel: ACPI: bus type drm_connector registered
Aug 13 00:32:21.560507 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 13 00:32:21.563551 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 13 00:32:21.564338 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 13 00:32:21.564364 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:32:21.565527 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Aug 13 00:32:21.569858 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 13 00:32:21.570357 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:32:21.572217 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 13 00:32:21.574781 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 13 00:32:21.575287 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:32:21.576001 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 13 00:32:21.577798 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 00:32:21.579322 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 13 00:32:21.583088 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 13 00:32:21.586214 systemd-journald[1217]: Time spent on flushing to /var/log/journal/67731b15ba2a49d7a35789134bd89b71 is 30.786ms for 1148 entries.
Aug 13 00:32:21.586214 systemd-journald[1217]: System Journal (/var/log/journal/67731b15ba2a49d7a35789134bd89b71) is 8M, max 584.8M, 576.8M free.
Aug 13 00:32:21.624230 systemd-journald[1217]: Received client request to flush runtime journal.
Aug 13 00:32:21.624270 kernel: loop0: detected capacity change from 0 to 146240
Aug 13 00:32:21.593471 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 13 00:32:21.597506 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 00:32:21.597657 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 00:32:21.598325 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:32:21.599804 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:32:21.600926 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 13 00:32:21.602232 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 13 00:32:21.606693 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 13 00:32:21.611082 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 13 00:32:21.611873 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 13 00:32:21.615385 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Aug 13 00:32:21.628501 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 13 00:32:21.636353 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:32:21.643003 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Aug 13 00:32:21.659755 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 13 00:32:21.669673 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 13 00:32:21.672382 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 13 00:32:21.676966 kernel: loop1: detected capacity change from 0 to 113872
Aug 13 00:32:21.698766 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Aug 13 00:32:21.699097 systemd-tmpfiles[1269]: ACLs are not supported, ignoring.
Aug 13 00:32:21.706983 kernel: loop2: detected capacity change from 0 to 221472
Aug 13 00:32:21.705352 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:32:21.742734 kernel: loop3: detected capacity change from 0 to 8
Aug 13 00:32:21.755729 kernel: loop4: detected capacity change from 0 to 146240
Aug 13 00:32:21.779735 kernel: loop5: detected capacity change from 0 to 113872
Aug 13 00:32:21.797730 kernel: loop6: detected capacity change from 0 to 221472
Aug 13 00:32:21.820064 kernel: loop7: detected capacity change from 0 to 8
Aug 13 00:32:21.820532 (sd-merge)[1275]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Aug 13 00:32:21.821325 (sd-merge)[1275]: Merged extensions into '/usr'.
Aug 13 00:32:21.825613 systemd[1]: Reload requested from client PID 1251 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 13 00:32:21.825726 systemd[1]: Reloading...
Aug 13 00:32:21.893819 zram_generator::config[1301]: No configuration found.
Aug 13 00:32:21.983992 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:32:22.045182 ldconfig[1246]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 13 00:32:22.057246 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Aug 13 00:32:22.057588 systemd[1]: Reloading finished in 231 ms.
Aug 13 00:32:22.076364 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 13 00:32:22.077314 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 13 00:32:22.087792 systemd[1]: Starting ensure-sysext.service...
Aug 13 00:32:22.089126 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 13 00:32:22.105377 systemd[1]: Reload requested from client PID 1344 ('systemctl') (unit ensure-sysext.service)...
Aug 13 00:32:22.105475 systemd[1]: Reloading...
Aug 13 00:32:22.119819 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Aug 13 00:32:22.120146 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Aug 13 00:32:22.120335 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 13 00:32:22.120547 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 13 00:32:22.122235 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 13 00:32:22.123996 systemd-tmpfiles[1345]: ACLs are not supported, ignoring.
Aug 13 00:32:22.124098 systemd-tmpfiles[1345]: ACLs are not supported, ignoring.
Aug 13 00:32:22.128141 systemd-tmpfiles[1345]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:32:22.128241 systemd-tmpfiles[1345]: Skipping /boot
Aug 13 00:32:22.140472 systemd-tmpfiles[1345]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:32:22.142145 systemd-tmpfiles[1345]: Skipping /boot
Aug 13 00:32:22.168724 zram_generator::config[1369]: No configuration found.
Aug 13 00:32:22.246048 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:32:22.311775 systemd[1]: Reloading finished in 205 ms.
Aug 13 00:32:22.331906 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Aug 13 00:32:22.340323 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:32:22.347427 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Aug 13 00:32:22.350927 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Aug 13 00:32:22.360502 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Aug 13 00:32:22.364942 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 13 00:32:22.369921 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:32:22.372957 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Aug 13 00:32:22.377394 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:32:22.378005 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:32:22.381400 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:32:22.382886 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:32:22.390220 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:32:22.390864 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:32:22.391005 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:32:22.391133 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:32:22.394461 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Aug 13 00:32:22.401516 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Aug 13 00:32:22.402365 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:32:22.402730 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:32:22.407201 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Aug 13 00:32:22.415324 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:32:22.415756 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:32:22.417149 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:32:22.417478 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:32:22.419455 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:32:22.419658 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:32:22.422103 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:32:22.424839 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:32:22.425375 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:32:22.425461 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:32:22.425569 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 00:32:22.426539 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Aug 13 00:32:22.427751 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:32:22.434101 systemd[1]: Finished ensure-sysext.service.
Aug 13 00:32:22.436138 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Aug 13 00:32:22.449761 systemd-udevd[1422]: Using default interface naming scheme 'v255'.
Aug 13 00:32:22.452065 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 00:32:22.452237 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 00:32:22.452980 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:32:22.453110 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:32:22.453750 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:32:22.456688 augenrules[1458]: No rules
Aug 13 00:32:22.457105 systemd[1]: audit-rules.service: Deactivated successfully.
Aug 13 00:32:22.457288 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Aug 13 00:32:22.463040 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Aug 13 00:32:22.465880 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Aug 13 00:32:22.472893 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Aug 13 00:32:22.474278 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Aug 13 00:32:22.488039 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:32:22.492350 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 13 00:32:22.545188 systemd-resolved[1420]: Positive Trust Anchors:
Aug 13 00:32:22.545434 systemd-resolved[1420]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 13 00:32:22.545506 systemd-resolved[1420]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 13 00:32:22.550228 systemd-resolved[1420]: Using system hostname 'ci-4372-1-0-4-15a6623c0c'.
Aug 13 00:32:22.554051 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 13 00:32:22.555372 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 13 00:32:22.580463 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Aug 13 00:32:22.581067 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 13 00:32:22.582089 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Aug 13 00:32:22.582826 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Aug 13 00:32:22.583337 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Aug 13 00:32:22.583805 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Aug 13 00:32:22.584255 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Aug 13 00:32:22.584277 systemd[1]: Reached target paths.target - Path Units.
Aug 13 00:32:22.586750 systemd[1]: Reached target time-set.target - System Time Set.
Aug 13 00:32:22.587777 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Aug 13 00:32:22.589197 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Aug 13 00:32:22.590064 systemd[1]: Reached target timers.target - Timer Units.
Aug 13 00:32:22.591674 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Aug 13 00:32:22.594859 systemd[1]: Starting docker.socket - Docker Socket for the API...
Aug 13 00:32:22.599463 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Aug 13 00:32:22.601226 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Aug 13 00:32:22.602262 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Aug 13 00:32:22.618086 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Aug 13 00:32:22.620095 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Aug 13 00:32:22.621246 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Aug 13 00:32:22.640339 systemd[1]: Reached target sockets.target - Socket Units.
Aug 13 00:32:22.641556 systemd[1]: Reached target basic.target - Basic System.
Aug 13 00:32:22.642501 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Aug 13 00:32:22.642590 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Aug 13 00:32:22.644906 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Aug 13 00:32:22.647138 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Aug 13 00:32:22.648980 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Aug 13 00:32:22.652802 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Aug 13 00:32:22.654733 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Aug 13 00:32:22.655393 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Aug 13 00:32:22.658622 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Aug 13 00:32:22.660894 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Aug 13 00:32:22.666881 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Aug 13 00:32:22.671569 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Aug 13 00:32:22.678430 jq[1519]: false
Aug 13 00:32:22.678853 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Aug 13 00:32:22.684137 systemd[1]: Starting systemd-logind.service - User Login Management...
Aug 13 00:32:22.686348 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Aug 13 00:32:22.686679 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Aug 13 00:32:22.691726 systemd[1]: Starting update-engine.service - Update Engine...
Aug 13 00:32:22.695270 extend-filesystems[1522]: Found /dev/sda6
Aug 13 00:32:22.704539 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Aug 13 00:32:22.700808 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Aug 13 00:32:22.699438 oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Aug 13 00:32:22.704846 extend-filesystems[1522]: Found /dev/sda9
Aug 13 00:32:22.703722 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Aug 13 00:32:22.704471 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Aug 13 00:32:22.708331 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting users, quitting
Aug 13 00:32:22.708331 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Aug 13 00:32:22.708331 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing group entry cache
Aug 13 00:32:22.708331 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting groups, quitting
Aug 13 00:32:22.708331 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Aug 13 00:32:22.707964 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Aug 13 00:32:22.707120 oslogin_cache_refresh[1523]: Failure getting users, quitting
Aug 13 00:32:22.707135 oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Aug 13 00:32:22.707161 oslogin_cache_refresh[1523]: Refreshing group entry cache
Aug 13 00:32:22.707504 oslogin_cache_refresh[1523]: Failure getting groups, quitting
Aug 13 00:32:22.707509 oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Aug 13 00:32:22.710069 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Aug 13 00:32:22.712356 extend-filesystems[1522]: Checking size of /dev/sda9
Aug 13 00:32:22.719006 coreos-metadata[1516]: Aug 13 00:32:22.709 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Aug 13 00:32:22.719006 coreos-metadata[1516]: Aug 13 00:32:22.710 INFO Failed to fetch: error sending request for url (http://169.254.169.254/hetzner/v1/metadata)
Aug 13 00:32:22.710207 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Aug 13 00:32:22.710847 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Aug 13 00:32:22.712946 systemd[1]: motdgen.service: Deactivated successfully.
Aug 13 00:32:22.713095 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Aug 13 00:32:22.719874 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Aug 13 00:32:22.721011 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Aug 13 00:32:22.722849 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Aug 13 00:32:22.728847 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Aug 13 00:32:22.736031 jq[1538]: true
Aug 13 00:32:22.750302 extend-filesystems[1522]: Resized partition /dev/sda9
Aug 13 00:32:22.751171 systemd-networkd[1480]: lo: Link UP
Aug 13 00:32:22.751977 systemd-networkd[1480]: lo: Gained carrier
Aug 13 00:32:22.753062 extend-filesystems[1561]: resize2fs 1.47.2 (1-Jan-2025)
Aug 13 00:32:22.758372 update_engine[1534]: I20250813 00:32:22.758177 1534 main.cc:92] Flatcar Update Engine starting
Aug 13 00:32:22.761453 systemd-networkd[1480]: Enumeration completed
Aug 13 00:32:22.761527 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 13 00:32:22.761746 systemd-networkd[1480]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:32:22.761749 systemd-networkd[1480]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 13 00:32:22.762402 systemd[1]: Reached target network.target - Network.
Aug 13 00:32:22.763776 tar[1544]: linux-amd64/helm
Aug 13 00:32:22.764561 systemd-networkd[1480]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:32:22.765043 systemd-networkd[1480]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:32:22.766768 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Aug 13 00:32:22.766385 systemd-networkd[1480]: eth0: Link UP Aug 13 00:32:22.766528 systemd-networkd[1480]: eth0: Gained carrier Aug 13 00:32:22.766543 systemd-networkd[1480]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:32:22.767684 systemd[1]: Starting containerd.service - containerd container runtime... Aug 13 00:32:22.771549 jq[1557]: true Aug 13 00:32:22.771948 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 13 00:32:22.773009 dbus-daemon[1517]: [system] SELinux support is enabled Aug 13 00:32:22.774906 systemd-networkd[1480]: eth1: Link UP Aug 13 00:32:22.775369 systemd-networkd[1480]: eth1: Gained carrier Aug 13 00:32:22.775381 systemd-networkd[1480]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:32:22.776463 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 13 00:32:22.777010 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 13 00:32:22.780214 update_engine[1534]: I20250813 00:32:22.780095 1534 update_check_scheduler.cc:74] Next update check in 6m35s Aug 13 00:32:22.786131 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 13 00:32:22.786161 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Aug 13 00:32:22.786633 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 13 00:32:22.786666 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 13 00:32:22.787170 systemd[1]: Started update-engine.service - Update Engine. Aug 13 00:32:22.791749 kernel: mousedev: PS/2 mouse device common for all mice Aug 13 00:32:22.793872 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 13 00:32:22.796008 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 13 00:32:22.805757 systemd-networkd[1480]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Aug 13 00:32:22.807151 systemd-timesyncd[1452]: Network configuration changed, trying to establish connection. Aug 13 00:32:22.827370 systemd-networkd[1480]: eth0: DHCPv4 address 95.217.135.102/32, gateway 172.31.1.1 acquired from 172.31.1.1 Aug 13 00:32:22.830978 systemd-timesyncd[1452]: Network configuration changed, trying to establish connection. Aug 13 00:32:22.835816 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 13 00:32:22.836583 (ntainerd)[1576]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 13 00:32:22.915055 bash[1588]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:32:22.915719 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 13 00:32:22.924102 systemd[1]: Starting sshkeys.service... 
Aug 13 00:32:22.951938 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Aug 13 00:32:22.983716 extend-filesystems[1561]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Aug 13 00:32:22.983716 extend-filesystems[1561]: old_desc_blocks = 1, new_desc_blocks = 5 Aug 13 00:32:22.983716 extend-filesystems[1561]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Aug 13 00:32:23.015267 extend-filesystems[1522]: Resized filesystem in /dev/sda9 Aug 13 00:32:22.984420 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 13 00:32:22.984744 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 13 00:32:22.993787 locksmithd[1569]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 13 00:32:23.007495 systemd-logind[1532]: New seat seat0. Aug 13 00:32:23.012917 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Aug 13 00:32:23.024751 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Aug 13 00:32:23.021818 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Aug 13 00:32:23.023911 systemd[1]: Started systemd-logind.service - User Login Management. Aug 13 00:32:23.030152 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Aug 13 00:32:23.034288 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Aug 13 00:32:23.052742 sshd_keygen[1555]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 13 00:32:23.079524 containerd[1576]: time="2025-08-13T00:32:23Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 13 00:32:23.081981 containerd[1576]: time="2025-08-13T00:32:23.081954177Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Aug 13 00:32:23.088841 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 13 00:32:23.092036 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 13 00:32:23.095794 coreos-metadata[1603]: Aug 13 00:32:23.095 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Aug 13 00:32:23.101268 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Aug 13 00:32:23.101439 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Aug 13 00:32:23.101550 coreos-metadata[1603]: Aug 13 00:32:23.099 INFO Fetch successful Aug 13 00:32:23.103181 containerd[1576]: time="2025-08-13T00:32:23.103132148Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.827µs" Aug 13 00:32:23.103181 containerd[1576]: time="2025-08-13T00:32:23.103162334Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 13 00:32:23.103181 containerd[1576]: time="2025-08-13T00:32:23.103179677Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 13 00:32:23.103569 containerd[1576]: time="2025-08-13T00:32:23.103545844Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 13 00:32:23.103593 containerd[1576]: time="2025-08-13T00:32:23.103568085Z" level=info msg="loading plugin" 
id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 13 00:32:23.103593 containerd[1576]: time="2025-08-13T00:32:23.103588153Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 13 00:32:23.103670 containerd[1576]: time="2025-08-13T00:32:23.103635151Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 13 00:32:23.103670 containerd[1576]: time="2025-08-13T00:32:23.103665178Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:32:23.104211 unknown[1603]: wrote ssh authorized keys file for user: core Aug 13 00:32:23.105790 containerd[1576]: time="2025-08-13T00:32:23.105765596Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:32:23.105790 containerd[1576]: time="2025-08-13T00:32:23.105787718Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:32:23.105837 containerd[1576]: time="2025-08-13T00:32:23.105803767Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:32:23.105837 containerd[1576]: time="2025-08-13T00:32:23.105811322Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 13 00:32:23.105895 containerd[1576]: time="2025-08-13T00:32:23.105877746Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 13 00:32:23.106041 containerd[1576]: time="2025-08-13T00:32:23.106023399Z" level=info msg="loading 
plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 13 00:32:23.106067 containerd[1576]: time="2025-08-13T00:32:23.106054458Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 13 00:32:23.106067 containerd[1576]: time="2025-08-13T00:32:23.106062603Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 13 00:32:23.106905 containerd[1576]: time="2025-08-13T00:32:23.106885857Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 13 00:32:23.108119 containerd[1576]: time="2025-08-13T00:32:23.108094583Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 13 00:32:23.108504 containerd[1576]: time="2025-08-13T00:32:23.108471109Z" level=info msg="metadata content store policy set" policy=shared Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114291613Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114344612Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114358148Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114368106Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114378145Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 13 00:32:23.114739 containerd[1576]: 
time="2025-08-13T00:32:23.114385639Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114428229Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114439110Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114447235Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114455010Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114461602Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 13 00:32:23.114739 containerd[1576]: time="2025-08-13T00:32:23.114476570Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115044695Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115071014Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115084329Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115092695Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: 
time="2025-08-13T00:32:23.115119275Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115130156Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115138521Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115146186Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115154591Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115162336Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 13 00:32:23.115351 containerd[1576]: time="2025-08-13T00:32:23.115170551Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 13 00:32:23.115720 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Aug 13 00:32:23.115842 containerd[1576]: time="2025-08-13T00:32:23.115826531Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 13 00:32:23.115892 containerd[1576]: time="2025-08-13T00:32:23.115882907Z" level=info msg="Start snapshots syncer" Aug 13 00:32:23.115945 containerd[1576]: time="2025-08-13T00:32:23.115933522Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 13 00:32:23.116172 containerd[1576]: time="2025-08-13T00:32:23.116142744Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 13 00:32:23.116304 containerd[1576]: time="2025-08-13T00:32:23.116290360Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 13 00:32:23.117279 containerd[1576]: time="2025-08-13T00:32:23.117263014Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 13 00:32:23.117395 containerd[1576]: time="2025-08-13T00:32:23.117380234Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 13 00:32:23.117449 containerd[1576]: time="2025-08-13T00:32:23.117438854Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 13 00:32:23.117488 containerd[1576]: time="2025-08-13T00:32:23.117479531Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 13 00:32:23.117527 containerd[1576]: time="2025-08-13T00:32:23.117517993Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 13 00:32:23.117600 containerd[1576]: time="2025-08-13T00:32:23.117588976Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 13 00:32:23.117655 containerd[1576]: time="2025-08-13T00:32:23.117632738Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 13 00:32:23.117750 containerd[1576]: time="2025-08-13T00:32:23.117692510Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 13 00:32:23.117818 containerd[1576]: time="2025-08-13T00:32:23.117807155Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 13 00:32:23.117858 containerd[1576]: time="2025-08-13T00:32:23.117849655Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 13 00:32:23.117895 containerd[1576]: time="2025-08-13T00:32:23.117887335Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 13 00:32:23.117956 containerd[1576]: time="2025-08-13T00:32:23.117941637Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:32:23.118013 containerd[1576]: time="2025-08-13T00:32:23.118001229Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:32:23.118051 containerd[1576]: time="2025-08-13T00:32:23.118042877Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:32:23.118088 containerd[1576]: time="2025-08-13T00:32:23.118079085Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118155558Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118190914Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118200853Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118213738Z" level=info msg="runtime interface created" Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118217785Z" level=info msg="created NRI interface" Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118223726Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118232373Z" level=info msg="Connect containerd service" Aug 13 00:32:23.118280 containerd[1576]: time="2025-08-13T00:32:23.118253181Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 13 00:32:23.119748 kernel: 
ACPI: button: Power Button [PWRF] Aug 13 00:32:23.121158 containerd[1576]: time="2025-08-13T00:32:23.121119896Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 00:32:23.125627 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Aug 13 00:32:23.133240 systemd[1]: issuegen.service: Deactivated successfully. Aug 13 00:32:23.139015 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 13 00:32:23.156027 update-ssh-keys[1622]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:32:23.163086 kernel: Console: switching to colour dummy device 80x25 Aug 13 00:32:23.170533 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Aug 13 00:32:23.174994 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Aug 13 00:32:23.175036 kernel: [drm] features: -context_init Aug 13 00:32:23.175767 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 13 00:32:23.176148 systemd[1]: Finished sshkeys.service. Aug 13 00:32:23.207920 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 13 00:32:23.212887 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 13 00:32:23.216882 kernel: EDAC MC: Ver: 3.0.0 Aug 13 00:32:23.215218 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Aug 13 00:32:23.215371 systemd[1]: Reached target getty.target - Login Prompts. 
Aug 13 00:32:23.242722 kernel: [drm] number of scanouts: 1 Aug 13 00:32:23.242785 kernel: [drm] number of cap sets: 0 Aug 13 00:32:23.247718 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Aug 13 00:32:23.262531 containerd[1576]: time="2025-08-13T00:32:23.262500540Z" level=info msg="Start subscribing containerd event" Aug 13 00:32:23.262965 containerd[1576]: time="2025-08-13T00:32:23.262939112Z" level=info msg="Start recovering state" Aug 13 00:32:23.263115 containerd[1576]: time="2025-08-13T00:32:23.263088051Z" level=info msg="Start event monitor" Aug 13 00:32:23.263172 containerd[1576]: time="2025-08-13T00:32:23.263161179Z" level=info msg="Start cni network conf syncer for default" Aug 13 00:32:23.263506 containerd[1576]: time="2025-08-13T00:32:23.263495355Z" level=info msg="Start streaming server" Aug 13 00:32:23.263555 containerd[1576]: time="2025-08-13T00:32:23.263546421Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 13 00:32:23.263609 containerd[1576]: time="2025-08-13T00:32:23.263600933Z" level=info msg="runtime interface starting up..." Aug 13 00:32:23.263661 containerd[1576]: time="2025-08-13T00:32:23.263651738Z" level=info msg="starting plugins..." Aug 13 00:32:23.263736 containerd[1576]: time="2025-08-13T00:32:23.263726308Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 13 00:32:23.265535 containerd[1576]: time="2025-08-13T00:32:23.265521034Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 13 00:32:23.265998 containerd[1576]: time="2025-08-13T00:32:23.265659293Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 13 00:32:23.268203 systemd[1]: Started containerd.service - containerd container runtime. Aug 13 00:32:23.268626 containerd[1576]: time="2025-08-13T00:32:23.268611198Z" level=info msg="containerd successfully booted in 0.189324s" Aug 13 00:32:23.333813 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Aug 13 00:32:23.337277 systemd-logind[1532]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Aug 13 00:32:23.354947 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 13 00:32:23.355505 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:32:23.360907 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:32:23.363459 systemd-logind[1532]: Watching system buttons on /dev/input/event3 (Power Button) Aug 13 00:32:23.453610 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:32:23.545131 tar[1544]: linux-amd64/LICENSE Aug 13 00:32:23.545131 tar[1544]: linux-amd64/README.md Aug 13 00:32:23.560418 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 13 00:32:23.711094 coreos-metadata[1516]: Aug 13 00:32:23.710 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #2 Aug 13 00:32:23.711867 coreos-metadata[1516]: Aug 13 00:32:23.711 INFO Fetch successful Aug 13 00:32:23.712045 coreos-metadata[1516]: Aug 13 00:32:23.712 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Aug 13 00:32:23.712402 coreos-metadata[1516]: Aug 13 00:32:23.712 INFO Fetch successful Aug 13 00:32:23.752981 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Aug 13 00:32:23.753662 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 13 00:32:24.270953 systemd-networkd[1480]: eth1: Gained IPv6LL Aug 13 00:32:24.271821 systemd-timesyncd[1452]: Network configuration changed, trying to establish connection. Aug 13 00:32:24.274666 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 13 00:32:24.275553 systemd[1]: Reached target network-online.target - Network is Online. 
Aug 13 00:32:24.278524 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:32:24.282003 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 13 00:32:24.317477 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 13 00:32:24.782913 systemd-networkd[1480]: eth0: Gained IPv6LL Aug 13 00:32:24.783385 systemd-timesyncd[1452]: Network configuration changed, trying to establish connection. Aug 13 00:32:25.248061 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:32:25.248527 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 13 00:32:25.249131 systemd[1]: Startup finished in 3.091s (kernel) + 5.204s (initrd) + 4.454s (userspace) = 12.750s. Aug 13 00:32:25.259298 (kubelet)[1701]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:32:25.776221 kubelet[1701]: E0813 00:32:25.776164 1701 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:32:25.778465 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:32:25.778598 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:32:25.778911 systemd[1]: kubelet.service: Consumed 944ms CPU time, 263.8M memory peak. Aug 13 00:32:31.680542 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 13 00:32:31.682058 systemd[1]: Started sshd@0-95.217.135.102:22-139.178.89.65:32808.service - OpenSSH per-connection server daemon (139.178.89.65:32808). 
Aug 13 00:32:32.668825 sshd[1713]: Accepted publickey for core from 139.178.89.65 port 32808 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:32:32.670517 sshd-session[1713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:32.680598 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 13 00:32:32.681629 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 13 00:32:32.685407 systemd-logind[1532]: New session 1 of user core. Aug 13 00:32:32.699923 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 13 00:32:32.702546 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 13 00:32:32.711867 (systemd)[1717]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 13 00:32:32.714056 systemd-logind[1532]: New session c1 of user core. Aug 13 00:32:32.847185 systemd[1717]: Queued start job for default target default.target. Aug 13 00:32:32.857408 systemd[1717]: Created slice app.slice - User Application Slice. Aug 13 00:32:32.857430 systemd[1717]: Reached target paths.target - Paths. Aug 13 00:32:32.857462 systemd[1717]: Reached target timers.target - Timers. Aug 13 00:32:32.858404 systemd[1717]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 13 00:32:32.867547 systemd[1717]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 13 00:32:32.867584 systemd[1717]: Reached target sockets.target - Sockets. Aug 13 00:32:32.867619 systemd[1717]: Reached target basic.target - Basic System. Aug 13 00:32:32.867646 systemd[1717]: Reached target default.target - Main User Target. Aug 13 00:32:32.867677 systemd[1717]: Startup finished in 147ms. Aug 13 00:32:32.867770 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 13 00:32:32.875921 systemd[1]: Started session-1.scope - Session 1 of User core. 
Aug 13 00:32:33.559870 systemd[1]: Started sshd@1-95.217.135.102:22-139.178.89.65:32810.service - OpenSSH per-connection server daemon (139.178.89.65:32810). Aug 13 00:32:34.554606 sshd[1728]: Accepted publickey for core from 139.178.89.65 port 32810 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:32:34.556023 sshd-session[1728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:34.561127 systemd-logind[1532]: New session 2 of user core. Aug 13 00:32:34.565844 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 13 00:32:35.224446 sshd[1730]: Connection closed by 139.178.89.65 port 32810 Aug 13 00:32:35.225058 sshd-session[1728]: pam_unix(sshd:session): session closed for user core Aug 13 00:32:35.228077 systemd[1]: sshd@1-95.217.135.102:22-139.178.89.65:32810.service: Deactivated successfully. Aug 13 00:32:35.229657 systemd[1]: session-2.scope: Deactivated successfully. Aug 13 00:32:35.230412 systemd-logind[1532]: Session 2 logged out. Waiting for processes to exit. Aug 13 00:32:35.232158 systemd-logind[1532]: Removed session 2. Aug 13 00:32:35.395851 systemd[1]: Started sshd@2-95.217.135.102:22-139.178.89.65:32818.service - OpenSSH per-connection server daemon (139.178.89.65:32818). Aug 13 00:32:36.029201 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 13 00:32:36.030925 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:32:36.150536 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 13 00:32:36.162953 (kubelet)[1746]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:32:36.202313 kubelet[1746]: E0813 00:32:36.202260 1746 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:32:36.205996 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:32:36.206111 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:32:36.206363 systemd[1]: kubelet.service: Consumed 127ms CPU time, 110.9M memory peak. Aug 13 00:32:36.372653 sshd[1736]: Accepted publickey for core from 139.178.89.65 port 32818 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:32:36.374196 sshd-session[1736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:32:36.380189 systemd-logind[1532]: New session 3 of user core. Aug 13 00:32:36.385857 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 13 00:32:37.041788 sshd[1753]: Connection closed by 139.178.89.65 port 32818 Aug 13 00:32:37.042346 sshd-session[1736]: pam_unix(sshd:session): session closed for user core Aug 13 00:32:37.046267 systemd-logind[1532]: Session 3 logged out. Waiting for processes to exit. Aug 13 00:32:37.046422 systemd[1]: sshd@2-95.217.135.102:22-139.178.89.65:32818.service: Deactivated successfully. Aug 13 00:32:37.048192 systemd[1]: session-3.scope: Deactivated successfully. Aug 13 00:32:37.049265 systemd-logind[1532]: Removed session 3. Aug 13 00:32:37.207187 systemd[1]: Started sshd@3-95.217.135.102:22-139.178.89.65:32822.service - OpenSSH per-connection server daemon (139.178.89.65:32822). 
Aug 13 00:32:38.193344 sshd[1759]: Accepted publickey for core from 139.178.89.65 port 32822 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:32:38.194535 sshd-session[1759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:32:38.199599 systemd-logind[1532]: New session 4 of user core.
Aug 13 00:32:38.204844 systemd[1]: Started session-4.scope - Session 4 of User core.
Aug 13 00:32:38.868329 sshd[1761]: Connection closed by 139.178.89.65 port 32822
Aug 13 00:32:38.869247 sshd-session[1759]: pam_unix(sshd:session): session closed for user core
Aug 13 00:32:38.874516 systemd-logind[1532]: Session 4 logged out. Waiting for processes to exit.
Aug 13 00:32:38.874634 systemd[1]: sshd@3-95.217.135.102:22-139.178.89.65:32822.service: Deactivated successfully.
Aug 13 00:32:38.877182 systemd[1]: session-4.scope: Deactivated successfully.
Aug 13 00:32:38.879400 systemd-logind[1532]: Removed session 4.
Aug 13 00:32:39.039431 systemd[1]: Started sshd@4-95.217.135.102:22-139.178.89.65:48788.service - OpenSSH per-connection server daemon (139.178.89.65:48788).
Aug 13 00:32:40.031044 sshd[1767]: Accepted publickey for core from 139.178.89.65 port 48788 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:32:40.032359 sshd-session[1767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:32:40.037480 systemd-logind[1532]: New session 5 of user core.
Aug 13 00:32:40.051845 systemd[1]: Started session-5.scope - Session 5 of User core.
Aug 13 00:32:40.555256 sudo[1770]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Aug 13 00:32:40.555499 sudo[1770]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Aug 13 00:32:40.569122 sudo[1770]: pam_unix(sudo:session): session closed for user root
Aug 13 00:32:40.726608 sshd[1769]: Connection closed by 139.178.89.65 port 48788
Aug 13 00:32:40.727491 sshd-session[1767]: pam_unix(sshd:session): session closed for user core
Aug 13 00:32:40.731173 systemd[1]: sshd@4-95.217.135.102:22-139.178.89.65:48788.service: Deactivated successfully.
Aug 13 00:32:40.733201 systemd[1]: session-5.scope: Deactivated successfully.
Aug 13 00:32:40.734647 systemd-logind[1532]: Session 5 logged out. Waiting for processes to exit.
Aug 13 00:32:40.735970 systemd-logind[1532]: Removed session 5.
Aug 13 00:32:40.896558 systemd[1]: Started sshd@5-95.217.135.102:22-139.178.89.65:48800.service - OpenSSH per-connection server daemon (139.178.89.65:48800).
Aug 13 00:32:41.880188 sshd[1776]: Accepted publickey for core from 139.178.89.65 port 48800 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:32:41.881879 sshd-session[1776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:32:41.887598 systemd-logind[1532]: New session 6 of user core.
Aug 13 00:32:41.899869 systemd[1]: Started session-6.scope - Session 6 of User core.
Aug 13 00:32:42.395924 sudo[1780]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Aug 13 00:32:42.396216 sudo[1780]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Aug 13 00:32:42.401767 sudo[1780]: pam_unix(sudo:session): session closed for user root
Aug 13 00:32:42.408559 sudo[1779]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Aug 13 00:32:42.408879 sudo[1779]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Aug 13 00:32:42.420312 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Aug 13 00:32:42.459647 augenrules[1802]: No rules
Aug 13 00:32:42.460625 systemd[1]: audit-rules.service: Deactivated successfully.
Aug 13 00:32:42.460859 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Aug 13 00:32:42.462198 sudo[1779]: pam_unix(sudo:session): session closed for user root
Aug 13 00:32:42.619004 sshd[1778]: Connection closed by 139.178.89.65 port 48800
Aug 13 00:32:42.619498 sshd-session[1776]: pam_unix(sshd:session): session closed for user core
Aug 13 00:32:42.622138 systemd[1]: sshd@5-95.217.135.102:22-139.178.89.65:48800.service: Deactivated successfully.
Aug 13 00:32:42.623982 systemd[1]: session-6.scope: Deactivated successfully.
Aug 13 00:32:42.624807 systemd-logind[1532]: Session 6 logged out. Waiting for processes to exit.
Aug 13 00:32:42.625941 systemd-logind[1532]: Removed session 6.
Aug 13 00:32:42.790176 systemd[1]: Started sshd@6-95.217.135.102:22-139.178.89.65:48804.service - OpenSSH per-connection server daemon (139.178.89.65:48804).
Aug 13 00:32:42.800178 systemd[1]: Started sshd@7-95.217.135.102:22-103.232.81.5:35890.service - OpenSSH per-connection server daemon (103.232.81.5:35890).
Aug 13 00:32:43.233778 sshd[1813]: Connection closed by 103.232.81.5 port 35890 [preauth]
Aug 13 00:32:43.236275 systemd[1]: sshd@7-95.217.135.102:22-103.232.81.5:35890.service: Deactivated successfully.
Aug 13 00:32:43.763851 sshd[1811]: Accepted publickey for core from 139.178.89.65 port 48804 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:32:43.765774 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:32:43.773086 systemd-logind[1532]: New session 7 of user core.
Aug 13 00:32:43.783037 systemd[1]: Started session-7.scope - Session 7 of User core.
Aug 13 00:32:44.279671 sudo[1819]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Aug 13 00:32:44.279983 sudo[1819]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Aug 13 00:32:44.599277 systemd[1]: Starting docker.service - Docker Application Container Engine...
Aug 13 00:32:44.614916 (dockerd)[1838]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Aug 13 00:32:44.840738 dockerd[1838]: time="2025-08-13T00:32:44.840393947Z" level=info msg="Starting up"
Aug 13 00:32:44.842090 dockerd[1838]: time="2025-08-13T00:32:44.842058438Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Aug 13 00:32:44.888177 systemd[1]: var-lib-docker-metacopy\x2dcheck2479026560-merged.mount: Deactivated successfully.
Aug 13 00:32:44.922255 dockerd[1838]: time="2025-08-13T00:32:44.922000728Z" level=info msg="Loading containers: start."
Aug 13 00:32:44.932747 kernel: Initializing XFRM netlink socket
Aug 13 00:32:45.113255 systemd-timesyncd[1452]: Network configuration changed, trying to establish connection.
Aug 13 00:32:45.709737 systemd-resolved[1420]: Clock change detected. Flushing caches.
Aug 13 00:32:45.711465 systemd-timesyncd[1452]: Contacted time server 148.251.235.164:123 (2.flatcar.pool.ntp.org).
Aug 13 00:32:45.711518 systemd-timesyncd[1452]: Initial clock synchronization to Wed 2025-08-13 00:32:45.709607 UTC.
Aug 13 00:32:45.722422 systemd-networkd[1480]: docker0: Link UP
Aug 13 00:32:45.728723 dockerd[1838]: time="2025-08-13T00:32:45.728672021Z" level=info msg="Loading containers: done."
Aug 13 00:32:45.744415 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck775965989-merged.mount: Deactivated successfully.
Aug 13 00:32:45.747203 dockerd[1838]: time="2025-08-13T00:32:45.747155179Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Aug 13 00:32:45.747288 dockerd[1838]: time="2025-08-13T00:32:45.747243454Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
Aug 13 00:32:45.747399 dockerd[1838]: time="2025-08-13T00:32:45.747372286Z" level=info msg="Initializing buildkit"
Aug 13 00:32:45.771701 dockerd[1838]: time="2025-08-13T00:32:45.771591881Z" level=info msg="Completed buildkit initialization"
Aug 13 00:32:45.779341 dockerd[1838]: time="2025-08-13T00:32:45.779299974Z" level=info msg="Daemon has completed initialization"
Aug 13 00:32:45.779492 dockerd[1838]: time="2025-08-13T00:32:45.779450245Z" level=info msg="API listen on /run/docker.sock"
Aug 13 00:32:45.779587 systemd[1]: Started docker.service - Docker Application Container Engine.
Aug 13 00:32:47.004809 containerd[1576]: time="2025-08-13T00:32:47.004750207Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\""
Aug 13 00:32:47.025379 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Aug 13 00:32:47.029213 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:32:47.179054 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:32:47.189559 (kubelet)[2050]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 13 00:32:47.223465 kubelet[2050]: E0813 00:32:47.223413 2050 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 13 00:32:47.225579 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 13 00:32:47.225696 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 13 00:32:47.225958 systemd[1]: kubelet.service: Consumed 139ms CPU time, 108.4M memory peak.
Aug 13 00:32:47.603886 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1034168270.mount: Deactivated successfully.
Aug 13 00:32:48.500540 containerd[1576]: time="2025-08-13T00:32:48.500485303Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:48.501385 containerd[1576]: time="2025-08-13T00:32:48.501357198Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.11: active requests=0, bytes read=28077853"
Aug 13 00:32:48.502449 containerd[1576]: time="2025-08-13T00:32:48.502402608Z" level=info msg="ImageCreate event name:\"sha256:ea7fa3cfabed1b85e7de8e0a02356b6dcb7708442d6e4600d68abaebe1e9b1fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:48.504364 containerd[1576]: time="2025-08-13T00:32:48.504293344Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:48.505129 containerd[1576]: time="2025-08-13T00:32:48.505001592Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.11\" with image id \"sha256:ea7fa3cfabed1b85e7de8e0a02356b6dcb7708442d6e4600d68abaebe1e9b1fc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\", size \"28074559\" in 1.500214264s"
Aug 13 00:32:48.505129 containerd[1576]: time="2025-08-13T00:32:48.505028362Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\" returns image reference \"sha256:ea7fa3cfabed1b85e7de8e0a02356b6dcb7708442d6e4600d68abaebe1e9b1fc\""
Aug 13 00:32:48.505738 containerd[1576]: time="2025-08-13T00:32:48.505665967Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\""
Aug 13 00:32:49.588490 containerd[1576]: time="2025-08-13T00:32:49.588437204Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:49.589497 containerd[1576]: time="2025-08-13T00:32:49.589470541Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.11: active requests=0, bytes read=24713267"
Aug 13 00:32:49.590436 containerd[1576]: time="2025-08-13T00:32:49.590399734Z" level=info msg="ImageCreate event name:\"sha256:c057eceea4b436b01f9ce394734cfb06f13b2a3688c3983270e99743370b6051\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:49.593218 containerd[1576]: time="2025-08-13T00:32:49.593162204Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:49.594115 containerd[1576]: time="2025-08-13T00:32:49.593986820Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.11\" with image id \"sha256:c057eceea4b436b01f9ce394734cfb06f13b2a3688c3983270e99743370b6051\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\", size \"26315079\" in 1.088300013s"
Aug 13 00:32:49.594115 containerd[1576]: time="2025-08-13T00:32:49.594012528Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\" returns image reference \"sha256:c057eceea4b436b01f9ce394734cfb06f13b2a3688c3983270e99743370b6051\""
Aug 13 00:32:49.594664 containerd[1576]: time="2025-08-13T00:32:49.594635025Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\""
Aug 13 00:32:50.586368 containerd[1576]: time="2025-08-13T00:32:50.586299517Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:50.587106 containerd[1576]: time="2025-08-13T00:32:50.587079520Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.11: active requests=0, bytes read=18783722"
Aug 13 00:32:50.587954 containerd[1576]: time="2025-08-13T00:32:50.587919284Z" level=info msg="ImageCreate event name:\"sha256:64e6a0b453108c87da0bb61473b35fd54078119a09edc56a4c8cb31602437c58\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:50.589859 containerd[1576]: time="2025-08-13T00:32:50.589819348Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:50.590513 containerd[1576]: time="2025-08-13T00:32:50.590407149Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.11\" with image id \"sha256:64e6a0b453108c87da0bb61473b35fd54078119a09edc56a4c8cb31602437c58\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\", size \"20385552\" in 995.730396ms"
Aug 13 00:32:50.590513 containerd[1576]: time="2025-08-13T00:32:50.590431765Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\" returns image reference \"sha256:64e6a0b453108c87da0bb61473b35fd54078119a09edc56a4c8cb31602437c58\""
Aug 13 00:32:50.591197 containerd[1576]: time="2025-08-13T00:32:50.591171573Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\""
Aug 13 00:32:51.561106 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2578760759.mount: Deactivated successfully.
Aug 13 00:32:51.842107 containerd[1576]: time="2025-08-13T00:32:51.842002989Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:51.843008 containerd[1576]: time="2025-08-13T00:32:51.842980553Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.11: active requests=0, bytes read=30383640"
Aug 13 00:32:51.843936 containerd[1576]: time="2025-08-13T00:32:51.843900317Z" level=info msg="ImageCreate event name:\"sha256:0cec28fd5c3c446ec52e2886ddea38bf7f7e17755aa5d0095d50d3df5914a8fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:51.845229 containerd[1576]: time="2025-08-13T00:32:51.845213118Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:51.845731 containerd[1576]: time="2025-08-13T00:32:51.845602058Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.11\" with image id \"sha256:0cec28fd5c3c446ec52e2886ddea38bf7f7e17755aa5d0095d50d3df5914a8fd\", repo tag \"registry.k8s.io/kube-proxy:v1.31.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\", size \"30382631\" in 1.254405649s"
Aug 13 00:32:51.845731 containerd[1576]: time="2025-08-13T00:32:51.845628497Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\" returns image reference \"sha256:0cec28fd5c3c446ec52e2886ddea38bf7f7e17755aa5d0095d50d3df5914a8fd\""
Aug 13 00:32:51.846011 containerd[1576]: time="2025-08-13T00:32:51.845975067Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Aug 13 00:32:52.317942 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1206140746.mount: Deactivated successfully.
Aug 13 00:32:53.069695 containerd[1576]: time="2025-08-13T00:32:53.069638020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:53.070588 containerd[1576]: time="2025-08-13T00:32:53.070550331Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335"
Aug 13 00:32:53.071621 containerd[1576]: time="2025-08-13T00:32:53.071575723Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:53.073969 containerd[1576]: time="2025-08-13T00:32:53.073928956Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:53.074662 containerd[1576]: time="2025-08-13T00:32:53.074621495Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.228619697s"
Aug 13 00:32:53.074711 containerd[1576]: time="2025-08-13T00:32:53.074662511Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Aug 13 00:32:53.075136 containerd[1576]: time="2025-08-13T00:32:53.075112395Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Aug 13 00:32:53.520013 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3236669910.mount: Deactivated successfully.
Aug 13 00:32:53.525967 containerd[1576]: time="2025-08-13T00:32:53.525902572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 13 00:32:53.526663 containerd[1576]: time="2025-08-13T00:32:53.526626811Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Aug 13 00:32:53.527649 containerd[1576]: time="2025-08-13T00:32:53.527604944Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 13 00:32:53.529474 containerd[1576]: time="2025-08-13T00:32:53.529421811Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Aug 13 00:32:53.530295 containerd[1576]: time="2025-08-13T00:32:53.529951054Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 454.809544ms"
Aug 13 00:32:53.530295 containerd[1576]: time="2025-08-13T00:32:53.529985619Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Aug 13 00:32:53.530493 containerd[1576]: time="2025-08-13T00:32:53.530454487Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Aug 13 00:32:54.034252 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4155745165.mount: Deactivated successfully.
Aug 13 00:32:55.266255 containerd[1576]: time="2025-08-13T00:32:55.266201681Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:55.267231 containerd[1576]: time="2025-08-13T00:32:55.267198109Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780083"
Aug 13 00:32:55.268128 containerd[1576]: time="2025-08-13T00:32:55.268055647Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:55.270944 containerd[1576]: time="2025-08-13T00:32:55.270887387Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:32:55.271871 containerd[1576]: time="2025-08-13T00:32:55.271701052Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 1.741217631s"
Aug 13 00:32:55.271871 containerd[1576]: time="2025-08-13T00:32:55.271731570Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Aug 13 00:32:57.477113 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Aug 13 00:32:57.482641 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:32:57.625456 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
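Each containerd pull above reports the bytes read and the elapsed pull time (for etcd: "bytes read=56780083" completed "in 1.741217631s"), so the effective download throughput can be checked with simple arithmetic. A quick sketch using the figures from the log:

```python
# Throughput of the etcd image pull, using the values logged above:
# "bytes read=56780083" pulled "in 1.741217631s".
bytes_read = 56_780_083
elapsed_s = 1.741217631

# Convert bytes/second to MiB/second.
mib_per_s = bytes_read / elapsed_s / (1024 * 1024)
print(f"{mib_per_s:.1f} MiB/s")
```

That works out to roughly 31 MiB/s, which is consistent with the sub-second to low-seconds pull times seen for the smaller control-plane images.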
Aug 13 00:32:57.629083 (kubelet)[2265]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 13 00:32:57.671899 kubelet[2265]: E0813 00:32:57.671829 2265 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 13 00:32:57.672959 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 13 00:32:57.673071 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 13 00:32:57.673298 systemd[1]: kubelet.service: Consumed 137ms CPU time, 111M memory peak.
Aug 13 00:32:59.094545 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:32:59.094812 systemd[1]: kubelet.service: Consumed 137ms CPU time, 111M memory peak.
Aug 13 00:32:59.099200 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:32:59.133532 systemd[1]: Reload requested from client PID 2279 ('systemctl') (unit session-7.scope)...
Aug 13 00:32:59.133553 systemd[1]: Reloading...
Aug 13 00:32:59.224383 zram_generator::config[2319]: No configuration found.
Aug 13 00:32:59.309254 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:32:59.403518 systemd[1]: Reloading finished in 269 ms.
Aug 13 00:32:59.458869 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Aug 13 00:32:59.458953 systemd[1]: kubelet.service: Failed with result 'signal'.
Aug 13 00:32:59.459210 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:32:59.459258 systemd[1]: kubelet.service: Consumed 80ms CPU time, 98.6M memory peak.
Aug 13 00:32:59.461749 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:32:59.566620 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:32:59.572672 (kubelet)[2377]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Aug 13 00:32:59.607404 kubelet[2377]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:32:59.607404 kubelet[2377]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Aug 13 00:32:59.607404 kubelet[2377]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:32:59.607404 kubelet[2377]: I0813 00:32:59.606701 2377 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 13 00:32:59.867925 kubelet[2377]: I0813 00:32:59.867815 2377 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Aug 13 00:32:59.867925 kubelet[2377]: I0813 00:32:59.867860 2377 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 13 00:32:59.868412 kubelet[2377]: I0813 00:32:59.868316 2377 server.go:934] "Client rotation is on, will bootstrap in background"
Aug 13 00:32:59.899709 kubelet[2377]: I0813 00:32:59.899654 2377 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 13 00:32:59.903260 kubelet[2377]: E0813 00:32:59.903213 2377 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://95.217.135.102:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 95.217.135.102:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:32:59.931546 kubelet[2377]: I0813 00:32:59.931489 2377 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Aug 13 00:32:59.938476 kubelet[2377]: I0813 00:32:59.938434 2377 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 13 00:32:59.939797 kubelet[2377]: I0813 00:32:59.939743 2377 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Aug 13 00:32:59.940010 kubelet[2377]: I0813 00:32:59.939959 2377 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 13 00:32:59.940176 kubelet[2377]: I0813 00:32:59.939996 2377 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-4-15a6623c0c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 13 00:32:59.940176 kubelet[2377]: I0813 00:32:59.940170 2377 topology_manager.go:138] "Creating topology manager with none policy"
Aug 13 00:32:59.940176 kubelet[2377]: I0813 00:32:59.940177 2377 container_manager_linux.go:300] "Creating device plugin manager"
Aug 13 00:32:59.940320 kubelet[2377]: I0813 00:32:59.940283 2377 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:32:59.943129 kubelet[2377]: I0813 00:32:59.942881 2377 kubelet.go:408] "Attempting to sync node with API server"
Aug 13 00:32:59.943129 kubelet[2377]: I0813 00:32:59.942916 2377 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 13 00:32:59.943129 kubelet[2377]: I0813 00:32:59.942953 2377 kubelet.go:314] "Adding apiserver pod source"
Aug 13 00:32:59.943129 kubelet[2377]: I0813 00:32:59.942973 2377 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 13 00:32:59.946442 kubelet[2377]: W0813 00:32:59.946310 2377 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://95.217.135.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-1-0-4-15a6623c0c&limit=500&resourceVersion=0": dial tcp 95.217.135.102:6443: connect: connection refused
Aug 13 00:32:59.946442 kubelet[2377]: E0813 00:32:59.946434 2377 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://95.217.135.102:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-1-0-4-15a6623c0c&limit=500&resourceVersion=0\": dial tcp 95.217.135.102:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:32:59.947056 kubelet[2377]: W0813 00:32:59.946936 2377 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://95.217.135.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 95.217.135.102:6443: connect: connection refused
Aug 13 00:32:59.947056 kubelet[2377]: E0813 00:32:59.946975 2377 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://95.217.135.102:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 95.217.135.102:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:32:59.947303 kubelet[2377]: I0813 00:32:59.947243 2377 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Aug 13 00:32:59.949796 kubelet[2377]: I0813 00:32:59.949756 2377 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Aug 13 00:32:59.949851 kubelet[2377]: W0813 00:32:59.949807 2377 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Aug 13 00:32:59.952153 kubelet[2377]: I0813 00:32:59.951209 2377 server.go:1274] "Started kubelet"
Aug 13 00:32:59.952606 kubelet[2377]: I0813 00:32:59.952573 2377 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 13 00:32:59.952830 kubelet[2377]: I0813 00:32:59.952802 2377 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 13 00:32:59.952993 kubelet[2377]: I0813 00:32:59.952978 2377 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 13 00:32:59.954249 kubelet[2377]: I0813 00:32:59.953096 2377 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Aug 13 00:32:59.956486 kubelet[2377]: I0813 00:32:59.956004 2377 server.go:449] "Adding debug handlers to kubelet server"
Aug 13 00:32:59.960947 kubelet[2377]: I0813 00:32:59.960933 2377 volume_manager.go:289] "Starting Kubelet Volume Manager"
Aug 13 00:32:59.961252 kubelet[2377]: E0813 00:32:59.961235 2377 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-1-0-4-15a6623c0c\" not found"
Aug 13 00:32:59.963678 kubelet[2377]: I0813 00:32:59.963665 2377 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Aug 13 00:32:59.963792 kubelet[2377]: I0813 00:32:59.963781 2377 reconciler.go:26] "Reconciler: start to sync state"
Aug 13 00:32:59.965548 kubelet[2377]: I0813 00:32:59.965524 2377 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Aug 13 00:32:59.972565 kubelet[2377]: E0813 00:32:59.970419 2377 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://95.217.135.102:6443/api/v1/namespaces/default/events\": dial tcp 95.217.135.102:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372-1-0-4-15a6623c0c.185b2c4ae3c09bcd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-1-0-4-15a6623c0c,UID:ci-4372-1-0-4-15a6623c0c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-4-15a6623c0c,},FirstTimestamp:2025-08-13 00:32:59.951193037 +0000 UTC m=+0.375278767,LastTimestamp:2025-08-13 00:32:59.951193037 +0000 UTC m=+0.375278767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-4-15a6623c0c,}"
Aug 13 00:32:59.972565 kubelet[2377]: I0813 00:32:59.971821 2377 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 13 00:32:59.975906 kubelet[2377]: E0813 00:32:59.975170 2377 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.135.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-4-15a6623c0c?timeout=10s\": dial tcp 95.217.135.102:6443: connect: connection refused" interval="200ms"
Aug 13 00:32:59.975906 kubelet[2377]: W0813 00:32:59.975546 2377 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://95.217.135.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 95.217.135.102:6443: connect: connection refused
Aug 13 00:32:59.975906 kubelet[2377]: E0813 00:32:59.975568 2377 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://95.217.135.102:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 95.217.135.102:6443: connect: connection refused" logger="UnhandledError"
Aug 13 00:32:59.977123 kubelet[2377]: E0813 00:32:59.977098 2377 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Aug 13 00:32:59.977383 kubelet[2377]: I0813 00:32:59.977194 2377 factory.go:221] Registration of the containerd container factory successfully
Aug 13 00:32:59.977383 kubelet[2377]: I0813 00:32:59.977206 2377 factory.go:221] Registration of the systemd container factory successfully
Aug 13 00:32:59.983601 kubelet[2377]: I0813 00:32:59.983558 2377 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Aug 13 00:32:59.984847 kubelet[2377]: I0813 00:32:59.984830 2377 kubelet_network_linux.go:50] "Initialized iptables rules."
protocol="IPv6" Aug 13 00:32:59.984925 kubelet[2377]: I0813 00:32:59.984915 2377 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 13 00:32:59.985000 kubelet[2377]: I0813 00:32:59.984990 2377 kubelet.go:2321] "Starting kubelet main sync loop" Aug 13 00:32:59.985099 kubelet[2377]: E0813 00:32:59.985080 2377 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 13 00:32:59.990573 kubelet[2377]: W0813 00:32:59.990523 2377 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://95.217.135.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 95.217.135.102:6443: connect: connection refused Aug 13 00:32:59.990911 kubelet[2377]: E0813 00:32:59.990873 2377 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://95.217.135.102:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 95.217.135.102:6443: connect: connection refused" logger="UnhandledError" Aug 13 00:33:00.001775 kubelet[2377]: I0813 00:33:00.001760 2377 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 13 00:33:00.001775 kubelet[2377]: I0813 00:33:00.001772 2377 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 13 00:33:00.001866 kubelet[2377]: I0813 00:33:00.001784 2377 state_mem.go:36] "Initialized new in-memory state store" Aug 13 00:33:00.003921 kubelet[2377]: I0813 00:33:00.003893 2377 policy_none.go:49] "None policy: Start" Aug 13 00:33:00.004324 kubelet[2377]: I0813 00:33:00.004306 2377 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 13 00:33:00.004324 kubelet[2377]: I0813 00:33:00.004323 2377 state_mem.go:35] "Initializing new in-memory state store" Aug 13 00:33:00.009442 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Aug 13 00:33:00.018718 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 13 00:33:00.021674 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Aug 13 00:33:00.032112 kubelet[2377]: I0813 00:33:00.031973 2377 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 13 00:33:00.032279 kubelet[2377]: I0813 00:33:00.032259 2377 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 13 00:33:00.032399 kubelet[2377]: I0813 00:33:00.032327 2377 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 13 00:33:00.033576 kubelet[2377]: I0813 00:33:00.033553 2377 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 13 00:33:00.035165 kubelet[2377]: E0813 00:33:00.035141 2377 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372-1-0-4-15a6623c0c\" not found" Aug 13 00:33:00.102030 systemd[1]: Created slice kubepods-burstable-pod7083a26ff933664f33567a3bb2fe0187.slice - libcontainer container kubepods-burstable-pod7083a26ff933664f33567a3bb2fe0187.slice. Aug 13 00:33:00.124907 systemd[1]: Created slice kubepods-burstable-pod31ff89f84b5d7e2effe1140655f6b562.slice - libcontainer container kubepods-burstable-pod31ff89f84b5d7e2effe1140655f6b562.slice. 
Aug 13 00:33:00.134566 kubelet[2377]: I0813 00:33:00.134521 2377 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.135245 kubelet[2377]: E0813 00:33:00.135193 2377 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://95.217.135.102:6443/api/v1/nodes\": dial tcp 95.217.135.102:6443: connect: connection refused" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.137591 systemd[1]: Created slice kubepods-burstable-pod4664e7e52680640f7f77502f189cd141.slice - libcontainer container kubepods-burstable-pod4664e7e52680640f7f77502f189cd141.slice.
Aug 13 00:33:00.176328 kubelet[2377]: E0813 00:33:00.176276 2377 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.135.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-4-15a6623c0c?timeout=10s\": dial tcp 95.217.135.102:6443: connect: connection refused" interval="400ms"
Aug 13 00:33:00.265819 kubelet[2377]: I0813 00:33:00.265778 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7083a26ff933664f33567a3bb2fe0187-ca-certs\") pod \"kube-apiserver-ci-4372-1-0-4-15a6623c0c\" (UID: \"7083a26ff933664f33567a3bb2fe0187\") " pod="kube-system/kube-apiserver-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.265819 kubelet[2377]: I0813 00:33:00.265812 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7083a26ff933664f33567a3bb2fe0187-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-4-15a6623c0c\" (UID: \"7083a26ff933664f33567a3bb2fe0187\") " pod="kube-system/kube-apiserver-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.265819 kubelet[2377]: I0813 00:33:00.265828 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.265819 kubelet[2377]: I0813 00:33:00.265841 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.266187 kubelet[2377]: I0813 00:33:00.265853 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.266187 kubelet[2377]: I0813 00:33:00.265867 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4664e7e52680640f7f77502f189cd141-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-4-15a6623c0c\" (UID: \"4664e7e52680640f7f77502f189cd141\") " pod="kube-system/kube-scheduler-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.266187 kubelet[2377]: I0813 00:33:00.265879 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7083a26ff933664f33567a3bb2fe0187-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-4-15a6623c0c\" (UID: \"7083a26ff933664f33567a3bb2fe0187\") " pod="kube-system/kube-apiserver-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.266187 kubelet[2377]: I0813 00:33:00.265891 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-ca-certs\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.266187 kubelet[2377]: I0813 00:33:00.265904 2377 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.338253 kubelet[2377]: I0813 00:33:00.338189 2377 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.338736 kubelet[2377]: E0813 00:33:00.338692 2377 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://95.217.135.102:6443/api/v1/nodes\": dial tcp 95.217.135.102:6443: connect: connection refused" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.424487 containerd[1576]: time="2025-08-13T00:33:00.424305460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-4-15a6623c0c,Uid:7083a26ff933664f33567a3bb2fe0187,Namespace:kube-system,Attempt:0,}"
Aug 13 00:33:00.440998 containerd[1576]: time="2025-08-13T00:33:00.440930594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-4-15a6623c0c,Uid:31ff89f84b5d7e2effe1140655f6b562,Namespace:kube-system,Attempt:0,}"
Aug 13 00:33:00.442407 containerd[1576]: time="2025-08-13T00:33:00.442384821Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-4-15a6623c0c,Uid:4664e7e52680640f7f77502f189cd141,Namespace:kube-system,Attempt:0,}"
Aug 13 00:33:00.543565 containerd[1576]: time="2025-08-13T00:33:00.543485054Z" level=info msg="connecting to shim bfa7d6a90a1c89159048fa7fdeae4e4a373e95e39fd533112b921b662a0e1061" address="unix:///run/containerd/s/4de826e8cea274fb507dca86cfa0b92293aa72d35b3a39182c32a8bf74b8dd2e" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:00.544823 containerd[1576]: time="2025-08-13T00:33:00.544771306Z" level=info msg="connecting to shim 86701f2c980ec674df0f0a94ac7abd1d68ccc39690c5cfa23c9d88a7361e223b" address="unix:///run/containerd/s/bc2362218cd5c9df655db949998ea59f203449695e745dd0867721d70a323288" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:00.545539 containerd[1576]: time="2025-08-13T00:33:00.545426865Z" level=info msg="connecting to shim b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad" address="unix:///run/containerd/s/99e57e2f43d68cfb666dd0bfa9664b8d816dd2b2ef224f65b48073e8d3ffb3cf" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:00.576954 kubelet[2377]: E0813 00:33:00.576894 2377 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.135.102:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-4-15a6623c0c?timeout=10s\": dial tcp 95.217.135.102:6443: connect: connection refused" interval="800ms"
Aug 13 00:33:00.618616 systemd[1]: Started cri-containerd-86701f2c980ec674df0f0a94ac7abd1d68ccc39690c5cfa23c9d88a7361e223b.scope - libcontainer container 86701f2c980ec674df0f0a94ac7abd1d68ccc39690c5cfa23c9d88a7361e223b.
Aug 13 00:33:00.623845 systemd[1]: Started cri-containerd-b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad.scope - libcontainer container b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad.
Aug 13 00:33:00.625612 systemd[1]: Started cri-containerd-bfa7d6a90a1c89159048fa7fdeae4e4a373e95e39fd533112b921b662a0e1061.scope - libcontainer container bfa7d6a90a1c89159048fa7fdeae4e4a373e95e39fd533112b921b662a0e1061.
Aug 13 00:33:00.686482 containerd[1576]: time="2025-08-13T00:33:00.685814426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-4-15a6623c0c,Uid:4664e7e52680640f7f77502f189cd141,Namespace:kube-system,Attempt:0,} returns sandbox id \"86701f2c980ec674df0f0a94ac7abd1d68ccc39690c5cfa23c9d88a7361e223b\""
Aug 13 00:33:00.696772 containerd[1576]: time="2025-08-13T00:33:00.696747707Z" level=info msg="CreateContainer within sandbox \"86701f2c980ec674df0f0a94ac7abd1d68ccc39690c5cfa23c9d88a7361e223b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Aug 13 00:33:00.709143 containerd[1576]: time="2025-08-13T00:33:00.708949276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-4-15a6623c0c,Uid:7083a26ff933664f33567a3bb2fe0187,Namespace:kube-system,Attempt:0,} returns sandbox id \"bfa7d6a90a1c89159048fa7fdeae4e4a373e95e39fd533112b921b662a0e1061\""
Aug 13 00:33:00.713187 containerd[1576]: time="2025-08-13T00:33:00.712943496Z" level=info msg="CreateContainer within sandbox \"bfa7d6a90a1c89159048fa7fdeae4e4a373e95e39fd533112b921b662a0e1061\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Aug 13 00:33:00.720431 containerd[1576]: time="2025-08-13T00:33:00.720412281Z" level=info msg="Container 5fd55096bea8d750b3882ebb926b03a732bfe606e70ecfdb1ac1c42cc74818bc: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:00.724450 containerd[1576]: time="2025-08-13T00:33:00.724321761Z" level=info msg="Container 9451ae158ac6fd1db9d4f67cd924a2a70a3d57cabf77e31d346d5234f0c10a48: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:00.728547 containerd[1576]: time="2025-08-13T00:33:00.728525964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-4-15a6623c0c,Uid:31ff89f84b5d7e2effe1140655f6b562,Namespace:kube-system,Attempt:0,} returns sandbox id \"b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad\""
Aug 13 00:33:00.729188 containerd[1576]: time="2025-08-13T00:33:00.729151978Z" level=info msg="CreateContainer within sandbox \"86701f2c980ec674df0f0a94ac7abd1d68ccc39690c5cfa23c9d88a7361e223b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5fd55096bea8d750b3882ebb926b03a732bfe606e70ecfdb1ac1c42cc74818bc\""
Aug 13 00:33:00.730877 containerd[1576]: time="2025-08-13T00:33:00.730789980Z" level=info msg="StartContainer for \"5fd55096bea8d750b3882ebb926b03a732bfe606e70ecfdb1ac1c42cc74818bc\""
Aug 13 00:33:00.730999 containerd[1576]: time="2025-08-13T00:33:00.730981309Z" level=info msg="CreateContainer within sandbox \"b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Aug 13 00:33:00.732161 containerd[1576]: time="2025-08-13T00:33:00.732119413Z" level=info msg="connecting to shim 5fd55096bea8d750b3882ebb926b03a732bfe606e70ecfdb1ac1c42cc74818bc" address="unix:///run/containerd/s/bc2362218cd5c9df655db949998ea59f203449695e745dd0867721d70a323288" protocol=ttrpc version=3
Aug 13 00:33:00.734090 containerd[1576]: time="2025-08-13T00:33:00.734055603Z" level=info msg="CreateContainer within sandbox \"bfa7d6a90a1c89159048fa7fdeae4e4a373e95e39fd533112b921b662a0e1061\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9451ae158ac6fd1db9d4f67cd924a2a70a3d57cabf77e31d346d5234f0c10a48\""
Aug 13 00:33:00.734605 containerd[1576]: time="2025-08-13T00:33:00.734549610Z" level=info msg="StartContainer for \"9451ae158ac6fd1db9d4f67cd924a2a70a3d57cabf77e31d346d5234f0c10a48\""
Aug 13 00:33:00.735721 containerd[1576]: time="2025-08-13T00:33:00.735653479Z" level=info msg="connecting to shim 9451ae158ac6fd1db9d4f67cd924a2a70a3d57cabf77e31d346d5234f0c10a48" address="unix:///run/containerd/s/4de826e8cea274fb507dca86cfa0b92293aa72d35b3a39182c32a8bf74b8dd2e" protocol=ttrpc version=3
Aug 13 00:33:00.741286 containerd[1576]: time="2025-08-13T00:33:00.741240566Z" level=info msg="Container 4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:00.741975 kubelet[2377]: I0813 00:33:00.741785 2377 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.742581 kubelet[2377]: E0813 00:33:00.742565 2377 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://95.217.135.102:6443/api/v1/nodes\": dial tcp 95.217.135.102:6443: connect: connection refused" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:00.747976 containerd[1576]: time="2025-08-13T00:33:00.747942393Z" level=info msg="CreateContainer within sandbox \"b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152\""
Aug 13 00:33:00.748581 containerd[1576]: time="2025-08-13T00:33:00.748249729Z" level=info msg="StartContainer for \"4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152\""
Aug 13 00:33:00.750443 containerd[1576]: time="2025-08-13T00:33:00.750425769Z" level=info msg="connecting to shim 4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152" address="unix:///run/containerd/s/99e57e2f43d68cfb666dd0bfa9664b8d816dd2b2ef224f65b48073e8d3ffb3cf" protocol=ttrpc version=3
Aug 13 00:33:00.758702 systemd[1]: Started cri-containerd-5fd55096bea8d750b3882ebb926b03a732bfe606e70ecfdb1ac1c42cc74818bc.scope - libcontainer container 5fd55096bea8d750b3882ebb926b03a732bfe606e70ecfdb1ac1c42cc74818bc.
Aug 13 00:33:00.759919 systemd[1]: Started cri-containerd-9451ae158ac6fd1db9d4f67cd924a2a70a3d57cabf77e31d346d5234f0c10a48.scope - libcontainer container 9451ae158ac6fd1db9d4f67cd924a2a70a3d57cabf77e31d346d5234f0c10a48.
Aug 13 00:33:00.772519 systemd[1]: Started cri-containerd-4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152.scope - libcontainer container 4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152.
Aug 13 00:33:00.828282 containerd[1576]: time="2025-08-13T00:33:00.828176009Z" level=info msg="StartContainer for \"9451ae158ac6fd1db9d4f67cd924a2a70a3d57cabf77e31d346d5234f0c10a48\" returns successfully"
Aug 13 00:33:00.835271 containerd[1576]: time="2025-08-13T00:33:00.834247323Z" level=info msg="StartContainer for \"5fd55096bea8d750b3882ebb926b03a732bfe606e70ecfdb1ac1c42cc74818bc\" returns successfully"
Aug 13 00:33:00.857448 containerd[1576]: time="2025-08-13T00:33:00.857411438Z" level=info msg="StartContainer for \"4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152\" returns successfully"
Aug 13 00:33:01.545541 kubelet[2377]: I0813 00:33:01.545498 2377 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:01.956601 kubelet[2377]: E0813 00:33:01.956408 2377 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372-1-0-4-15a6623c0c\" not found" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:02.119309 kubelet[2377]: I0813 00:33:02.119248 2377 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:02.948836 kubelet[2377]: I0813 00:33:02.948729 2377 apiserver.go:52] "Watching apiserver"
Aug 13 00:33:02.964677 kubelet[2377]: I0813 00:33:02.964636 2377 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Aug 13 00:33:04.027910 systemd[1]: Reload requested from client PID 2646 ('systemctl') (unit session-7.scope)...
Aug 13 00:33:04.027930 systemd[1]: Reloading...
Aug 13 00:33:04.119417 zram_generator::config[2689]: No configuration found.
Aug 13 00:33:04.198117 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:33:04.307178 systemd[1]: Reloading finished in 278 ms.
Aug 13 00:33:04.335708 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:33:04.346429 systemd[1]: kubelet.service: Deactivated successfully.
Aug 13 00:33:04.346617 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:33:04.346669 systemd[1]: kubelet.service: Consumed 686ms CPU time, 126.1M memory peak.
Aug 13 00:33:04.348918 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:33:04.470804 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:33:04.477826 (kubelet)[2741]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Aug 13 00:33:04.533406 kubelet[2741]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:33:04.533406 kubelet[2741]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Aug 13 00:33:04.533406 kubelet[2741]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:33:04.533406 kubelet[2741]: I0813 00:33:04.532843 2741 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 13 00:33:04.542399 kubelet[2741]: I0813 00:33:04.542317 2741 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Aug 13 00:33:04.542399 kubelet[2741]: I0813 00:33:04.542339 2741 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 13 00:33:04.542931 kubelet[2741]: I0813 00:33:04.542906 2741 server.go:934] "Client rotation is on, will bootstrap in background"
Aug 13 00:33:04.545084 kubelet[2741]: I0813 00:33:04.544577 2741 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Aug 13 00:33:04.546481 kubelet[2741]: I0813 00:33:04.546449 2741 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 13 00:33:04.553837 kubelet[2741]: I0813 00:33:04.553816 2741 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Aug 13 00:33:04.559924 kubelet[2741]: I0813 00:33:04.559860 2741 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 13 00:33:04.559970 kubelet[2741]: I0813 00:33:04.559948 2741 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Aug 13 00:33:04.560277 kubelet[2741]: I0813 00:33:04.560023 2741 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 13 00:33:04.563216 kubelet[2741]: I0813 00:33:04.560046 2741 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-4-15a6623c0c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 13 00:33:04.563216 kubelet[2741]: I0813 00:33:04.561507 2741 topology_manager.go:138] "Creating topology manager with none policy"
Aug 13 00:33:04.563216 kubelet[2741]: I0813 00:33:04.561522 2741 container_manager_linux.go:300] "Creating device plugin manager"
Aug 13 00:33:04.563216 kubelet[2741]: I0813 00:33:04.561557 2741 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:33:04.563216 kubelet[2741]: I0813 00:33:04.561658 2741 kubelet.go:408] "Attempting to sync node with API server"
Aug 13 00:33:04.563468 kubelet[2741]: I0813 00:33:04.561669 2741 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 13 00:33:04.563468 kubelet[2741]: I0813 00:33:04.561697 2741 kubelet.go:314] "Adding apiserver pod source"
Aug 13 00:33:04.563468 kubelet[2741]: I0813 00:33:04.561706 2741 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 13 00:33:04.563468 kubelet[2741]: I0813 00:33:04.562871 2741 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Aug 13 00:33:04.563468 kubelet[2741]: I0813 00:33:04.563241 2741 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Aug 13 00:33:04.563835 kubelet[2741]: I0813 00:33:04.563808 2741 server.go:1274] "Started kubelet"
Aug 13 00:33:04.565836 kubelet[2741]: I0813 00:33:04.565656 2741 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 13 00:33:04.575281 kubelet[2741]: I0813 00:33:04.575166 2741 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Aug 13 00:33:04.576408 kubelet[2741]: I0813 00:33:04.576011 2741 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 13 00:33:04.577709 kubelet[2741]: I0813 00:33:04.576976 2741 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 13 00:33:04.577709 kubelet[2741]: I0813 00:33:04.577240 2741 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Aug 13 00:33:04.580067 kubelet[2741]: I0813 00:33:04.580057 2741 volume_manager.go:289] "Starting Kubelet Volume Manager"
Aug 13 00:33:04.580558 kubelet[2741]: E0813 00:33:04.580546 2741 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-1-0-4-15a6623c0c\" not found"
Aug 13 00:33:04.583358 kubelet[2741]: I0813 00:33:04.583103 2741 server.go:449] "Adding debug handlers to kubelet server"
Aug 13 00:33:04.584865 kubelet[2741]: I0813 00:33:04.584297 2741 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Aug 13 00:33:04.585101 kubelet[2741]: I0813 00:33:04.585004 2741 reconciler.go:26] "Reconciler: start to sync state"
Aug 13 00:33:04.588025 kubelet[2741]: I0813 00:33:04.587919 2741 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Aug 13 00:33:04.589197 kubelet[2741]: I0813 00:33:04.589117 2741 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Aug 13 00:33:04.589276 kubelet[2741]: I0813 00:33:04.589267 2741 status_manager.go:217] "Starting to sync pod status with apiserver"
Aug 13 00:33:04.589524 kubelet[2741]: I0813 00:33:04.589408 2741 kubelet.go:2321] "Starting kubelet main sync loop"
Aug 13 00:33:04.589524 kubelet[2741]: E0813 00:33:04.589446 2741 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Aug 13 00:33:04.595135 kubelet[2741]: I0813 00:33:04.595106 2741 factory.go:221] Registration of the systemd container factory successfully
Aug 13 00:33:04.595214 kubelet[2741]: I0813 00:33:04.595189 2741 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 13 00:33:04.598431 kubelet[2741]: I0813 00:33:04.598406 2741 factory.go:221] Registration of the containerd container factory successfully
Aug 13 00:33:04.614136 kubelet[2741]: E0813 00:33:04.614086 2741 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Aug 13 00:33:04.648871 kubelet[2741]: I0813 00:33:04.648850 2741 cpu_manager.go:214] "Starting CPU manager" policy="none"
Aug 13 00:33:04.649222 kubelet[2741]: I0813 00:33:04.649018 2741 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Aug 13 00:33:04.649222 kubelet[2741]: I0813 00:33:04.649034 2741 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:33:04.649222 kubelet[2741]: I0813 00:33:04.649145 2741 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Aug 13 00:33:04.649222 kubelet[2741]: I0813 00:33:04.649154 2741 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Aug 13 00:33:04.649222 kubelet[2741]: I0813 00:33:04.649169 2741 policy_none.go:49] "None policy: Start"
Aug 13 00:33:04.650591 kubelet[2741]: I0813 00:33:04.650580 2741 memory_manager.go:170] "Starting memorymanager" policy="None"
Aug 13 00:33:04.652288 kubelet[2741]: I0813 00:33:04.652110 2741 state_mem.go:35] "Initializing new in-memory state store"
Aug 13 00:33:04.652288 kubelet[2741]: I0813 00:33:04.652242 2741 state_mem.go:75] "Updated machine memory state"
Aug 13 00:33:04.657257 kubelet[2741]: I0813 00:33:04.655811 2741 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Aug 13 00:33:04.657257 kubelet[2741]: I0813 00:33:04.655953 2741 eviction_manager.go:189] "Eviction manager: starting control loop"
Aug 13 00:33:04.657257 kubelet[2741]: I0813 00:33:04.655962 2741 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Aug 13 00:33:04.657257 kubelet[2741]: I0813 00:33:04.656323 2741 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Aug 13 00:33:04.758864 kubelet[2741]: I0813 00:33:04.758834 2741 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:04.768687 kubelet[2741]: I0813 00:33:04.768647 2741 kubelet_node_status.go:111] "Node was previously registered" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:04.768846 kubelet[2741]: I0813 00:33:04.768803 2741 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:04.886619 kubelet[2741]: I0813 00:33:04.886577 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:04.886619 kubelet[2741]: I0813 00:33:04.886619 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4664e7e52680640f7f77502f189cd141-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-4-15a6623c0c\" (UID: \"4664e7e52680640f7f77502f189cd141\") " pod="kube-system/kube-scheduler-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:04.886619 kubelet[2741]: I0813 00:33:04.886657 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7083a26ff933664f33567a3bb2fe0187-ca-certs\") pod \"kube-apiserver-ci-4372-1-0-4-15a6623c0c\" (UID: \"7083a26ff933664f33567a3bb2fe0187\") " pod="kube-system/kube-apiserver-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:04.886619 kubelet[2741]: I0813 00:33:04.886691 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7083a26ff933664f33567a3bb2fe0187-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-4-15a6623c0c\" (UID: \"7083a26ff933664f33567a3bb2fe0187\") " pod="kube-system/kube-apiserver-ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:04.887006 kubelet[2741]: I0813
00:33:04.886747 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7083a26ff933664f33567a3bb2fe0187-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-4-15a6623c0c\" (UID: \"7083a26ff933664f33567a3bb2fe0187\") " pod="kube-system/kube-apiserver-ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:04.887006 kubelet[2741]: I0813 00:33:04.886769 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:04.887006 kubelet[2741]: I0813 00:33:04.886800 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:04.887006 kubelet[2741]: I0813 00:33:04.886825 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:04.887006 kubelet[2741]: I0813 00:33:04.886845 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/31ff89f84b5d7e2effe1140655f6b562-ca-certs\") pod 
\"kube-controller-manager-ci-4372-1-0-4-15a6623c0c\" (UID: \"31ff89f84b5d7e2effe1140655f6b562\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:05.574180 kubelet[2741]: I0813 00:33:05.574062 2741 apiserver.go:52] "Watching apiserver" Aug 13 00:33:05.585269 kubelet[2741]: I0813 00:33:05.585227 2741 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 13 00:33:05.664767 kubelet[2741]: I0813 00:33:05.664561 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372-1-0-4-15a6623c0c" podStartSLOduration=1.664545835 podStartE2EDuration="1.664545835s" podCreationTimestamp="2025-08-13 00:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:33:05.664429206 +0000 UTC m=+1.181014173" watchObservedRunningTime="2025-08-13 00:33:05.664545835 +0000 UTC m=+1.181130801" Aug 13 00:33:05.684467 kubelet[2741]: I0813 00:33:05.684204 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372-1-0-4-15a6623c0c" podStartSLOduration=1.68418494 podStartE2EDuration="1.68418494s" podCreationTimestamp="2025-08-13 00:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:33:05.675197958 +0000 UTC m=+1.191782925" watchObservedRunningTime="2025-08-13 00:33:05.68418494 +0000 UTC m=+1.200769907" Aug 13 00:33:05.696216 kubelet[2741]: I0813 00:33:05.696168 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372-1-0-4-15a6623c0c" podStartSLOduration=1.69614167 podStartE2EDuration="1.69614167s" podCreationTimestamp="2025-08-13 00:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-08-13 00:33:05.685014876 +0000 UTC m=+1.201599843" watchObservedRunningTime="2025-08-13 00:33:05.69614167 +0000 UTC m=+1.212726647" Aug 13 00:33:09.073373 update_engine[1534]: I20250813 00:33:09.071409 1534 update_attempter.cc:509] Updating boot flags... Aug 13 00:33:09.217623 kubelet[2741]: I0813 00:33:09.217585 2741 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 13 00:33:09.218095 containerd[1576]: time="2025-08-13T00:33:09.218033566Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 13 00:33:09.218339 kubelet[2741]: I0813 00:33:09.218314 2741 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 13 00:33:09.825983 systemd[1]: Created slice kubepods-besteffort-pod2ee24c11_bf6c_434d_98eb_6ba788af84e2.slice - libcontainer container kubepods-besteffort-pod2ee24c11_bf6c_434d_98eb_6ba788af84e2.slice. Aug 13 00:33:09.924234 kubelet[2741]: I0813 00:33:09.924169 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wst6p\" (UniqueName: \"kubernetes.io/projected/2ee24c11-bf6c-434d-98eb-6ba788af84e2-kube-api-access-wst6p\") pod \"kube-proxy-62csm\" (UID: \"2ee24c11-bf6c-434d-98eb-6ba788af84e2\") " pod="kube-system/kube-proxy-62csm" Aug 13 00:33:09.924234 kubelet[2741]: I0813 00:33:09.924243 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2ee24c11-bf6c-434d-98eb-6ba788af84e2-xtables-lock\") pod \"kube-proxy-62csm\" (UID: \"2ee24c11-bf6c-434d-98eb-6ba788af84e2\") " pod="kube-system/kube-proxy-62csm" Aug 13 00:33:09.924234 kubelet[2741]: I0813 00:33:09.924265 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: 
\"kubernetes.io/configmap/2ee24c11-bf6c-434d-98eb-6ba788af84e2-kube-proxy\") pod \"kube-proxy-62csm\" (UID: \"2ee24c11-bf6c-434d-98eb-6ba788af84e2\") " pod="kube-system/kube-proxy-62csm" Aug 13 00:33:09.924576 kubelet[2741]: I0813 00:33:09.924280 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2ee24c11-bf6c-434d-98eb-6ba788af84e2-lib-modules\") pod \"kube-proxy-62csm\" (UID: \"2ee24c11-bf6c-434d-98eb-6ba788af84e2\") " pod="kube-system/kube-proxy-62csm" Aug 13 00:33:10.136456 containerd[1576]: time="2025-08-13T00:33:10.136331107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-62csm,Uid:2ee24c11-bf6c-434d-98eb-6ba788af84e2,Namespace:kube-system,Attempt:0,}" Aug 13 00:33:10.169857 containerd[1576]: time="2025-08-13T00:33:10.169665733Z" level=info msg="connecting to shim f4834565d3758cd58e2b89143e89bd4965890861b1fa5a29a47ff8d0a77d58ab" address="unix:///run/containerd/s/90fae5ccb1da2ffa36ed0715c4de3a46d2caef7702e655937b922d16d52a6dff" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:33:10.215601 systemd[1]: Started cri-containerd-f4834565d3758cd58e2b89143e89bd4965890861b1fa5a29a47ff8d0a77d58ab.scope - libcontainer container f4834565d3758cd58e2b89143e89bd4965890861b1fa5a29a47ff8d0a77d58ab. 
Aug 13 00:33:10.257534 containerd[1576]: time="2025-08-13T00:33:10.257494060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-62csm,Uid:2ee24c11-bf6c-434d-98eb-6ba788af84e2,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4834565d3758cd58e2b89143e89bd4965890861b1fa5a29a47ff8d0a77d58ab\""
Aug 13 00:33:10.273542 containerd[1576]: time="2025-08-13T00:33:10.272482775Z" level=info msg="CreateContainer within sandbox \"f4834565d3758cd58e2b89143e89bd4965890861b1fa5a29a47ff8d0a77d58ab\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Aug 13 00:33:10.293720 containerd[1576]: time="2025-08-13T00:33:10.293259785Z" level=info msg="Container 0aa109cd10eae0d802794328a1b7b5d03f17b036711d5376dfd8f130ab65fb4a: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:10.316897 systemd[1]: Created slice kubepods-besteffort-pode3575132_684c_4c9f_a043_3294437c7e84.slice - libcontainer container kubepods-besteffort-pode3575132_684c_4c9f_a043_3294437c7e84.slice.
Aug 13 00:33:10.318903 containerd[1576]: time="2025-08-13T00:33:10.318869225Z" level=info msg="CreateContainer within sandbox \"f4834565d3758cd58e2b89143e89bd4965890861b1fa5a29a47ff8d0a77d58ab\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0aa109cd10eae0d802794328a1b7b5d03f17b036711d5376dfd8f130ab65fb4a\""
Aug 13 00:33:10.321211 containerd[1576]: time="2025-08-13T00:33:10.321176511Z" level=info msg="StartContainer for \"0aa109cd10eae0d802794328a1b7b5d03f17b036711d5376dfd8f130ab65fb4a\""
Aug 13 00:33:10.322836 containerd[1576]: time="2025-08-13T00:33:10.322804564Z" level=info msg="connecting to shim 0aa109cd10eae0d802794328a1b7b5d03f17b036711d5376dfd8f130ab65fb4a" address="unix:///run/containerd/s/90fae5ccb1da2ffa36ed0715c4de3a46d2caef7702e655937b922d16d52a6dff" protocol=ttrpc version=3
Aug 13 00:33:10.326016 kubelet[2741]: I0813 00:33:10.325985 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e3575132-684c-4c9f-a043-3294437c7e84-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-tw2hb\" (UID: \"e3575132-684c-4c9f-a043-3294437c7e84\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-tw2hb"
Aug 13 00:33:10.326256 kubelet[2741]: I0813 00:33:10.326026 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c6rm\" (UniqueName: \"kubernetes.io/projected/e3575132-684c-4c9f-a043-3294437c7e84-kube-api-access-6c6rm\") pod \"tigera-operator-5bf8dfcb4-tw2hb\" (UID: \"e3575132-684c-4c9f-a043-3294437c7e84\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-tw2hb"
Aug 13 00:33:10.339481 systemd[1]: Started cri-containerd-0aa109cd10eae0d802794328a1b7b5d03f17b036711d5376dfd8f130ab65fb4a.scope - libcontainer container 0aa109cd10eae0d802794328a1b7b5d03f17b036711d5376dfd8f130ab65fb4a.
Aug 13 00:33:10.374225 containerd[1576]: time="2025-08-13T00:33:10.374157586Z" level=info msg="StartContainer for \"0aa109cd10eae0d802794328a1b7b5d03f17b036711d5376dfd8f130ab65fb4a\" returns successfully"
Aug 13 00:33:10.622577 containerd[1576]: time="2025-08-13T00:33:10.622535530Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-tw2hb,Uid:e3575132-684c-4c9f-a043-3294437c7e84,Namespace:tigera-operator,Attempt:0,}"
Aug 13 00:33:10.643762 containerd[1576]: time="2025-08-13T00:33:10.643489581Z" level=info msg="connecting to shim 14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde" address="unix:///run/containerd/s/fdf26b845c4c3c7999b60337cc6102bb73ce512396db6e7807bc5f6a7d8d15da" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:10.676671 systemd[1]: Started cri-containerd-14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde.scope - libcontainer container 14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde.
Aug 13 00:33:10.733289 containerd[1576]: time="2025-08-13T00:33:10.733240243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-tw2hb,Uid:e3575132-684c-4c9f-a043-3294437c7e84,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde\""
Aug 13 00:33:10.736732 containerd[1576]: time="2025-08-13T00:33:10.736690192Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\""
Aug 13 00:33:11.935814 kubelet[2741]: I0813 00:33:11.935566 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-62csm" podStartSLOduration=2.9355489439999998 podStartE2EDuration="2.935548944s" podCreationTimestamp="2025-08-13 00:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:33:10.665905253 +0000 UTC m=+6.182490230" watchObservedRunningTime="2025-08-13 00:33:11.935548944 +0000 UTC m=+7.452133911"
Aug 13 00:33:12.529629 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount877012278.mount: Deactivated successfully.
Aug 13 00:33:12.968619 containerd[1576]: time="2025-08-13T00:33:12.968564112Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:12.969527 containerd[1576]: time="2025-08-13T00:33:12.969495188Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543"
Aug 13 00:33:12.970467 containerd[1576]: time="2025-08-13T00:33:12.970428338Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:12.972473 containerd[1576]: time="2025-08-13T00:33:12.972436083Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:12.973270 containerd[1576]: time="2025-08-13T00:33:12.973033924Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.236315029s"
Aug 13 00:33:12.973270 containerd[1576]: time="2025-08-13T00:33:12.973076674Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\""
Aug 13 00:33:12.974808 containerd[1576]: time="2025-08-13T00:33:12.974789576Z" level=info msg="CreateContainer within sandbox \"14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Aug 13 00:33:12.982963 containerd[1576]: time="2025-08-13T00:33:12.982944677Z" level=info msg="Container 6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:12.990142 containerd[1576]: time="2025-08-13T00:33:12.990090778Z" level=info msg="CreateContainer within sandbox \"14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c\""
Aug 13 00:33:12.991507 containerd[1576]: time="2025-08-13T00:33:12.991486093Z" level=info msg="StartContainer for \"6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c\""
Aug 13 00:33:12.992170 containerd[1576]: time="2025-08-13T00:33:12.992138457Z" level=info msg="connecting to shim 6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c" address="unix:///run/containerd/s/fdf26b845c4c3c7999b60337cc6102bb73ce512396db6e7807bc5f6a7d8d15da" protocol=ttrpc version=3
Aug 13 00:33:13.010499 systemd[1]: Started cri-containerd-6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c.scope - libcontainer container 6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c.
Aug 13 00:33:13.038071 containerd[1576]: time="2025-08-13T00:33:13.038008738Z" level=info msg="StartContainer for \"6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c\" returns successfully"
Aug 13 00:33:13.668037 kubelet[2741]: I0813 00:33:13.667502 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-tw2hb" podStartSLOduration=1.428729817 podStartE2EDuration="3.667485692s" podCreationTimestamp="2025-08-13 00:33:10 +0000 UTC" firstStartedPulling="2025-08-13 00:33:10.734996646 +0000 UTC m=+6.251581613" lastFinishedPulling="2025-08-13 00:33:12.973752521 +0000 UTC m=+8.490337488" observedRunningTime="2025-08-13 00:33:13.667247004 +0000 UTC m=+9.183831971" watchObservedRunningTime="2025-08-13 00:33:13.667485692 +0000 UTC m=+9.184070669"
Aug 13 00:33:18.615900 sudo[1819]: pam_unix(sudo:session): session closed for user root
Aug 13 00:33:18.774376 sshd[1818]: Connection closed by 139.178.89.65 port 48804
Aug 13 00:33:18.773184 sshd-session[1811]: pam_unix(sshd:session): session closed for user core
Aug 13 00:33:18.777026 systemd-logind[1532]: Session 7 logged out. Waiting for processes to exit.
Aug 13 00:33:18.778059 systemd[1]: sshd@6-95.217.135.102:22-139.178.89.65:48804.service: Deactivated successfully.
Aug 13 00:33:18.781483 systemd[1]: session-7.scope: Deactivated successfully.
Aug 13 00:33:18.781877 systemd[1]: session-7.scope: Consumed 5.261s CPU time, 155.9M memory peak.
Aug 13 00:33:18.784990 systemd-logind[1532]: Removed session 7.
Aug 13 00:33:22.932329 systemd[1]: Created slice kubepods-besteffort-pod2e25a92b_a0b7_4b50_a9b5_c970a031d11b.slice - libcontainer container kubepods-besteffort-pod2e25a92b_a0b7_4b50_a9b5_c970a031d11b.slice.
Aug 13 00:33:23.002022 kubelet[2741]: I0813 00:33:23.001971 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/2e25a92b-a0b7-4b50-a9b5-c970a031d11b-typha-certs\") pod \"calico-typha-579cf4b897-xgt2b\" (UID: \"2e25a92b-a0b7-4b50-a9b5-c970a031d11b\") " pod="calico-system/calico-typha-579cf4b897-xgt2b"
Aug 13 00:33:23.002367 kubelet[2741]: I0813 00:33:23.002028 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zvdm\" (UniqueName: \"kubernetes.io/projected/2e25a92b-a0b7-4b50-a9b5-c970a031d11b-kube-api-access-2zvdm\") pod \"calico-typha-579cf4b897-xgt2b\" (UID: \"2e25a92b-a0b7-4b50-a9b5-c970a031d11b\") " pod="calico-system/calico-typha-579cf4b897-xgt2b"
Aug 13 00:33:23.002367 kubelet[2741]: I0813 00:33:23.002063 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e25a92b-a0b7-4b50-a9b5-c970a031d11b-tigera-ca-bundle\") pod \"calico-typha-579cf4b897-xgt2b\" (UID: \"2e25a92b-a0b7-4b50-a9b5-c970a031d11b\") " pod="calico-system/calico-typha-579cf4b897-xgt2b"
Aug 13 00:33:23.242568 containerd[1576]: time="2025-08-13T00:33:23.242162836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-579cf4b897-xgt2b,Uid:2e25a92b-a0b7-4b50-a9b5-c970a031d11b,Namespace:calico-system,Attempt:0,}"
Aug 13 00:33:23.274149 systemd[1]: Created slice kubepods-besteffort-poda28e0ca2_e337_48c9_99eb_9bfe7147a17b.slice - libcontainer container kubepods-besteffort-poda28e0ca2_e337_48c9_99eb_9bfe7147a17b.slice.
Aug 13 00:33:23.277800 containerd[1576]: time="2025-08-13T00:33:23.277754739Z" level=info msg="connecting to shim 51741654e9d64418483264591c2a35718c33dfa24291b21699e1140b48ec0e03" address="unix:///run/containerd/s/1c05f798db8ca7be9e40b6323de1f30d7c3e99f099411dd92976139a7519457a" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:23.303792 kubelet[2741]: I0813 00:33:23.303760 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-var-lib-calico\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303894 kubelet[2741]: I0813 00:33:23.303797 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-cni-net-dir\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303894 kubelet[2741]: I0813 00:33:23.303824 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-xtables-lock\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303894 kubelet[2741]: I0813 00:33:23.303838 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-cni-bin-dir\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303894 kubelet[2741]: I0813 00:33:23.303851 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-flexvol-driver-host\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303894 kubelet[2741]: I0813 00:33:23.303866 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-policysync\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303986 kubelet[2741]: I0813 00:33:23.303881 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-tigera-ca-bundle\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303986 kubelet[2741]: I0813 00:33:23.303894 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-lib-modules\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303986 kubelet[2741]: I0813 00:33:23.303906 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-var-run-calico\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303986 kubelet[2741]: I0813 00:33:23.303918 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27n4b\" (UniqueName: \"kubernetes.io/projected/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-kube-api-access-27n4b\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.303986 kubelet[2741]: I0813 00:33:23.303932 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-cni-log-dir\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.304069 kubelet[2741]: I0813 00:33:23.303944 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a28e0ca2-e337-48c9-99eb-9bfe7147a17b-node-certs\") pod \"calico-node-4jt7z\" (UID: \"a28e0ca2-e337-48c9-99eb-9bfe7147a17b\") " pod="calico-system/calico-node-4jt7z"
Aug 13 00:33:23.306508 systemd[1]: Started cri-containerd-51741654e9d64418483264591c2a35718c33dfa24291b21699e1140b48ec0e03.scope - libcontainer container 51741654e9d64418483264591c2a35718c33dfa24291b21699e1140b48ec0e03.
Aug 13 00:33:23.353735 containerd[1576]: time="2025-08-13T00:33:23.353692763Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-579cf4b897-xgt2b,Uid:2e25a92b-a0b7-4b50-a9b5-c970a031d11b,Namespace:calico-system,Attempt:0,} returns sandbox id \"51741654e9d64418483264591c2a35718c33dfa24291b21699e1140b48ec0e03\""
Aug 13 00:33:23.356008 containerd[1576]: time="2025-08-13T00:33:23.355980461Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\""
Aug 13 00:33:23.412382 kubelet[2741]: E0813 00:33:23.412126 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.412382 kubelet[2741]: W0813 00:33:23.412158 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.412382 kubelet[2741]: E0813 00:33:23.412184 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.419099 kubelet[2741]: E0813 00:33:23.419046 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.419099 kubelet[2741]: W0813 00:33:23.419059 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.419099 kubelet[2741]: E0813 00:33:23.419076 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.500549 kubelet[2741]: E0813 00:33:23.499406 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r465b" podUID="3e889cea-019a-4296-b7d3-2bcaff0d62fa"
Aug 13 00:33:23.501323 kubelet[2741]: E0813 00:33:23.501163 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.501678 kubelet[2741]: W0813 00:33:23.501590 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.501678 kubelet[2741]: E0813 00:33:23.501611 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.502741 kubelet[2741]: E0813 00:33:23.502342 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.502741 kubelet[2741]: W0813 00:33:23.502592 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.502741 kubelet[2741]: E0813 00:33:23.502607 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.503645 kubelet[2741]: E0813 00:33:23.503617 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.503801 kubelet[2741]: W0813 00:33:23.503629 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.503801 kubelet[2741]: E0813 00:33:23.503707 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.504271 kubelet[2741]: E0813 00:33:23.504145 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.504271 kubelet[2741]: W0813 00:33:23.504159 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.504271 kubelet[2741]: E0813 00:33:23.504172 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.505134 kubelet[2741]: E0813 00:33:23.505050 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.505134 kubelet[2741]: W0813 00:33:23.505061 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.505134 kubelet[2741]: E0813 00:33:23.505070 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.505390 kubelet[2741]: E0813 00:33:23.505329 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.505746 kubelet[2741]: W0813 00:33:23.505529 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.505746 kubelet[2741]: E0813 00:33:23.505601 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.506397 kubelet[2741]: E0813 00:33:23.505941 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.506397 kubelet[2741]: W0813 00:33:23.505953 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.506397 kubelet[2741]: E0813 00:33:23.505962 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.506603 kubelet[2741]: E0813 00:33:23.506540 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.506603 kubelet[2741]: W0813 00:33:23.506552 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.506603 kubelet[2741]: E0813 00:33:23.506563 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Aug 13 00:33:23.506741 kubelet[2741]: E0813 00:33:23.506725 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.506741 kubelet[2741]: W0813 00:33:23.506739 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.506895 kubelet[2741]: E0813 00:33:23.506748 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.508408 kubelet[2741]: E0813 00:33:23.508385 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.508408 kubelet[2741]: W0813 00:33:23.508401 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.508603 kubelet[2741]: E0813 00:33:23.508422 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.508603 kubelet[2741]: E0813 00:33:23.508561 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.508603 kubelet[2741]: W0813 00:33:23.508568 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.508603 kubelet[2741]: E0813 00:33:23.508578 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.508759 kubelet[2741]: E0813 00:33:23.508692 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.508759 kubelet[2741]: W0813 00:33:23.508701 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.508759 kubelet[2741]: E0813 00:33:23.508709 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.508880 kubelet[2741]: E0813 00:33:23.508861 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.508880 kubelet[2741]: W0813 00:33:23.508874 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.508936 kubelet[2741]: E0813 00:33:23.508883 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.509005 kubelet[2741]: E0813 00:33:23.508990 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.509005 kubelet[2741]: W0813 00:33:23.509003 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.509049 kubelet[2741]: E0813 00:33:23.509010 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.509131 kubelet[2741]: E0813 00:33:23.509117 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.509131 kubelet[2741]: W0813 00:33:23.509129 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.509276 kubelet[2741]: E0813 00:33:23.509137 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.509276 kubelet[2741]: E0813 00:33:23.509244 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.509276 kubelet[2741]: W0813 00:33:23.509251 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.509276 kubelet[2741]: E0813 00:33:23.509259 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.509460 kubelet[2741]: E0813 00:33:23.509444 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.509460 kubelet[2741]: W0813 00:33:23.509451 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.509460 kubelet[2741]: E0813 00:33:23.509459 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.509667 kubelet[2741]: E0813 00:33:23.509565 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.509667 kubelet[2741]: W0813 00:33:23.509574 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.509667 kubelet[2741]: E0813 00:33:23.509581 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.509748 kubelet[2741]: E0813 00:33:23.509693 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.509748 kubelet[2741]: W0813 00:33:23.509699 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.509748 kubelet[2741]: E0813 00:33:23.509706 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.509976 kubelet[2741]: E0813 00:33:23.509808 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.509976 kubelet[2741]: W0813 00:33:23.509817 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.509976 kubelet[2741]: E0813 00:33:23.509824 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.510086 kubelet[2741]: E0813 00:33:23.510023 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.510086 kubelet[2741]: W0813 00:33:23.510030 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.510086 kubelet[2741]: E0813 00:33:23.510037 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.510086 kubelet[2741]: I0813 00:33:23.510068 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e889cea-019a-4296-b7d3-2bcaff0d62fa-kubelet-dir\") pod \"csi-node-driver-r465b\" (UID: \"3e889cea-019a-4296-b7d3-2bcaff0d62fa\") " pod="calico-system/csi-node-driver-r465b" Aug 13 00:33:23.511009 kubelet[2741]: E0813 00:33:23.510989 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.511009 kubelet[2741]: W0813 00:33:23.511004 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.511089 kubelet[2741]: E0813 00:33:23.511025 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.511089 kubelet[2741]: I0813 00:33:23.511038 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3e889cea-019a-4296-b7d3-2bcaff0d62fa-varrun\") pod \"csi-node-driver-r465b\" (UID: \"3e889cea-019a-4296-b7d3-2bcaff0d62fa\") " pod="calico-system/csi-node-driver-r465b" Aug 13 00:33:23.511323 kubelet[2741]: E0813 00:33:23.511191 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.511323 kubelet[2741]: W0813 00:33:23.511198 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.511323 kubelet[2741]: E0813 00:33:23.511223 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.511323 kubelet[2741]: I0813 00:33:23.511235 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdmjs\" (UniqueName: \"kubernetes.io/projected/3e889cea-019a-4296-b7d3-2bcaff0d62fa-kube-api-access-kdmjs\") pod \"csi-node-driver-r465b\" (UID: \"3e889cea-019a-4296-b7d3-2bcaff0d62fa\") " pod="calico-system/csi-node-driver-r465b" Aug 13 00:33:23.511563 kubelet[2741]: E0813 00:33:23.511387 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.511563 kubelet[2741]: W0813 00:33:23.511394 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.511563 kubelet[2741]: E0813 00:33:23.511480 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.511563 kubelet[2741]: I0813 00:33:23.511499 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e889cea-019a-4296-b7d3-2bcaff0d62fa-socket-dir\") pod \"csi-node-driver-r465b\" (UID: \"3e889cea-019a-4296-b7d3-2bcaff0d62fa\") " pod="calico-system/csi-node-driver-r465b" Aug 13 00:33:23.511563 kubelet[2741]: E0813 00:33:23.511537 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.511563 kubelet[2741]: W0813 00:33:23.511542 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.511859 kubelet[2741]: E0813 00:33:23.511636 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.511859 kubelet[2741]: E0813 00:33:23.511676 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.511859 kubelet[2741]: W0813 00:33:23.511681 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.511859 kubelet[2741]: E0813 00:33:23.511698 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.511859 kubelet[2741]: E0813 00:33:23.511812 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.511859 kubelet[2741]: W0813 00:33:23.511818 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.511859 kubelet[2741]: E0813 00:33:23.511826 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.512262 kubelet[2741]: E0813 00:33:23.511922 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512262 kubelet[2741]: W0813 00:33:23.511927 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512262 kubelet[2741]: E0813 00:33:23.511943 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.512262 kubelet[2741]: I0813 00:33:23.511954 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e889cea-019a-4296-b7d3-2bcaff0d62fa-registration-dir\") pod \"csi-node-driver-r465b\" (UID: \"3e889cea-019a-4296-b7d3-2bcaff0d62fa\") " pod="calico-system/csi-node-driver-r465b" Aug 13 00:33:23.512262 kubelet[2741]: E0813 00:33:23.512069 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512262 kubelet[2741]: W0813 00:33:23.512076 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512262 kubelet[2741]: E0813 00:33:23.512093 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.512262 kubelet[2741]: E0813 00:33:23.512200 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512262 kubelet[2741]: W0813 00:33:23.512205 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512695 kubelet[2741]: E0813 00:33:23.512211 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.512695 kubelet[2741]: E0813 00:33:23.512333 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512695 kubelet[2741]: W0813 00:33:23.512340 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512695 kubelet[2741]: E0813 00:33:23.512462 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.512695 kubelet[2741]: E0813 00:33:23.512500 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512695 kubelet[2741]: W0813 00:33:23.512505 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512695 kubelet[2741]: E0813 00:33:23.512511 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.512695 kubelet[2741]: E0813 00:33:23.512612 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512695 kubelet[2741]: W0813 00:33:23.512618 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512695 kubelet[2741]: E0813 00:33:23.512624 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.512901 kubelet[2741]: E0813 00:33:23.512740 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512901 kubelet[2741]: W0813 00:33:23.512746 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512901 kubelet[2741]: E0813 00:33:23.512752 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.512901 kubelet[2741]: E0813 00:33:23.512879 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.512901 kubelet[2741]: W0813 00:33:23.512886 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.512901 kubelet[2741]: E0813 00:33:23.512893 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.581456 containerd[1576]: time="2025-08-13T00:33:23.581387467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4jt7z,Uid:a28e0ca2-e337-48c9-99eb-9bfe7147a17b,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:23.608401 containerd[1576]: time="2025-08-13T00:33:23.608281301Z" level=info msg="connecting to shim f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb" address="unix:///run/containerd/s/5bb22841eafecc86f5bd4e211cd35e1a349f68b33e815bc3efaca3918574118e" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:33:23.612662 kubelet[2741]: E0813 00:33:23.612645 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.612821 kubelet[2741]: W0813 00:33:23.612745 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.612821 kubelet[2741]: E0813 00:33:23.612768 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.613333 kubelet[2741]: E0813 00:33:23.613308 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.613333 kubelet[2741]: W0813 00:33:23.613320 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.614239 kubelet[2741]: E0813 00:33:23.613463 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.616456 kubelet[2741]: E0813 00:33:23.616439 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.616610 kubelet[2741]: W0813 00:33:23.616521 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.616703 kubelet[2741]: E0813 00:33:23.616665 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.617016 kubelet[2741]: E0813 00:33:23.616898 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.617016 kubelet[2741]: W0813 00:33:23.616956 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.617370 kubelet[2741]: E0813 00:33:23.617284 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.617850 kubelet[2741]: E0813 00:33:23.617827 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.618257 kubelet[2741]: W0813 00:33:23.617995 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.618257 kubelet[2741]: E0813 00:33:23.618237 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.619293 kubelet[2741]: E0813 00:33:23.619266 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.619293 kubelet[2741]: W0813 00:33:23.619279 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.619487 kubelet[2741]: E0813 00:33:23.619474 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.620136 kubelet[2741]: E0813 00:33:23.620111 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.620136 kubelet[2741]: W0813 00:33:23.620122 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.620397 kubelet[2741]: E0813 00:33:23.620374 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:23.621526 kubelet[2741]: E0813 00:33:23.621502 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.621526 kubelet[2741]: W0813 00:33:23.621513 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.621702 kubelet[2741]: E0813 00:33:23.621683 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:23.621883 kubelet[2741]: E0813 00:33:23.621855 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:23.621883 kubelet[2741]: W0813 00:33:23.621867 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:23.622069 kubelet[2741]: E0813 00:33:23.622050 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Aug 13 00:33:23.622314 kubelet[2741]: E0813 00:33:23.622291 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.622314 kubelet[2741]: W0813 00:33:23.622302 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.623924 kubelet[2741]: E0813 00:33:23.623870 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.632527 systemd[1]: Started cri-containerd-f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb.scope - libcontainer container f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb.
Aug 13 00:33:23.637243 kubelet[2741]: E0813 00:33:23.637229 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:23.637473 kubelet[2741]: W0813 00:33:23.637275 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:23.637473 kubelet[2741]: E0813 00:33:23.637287 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:33:23.654132 containerd[1576]: time="2025-08-13T00:33:23.654092657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-4jt7z,Uid:a28e0ca2-e337-48c9-99eb-9bfe7147a17b,Namespace:calico-system,Attempt:0,} returns sandbox id \"f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb\""
Aug 13 00:33:25.061028 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2253809160.mount: Deactivated successfully.
Aug 13 00:33:25.495731 containerd[1576]: time="2025-08-13T00:33:25.495680528Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:25.496715 containerd[1576]: time="2025-08-13T00:33:25.496596856Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364"
Aug 13 00:33:25.497382 containerd[1576]: time="2025-08-13T00:33:25.497342314Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:25.499046 containerd[1576]: time="2025-08-13T00:33:25.499020110Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:25.499462 containerd[1576]: time="2025-08-13T00:33:25.499433185Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.143402469s"
Aug 13 00:33:25.499509 containerd[1576]: time="2025-08-13T00:33:25.499464925Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\""
Aug 13 00:33:25.501547 containerd[1576]: time="2025-08-13T00:33:25.501372640Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\""
Aug 13 00:33:25.516453 containerd[1576]: time="2025-08-13T00:33:25.516320128Z" level=info msg="CreateContainer within sandbox \"51741654e9d64418483264591c2a35718c33dfa24291b21699e1140b48ec0e03\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Aug 13 00:33:25.544089 containerd[1576]: time="2025-08-13T00:33:25.542228175Z" level=info msg="Container 3b8741c019c1916a298cdf3a37fb25aeaf43710df8e2b7ba7b6270e59818c5f9: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:25.573095 containerd[1576]: time="2025-08-13T00:33:25.573047882Z" level=info msg="CreateContainer within sandbox \"51741654e9d64418483264591c2a35718c33dfa24291b21699e1140b48ec0e03\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3b8741c019c1916a298cdf3a37fb25aeaf43710df8e2b7ba7b6270e59818c5f9\""
Aug 13 00:33:25.573527 containerd[1576]: time="2025-08-13T00:33:25.573495271Z" level=info msg="StartContainer for \"3b8741c019c1916a298cdf3a37fb25aeaf43710df8e2b7ba7b6270e59818c5f9\""
Aug 13 00:33:25.574302 containerd[1576]: time="2025-08-13T00:33:25.574262770Z" level=info msg="connecting to shim 3b8741c019c1916a298cdf3a37fb25aeaf43710df8e2b7ba7b6270e59818c5f9" address="unix:///run/containerd/s/1c05f798db8ca7be9e40b6323de1f30d7c3e99f099411dd92976139a7519457a" protocol=ttrpc version=3
Aug 13 00:33:25.590878 kubelet[2741]: E0813 00:33:25.590585 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r465b" podUID="3e889cea-019a-4296-b7d3-2bcaff0d62fa"
Aug 13 00:33:25.595567 systemd[1]: Started cri-containerd-3b8741c019c1916a298cdf3a37fb25aeaf43710df8e2b7ba7b6270e59818c5f9.scope - libcontainer container 3b8741c019c1916a298cdf3a37fb25aeaf43710df8e2b7ba7b6270e59818c5f9.
Aug 13 00:33:25.653323 containerd[1576]: time="2025-08-13T00:33:25.653230535Z" level=info msg="StartContainer for \"3b8741c019c1916a298cdf3a37fb25aeaf43710df8e2b7ba7b6270e59818c5f9\" returns successfully"
Aug 13 00:33:25.704372 kubelet[2741]: I0813 00:33:25.704084 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-579cf4b897-xgt2b" podStartSLOduration=1.558564225 podStartE2EDuration="3.703899258s" podCreationTimestamp="2025-08-13 00:33:22 +0000 UTC" firstStartedPulling="2025-08-13 00:33:23.355150445 +0000 UTC m=+18.871735412" lastFinishedPulling="2025-08-13 00:33:25.500485478 +0000 UTC m=+21.017070445" observedRunningTime="2025-08-13 00:33:25.702973191 +0000 UTC m=+21.219558159" watchObservedRunningTime="2025-08-13 00:33:25.703899258 +0000 UTC m=+21.220484225"
Aug 13 00:33:25.725920 kubelet[2741]: E0813 00:33:25.725840 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:25.725920 kubelet[2741]: W0813 00:33:25.725861 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:25.725920 kubelet[2741]: E0813 00:33:25.725882 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:33:25.752373 kubelet[2741]: E0813 00:33:25.750985 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:33:25.752464 kubelet[2741]: W0813 00:33:25.752451 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:33:25.752683 kubelet[2741]: E0813 00:33:25.752516 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Aug 13 00:33:25.752861 kubelet[2741]: E0813 00:33:25.752853 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:25.752914 kubelet[2741]: W0813 00:33:25.752905 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:25.752966 kubelet[2741]: E0813 00:33:25.752958 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:25.753116 kubelet[2741]: E0813 00:33:25.753108 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:25.753171 kubelet[2741]: W0813 00:33:25.753163 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:25.753216 kubelet[2741]: E0813 00:33:25.753207 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.689978 kubelet[2741]: I0813 00:33:26.689927 2741 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:33:26.739430 kubelet[2741]: E0813 00:33:26.739375 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.739430 kubelet[2741]: W0813 00:33:26.739399 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.739430 kubelet[2741]: E0813 00:33:26.739430 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.739732 kubelet[2741]: E0813 00:33:26.739579 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.739732 kubelet[2741]: W0813 00:33:26.739588 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.739732 kubelet[2741]: E0813 00:33:26.739596 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.739732 kubelet[2741]: E0813 00:33:26.739714 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.739732 kubelet[2741]: W0813 00:33:26.739721 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.739732 kubelet[2741]: E0813 00:33:26.739729 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.739943 kubelet[2741]: E0813 00:33:26.739848 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.739943 kubelet[2741]: W0813 00:33:26.739855 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.739943 kubelet[2741]: E0813 00:33:26.739863 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.740046 kubelet[2741]: E0813 00:33:26.739979 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.740046 kubelet[2741]: W0813 00:33:26.739987 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.740046 kubelet[2741]: E0813 00:33:26.739994 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.740153 kubelet[2741]: E0813 00:33:26.740108 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.740153 kubelet[2741]: W0813 00:33:26.740115 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.740153 kubelet[2741]: E0813 00:33:26.740123 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.740269 kubelet[2741]: E0813 00:33:26.740231 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.740269 kubelet[2741]: W0813 00:33:26.740239 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.740269 kubelet[2741]: E0813 00:33:26.740246 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.740552 kubelet[2741]: E0813 00:33:26.740375 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.740552 kubelet[2741]: W0813 00:33:26.740383 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.740552 kubelet[2741]: E0813 00:33:26.740391 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.740552 kubelet[2741]: E0813 00:33:26.740528 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.740552 kubelet[2741]: W0813 00:33:26.740535 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.740552 kubelet[2741]: E0813 00:33:26.740542 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.740774 kubelet[2741]: E0813 00:33:26.740643 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.740774 kubelet[2741]: W0813 00:33:26.740650 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.740774 kubelet[2741]: E0813 00:33:26.740657 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.740774 kubelet[2741]: E0813 00:33:26.740761 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.740774 kubelet[2741]: W0813 00:33:26.740767 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.740774 kubelet[2741]: E0813 00:33:26.740774 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.742391 kubelet[2741]: E0813 00:33:26.740923 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.742391 kubelet[2741]: W0813 00:33:26.740931 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.742391 kubelet[2741]: E0813 00:33:26.740938 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.742391 kubelet[2741]: E0813 00:33:26.741057 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.742391 kubelet[2741]: W0813 00:33:26.741064 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.742391 kubelet[2741]: E0813 00:33:26.741086 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.742391 kubelet[2741]: E0813 00:33:26.741288 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.742391 kubelet[2741]: W0813 00:33:26.741300 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.742391 kubelet[2741]: E0813 00:33:26.741310 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.742391 kubelet[2741]: E0813 00:33:26.741704 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.742813 kubelet[2741]: W0813 00:33:26.741714 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.742813 kubelet[2741]: E0813 00:33:26.741723 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.750286 kubelet[2741]: E0813 00:33:26.750215 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.750286 kubelet[2741]: W0813 00:33:26.750281 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.750566 kubelet[2741]: E0813 00:33:26.750345 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.750937 kubelet[2741]: E0813 00:33:26.750906 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.750937 kubelet[2741]: W0813 00:33:26.750932 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.751072 kubelet[2741]: E0813 00:33:26.750970 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.751342 kubelet[2741]: E0813 00:33:26.751304 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.751407 kubelet[2741]: W0813 00:33:26.751341 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.751479 kubelet[2741]: E0813 00:33:26.751447 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.751889 kubelet[2741]: E0813 00:33:26.751763 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.751889 kubelet[2741]: W0813 00:33:26.751779 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.751889 kubelet[2741]: E0813 00:33:26.751805 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.752070 kubelet[2741]: E0813 00:33:26.752055 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.752177 kubelet[2741]: W0813 00:33:26.752133 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.752177 kubelet[2741]: E0813 00:33:26.752166 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.752608 kubelet[2741]: E0813 00:33:26.752392 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.752608 kubelet[2741]: W0813 00:33:26.752402 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.752608 kubelet[2741]: E0813 00:33:26.752470 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.752608 kubelet[2741]: E0813 00:33:26.752557 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.752608 kubelet[2741]: W0813 00:33:26.752566 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.752800 kubelet[2741]: E0813 00:33:26.752677 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.752800 kubelet[2741]: W0813 00:33:26.752684 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.752875 kubelet[2741]: E0813 00:33:26.752816 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.752875 kubelet[2741]: W0813 00:33:26.752824 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.752875 kubelet[2741]: E0813 00:33:26.752833 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.753113 kubelet[2741]: E0813 00:33:26.753005 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.753113 kubelet[2741]: E0813 00:33:26.753076 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.753602 kubelet[2741]: E0813 00:33:26.753145 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.753602 kubelet[2741]: W0813 00:33:26.753155 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.753602 kubelet[2741]: E0813 00:33:26.753174 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.753602 kubelet[2741]: E0813 00:33:26.753321 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.753602 kubelet[2741]: W0813 00:33:26.753329 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.753602 kubelet[2741]: E0813 00:33:26.753363 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.753602 kubelet[2741]: E0813 00:33:26.753538 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.753602 kubelet[2741]: W0813 00:33:26.753551 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.753602 kubelet[2741]: E0813 00:33:26.753573 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.753989 kubelet[2741]: E0813 00:33:26.753712 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.753989 kubelet[2741]: W0813 00:33:26.753720 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.753989 kubelet[2741]: E0813 00:33:26.753728 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.754532 kubelet[2741]: E0813 00:33:26.754485 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.754532 kubelet[2741]: W0813 00:33:26.754516 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.754628 kubelet[2741]: E0813 00:33:26.754547 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.754800 kubelet[2741]: E0813 00:33:26.754772 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.754800 kubelet[2741]: W0813 00:33:26.754793 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.754867 kubelet[2741]: E0813 00:33:26.754818 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.755135 kubelet[2741]: E0813 00:33:26.755107 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.755135 kubelet[2741]: W0813 00:33:26.755129 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.755210 kubelet[2741]: E0813 00:33:26.755154 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:26.755894 kubelet[2741]: E0813 00:33:26.755851 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.755894 kubelet[2741]: W0813 00:33:26.755874 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.756091 kubelet[2741]: E0813 00:33:26.756060 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:33:26.756191 kubelet[2741]: E0813 00:33:26.756170 2741 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:33:26.756191 kubelet[2741]: W0813 00:33:26.756187 2741 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:33:26.756258 kubelet[2741]: E0813 00:33:26.756201 2741 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:33:27.222120 containerd[1576]: time="2025-08-13T00:33:27.222072420Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:27.222978 containerd[1576]: time="2025-08-13T00:33:27.222947479Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 13 00:33:27.223873 containerd[1576]: time="2025-08-13T00:33:27.223831958Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:27.225280 containerd[1576]: time="2025-08-13T00:33:27.225240869Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:27.225696 containerd[1576]: time="2025-08-13T00:33:27.225675194Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.724273989s" Aug 13 00:33:27.225774 containerd[1576]: time="2025-08-13T00:33:27.225760985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 13 00:33:27.228282 containerd[1576]: time="2025-08-13T00:33:27.228252697Z" level=info msg="CreateContainer within sandbox \"f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 13 00:33:27.261917 containerd[1576]: time="2025-08-13T00:33:27.261752760Z" level=info msg="Container c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:27.269818 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2146559190.mount: Deactivated successfully. Aug 13 00:33:27.289922 containerd[1576]: time="2025-08-13T00:33:27.289868927Z" level=info msg="CreateContainer within sandbox \"f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b\"" Aug 13 00:33:27.290485 containerd[1576]: time="2025-08-13T00:33:27.290459906Z" level=info msg="StartContainer for \"c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b\"" Aug 13 00:33:27.296573 containerd[1576]: time="2025-08-13T00:33:27.296542581Z" level=info msg="connecting to shim c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b" address="unix:///run/containerd/s/5bb22841eafecc86f5bd4e211cd35e1a349f68b33e815bc3efaca3918574118e" protocol=ttrpc version=3 Aug 13 00:33:27.318539 systemd[1]: Started cri-containerd-c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b.scope - libcontainer container c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b. Aug 13 00:33:27.357217 containerd[1576]: time="2025-08-13T00:33:27.357063619Z" level=info msg="StartContainer for \"c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b\" returns successfully" Aug 13 00:33:27.363220 systemd[1]: cri-containerd-c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b.scope: Deactivated successfully. 
Aug 13 00:33:27.379695 containerd[1576]: time="2025-08-13T00:33:27.379654467Z" level=info msg="received exit event container_id:\"c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b\" id:\"c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b\" pid:3454 exited_at:{seconds:1755045207 nanos:366517845}"
Aug 13 00:33:27.380339 containerd[1576]: time="2025-08-13T00:33:27.380302251Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b\" id:\"c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b\" pid:3454 exited_at:{seconds:1755045207 nanos:366517845}"
Aug 13 00:33:27.398966 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c973846d07e2a44463e15d6ff90cb37d2d999d827ba93a38bfa9fbae3c8e291b-rootfs.mount: Deactivated successfully.
Aug 13 00:33:27.590319 kubelet[2741]: E0813 00:33:27.590146 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r465b" podUID="3e889cea-019a-4296-b7d3-2bcaff0d62fa"
Aug 13 00:33:27.696092 containerd[1576]: time="2025-08-13T00:33:27.696035284Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\""
Aug 13 00:33:29.590055 kubelet[2741]: E0813 00:33:29.589998 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-r465b" podUID="3e889cea-019a-4296-b7d3-2bcaff0d62fa"
Aug 13 00:33:31.036483 containerd[1576]: time="2025-08-13T00:33:31.036434561Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:31.037920 containerd[1576]: time="2025-08-13T00:33:31.037581021Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221"
Aug 13 00:33:31.038492 containerd[1576]: time="2025-08-13T00:33:31.038417209Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:31.040524 containerd[1576]: time="2025-08-13T00:33:31.040311521Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:31.041085 containerd[1576]: time="2025-08-13T00:33:31.040811799Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 3.344250589s"
Aug 13 00:33:31.041085 containerd[1576]: time="2025-08-13T00:33:31.040835583Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\""
Aug 13 00:33:31.044040 containerd[1576]: time="2025-08-13T00:33:31.044005097Z" level=info msg="CreateContainer within sandbox \"f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Aug 13 00:33:31.052984 containerd[1576]: time="2025-08-13T00:33:31.052965307Z" level=info msg="Container e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:31.069261 containerd[1576]: time="2025-08-13T00:33:31.069211549Z" level=info msg="CreateContainer within sandbox
\"f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692\"" Aug 13 00:33:31.069981 containerd[1576]: time="2025-08-13T00:33:31.069778473Z" level=info msg="StartContainer for \"e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692\"" Aug 13 00:33:31.071576 containerd[1576]: time="2025-08-13T00:33:31.071520810Z" level=info msg="connecting to shim e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692" address="unix:///run/containerd/s/5bb22841eafecc86f5bd4e211cd35e1a349f68b33e815bc3efaca3918574118e" protocol=ttrpc version=3 Aug 13 00:33:31.098124 systemd[1]: Started cri-containerd-e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692.scope - libcontainer container e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692. Aug 13 00:33:31.133794 containerd[1576]: time="2025-08-13T00:33:31.133764489Z" level=info msg="StartContainer for \"e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692\" returns successfully" Aug 13 00:33:31.508538 systemd[1]: cri-containerd-e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692.scope: Deactivated successfully. Aug 13 00:33:31.508767 systemd[1]: cri-containerd-e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692.scope: Consumed 351ms CPU time, 170.6M memory peak, 8.5M read from disk, 171.2M written to disk. 
Aug 13 00:33:31.528859 containerd[1576]: time="2025-08-13T00:33:31.528806854Z" level=info msg="received exit event container_id:\"e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692\" id:\"e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692\" pid:3514 exited_at:{seconds:1755045211 nanos:511645017}" Aug 13 00:33:31.533130 containerd[1576]: time="2025-08-13T00:33:31.532644611Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692\" id:\"e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692\" pid:3514 exited_at:{seconds:1755045211 nanos:511645017}" Aug 13 00:33:31.553474 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e60f639b4477b176166f83df3885af2d9500a1cd31b46efeaedca7199b89c692-rootfs.mount: Deactivated successfully. Aug 13 00:33:31.585070 kubelet[2741]: I0813 00:33:31.585004 2741 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Aug 13 00:33:31.599471 systemd[1]: Created slice kubepods-besteffort-pod3e889cea_019a_4296_b7d3_2bcaff0d62fa.slice - libcontainer container kubepods-besteffort-pod3e889cea_019a_4296_b7d3_2bcaff0d62fa.slice. Aug 13 00:33:31.603342 containerd[1576]: time="2025-08-13T00:33:31.602998957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-r465b,Uid:3e889cea-019a-4296-b7d3-2bcaff0d62fa,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:31.665759 systemd[1]: Created slice kubepods-besteffort-pode88815b6_96db_45a3_8446_237a9c6bee1f.slice - libcontainer container kubepods-besteffort-pode88815b6_96db_45a3_8446_237a9c6bee1f.slice. Aug 13 00:33:31.673472 systemd[1]: Created slice kubepods-burstable-pod54f0db10_9dd8_41e1_a015_ed494fda532f.slice - libcontainer container kubepods-burstable-pod54f0db10_9dd8_41e1_a015_ed494fda532f.slice. 
Aug 13 00:33:31.684172 systemd[1]: Created slice kubepods-burstable-podd0d66f86_eb5e_4570_9700_e2acb101d000.slice - libcontainer container kubepods-burstable-podd0d66f86_eb5e_4570_9700_e2acb101d000.slice. Aug 13 00:33:31.686457 kubelet[2741]: I0813 00:33:31.686412 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kh9s\" (UniqueName: \"kubernetes.io/projected/54f0db10-9dd8-41e1-a015-ed494fda532f-kube-api-access-5kh9s\") pod \"coredns-7c65d6cfc9-xjs9d\" (UID: \"54f0db10-9dd8-41e1-a015-ed494fda532f\") " pod="kube-system/coredns-7c65d6cfc9-xjs9d" Aug 13 00:33:31.686457 kubelet[2741]: I0813 00:33:31.686456 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7d00ab8-fa2b-489b-811e-3590e0a3f2d9-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-wrddn\" (UID: \"f7d00ab8-fa2b-489b-811e-3590e0a3f2d9\") " pod="calico-system/goldmane-58fd7646b9-wrddn" Aug 13 00:33:31.687176 kubelet[2741]: I0813 00:33:31.686474 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm495\" (UniqueName: \"kubernetes.io/projected/e88815b6-96db-45a3-8446-237a9c6bee1f-kube-api-access-bm495\") pod \"calico-apiserver-5c7f84bf9d-rgxvb\" (UID: \"e88815b6-96db-45a3-8446-237a9c6bee1f\") " pod="calico-apiserver/calico-apiserver-5c7f84bf9d-rgxvb" Aug 13 00:33:31.687176 kubelet[2741]: I0813 00:33:31.686486 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d8cp\" (UniqueName: \"kubernetes.io/projected/ec8ee7ae-5bd1-4c63-8236-83729cf25c5d-kube-api-access-7d8cp\") pod \"calico-kube-controllers-74d8bfcddc-n9p85\" (UID: \"ec8ee7ae-5bd1-4c63-8236-83729cf25c5d\") " pod="calico-system/calico-kube-controllers-74d8bfcddc-n9p85" Aug 13 00:33:31.687176 kubelet[2741]: I0813 00:33:31.686499 2741 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-backend-key-pair\") pod \"whisker-8b96746ff-84bk8\" (UID: \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\") " pod="calico-system/whisker-8b96746ff-84bk8" Aug 13 00:33:31.687176 kubelet[2741]: I0813 00:33:31.686513 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdqs2\" (UniqueName: \"kubernetes.io/projected/0cf431c3-630f-4c35-a27a-549702718992-kube-api-access-tdqs2\") pod \"calico-apiserver-5c7f84bf9d-8qmvb\" (UID: \"0cf431c3-630f-4c35-a27a-549702718992\") " pod="calico-apiserver/calico-apiserver-5c7f84bf9d-8qmvb" Aug 13 00:33:31.687176 kubelet[2741]: I0813 00:33:31.686525 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f7d00ab8-fa2b-489b-811e-3590e0a3f2d9-goldmane-key-pair\") pod \"goldmane-58fd7646b9-wrddn\" (UID: \"f7d00ab8-fa2b-489b-811e-3590e0a3f2d9\") " pod="calico-system/goldmane-58fd7646b9-wrddn" Aug 13 00:33:31.687375 kubelet[2741]: I0813 00:33:31.686538 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqct7\" (UniqueName: \"kubernetes.io/projected/f7d00ab8-fa2b-489b-811e-3590e0a3f2d9-kube-api-access-lqct7\") pod \"goldmane-58fd7646b9-wrddn\" (UID: \"f7d00ab8-fa2b-489b-811e-3590e0a3f2d9\") " pod="calico-system/goldmane-58fd7646b9-wrddn" Aug 13 00:33:31.687375 kubelet[2741]: I0813 00:33:31.686551 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0cf431c3-630f-4c35-a27a-549702718992-calico-apiserver-certs\") pod \"calico-apiserver-5c7f84bf9d-8qmvb\" (UID: \"0cf431c3-630f-4c35-a27a-549702718992\") " 
pod="calico-apiserver/calico-apiserver-5c7f84bf9d-8qmvb" Aug 13 00:33:31.687375 kubelet[2741]: I0813 00:33:31.686564 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d0d66f86-eb5e-4570-9700-e2acb101d000-config-volume\") pod \"coredns-7c65d6cfc9-rf9dq\" (UID: \"d0d66f86-eb5e-4570-9700-e2acb101d000\") " pod="kube-system/coredns-7c65d6cfc9-rf9dq" Aug 13 00:33:31.687375 kubelet[2741]: I0813 00:33:31.686575 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8rxj\" (UniqueName: \"kubernetes.io/projected/d0d66f86-eb5e-4570-9700-e2acb101d000-kube-api-access-t8rxj\") pod \"coredns-7c65d6cfc9-rf9dq\" (UID: \"d0d66f86-eb5e-4570-9700-e2acb101d000\") " pod="kube-system/coredns-7c65d6cfc9-rf9dq" Aug 13 00:33:31.687375 kubelet[2741]: I0813 00:33:31.686589 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54f0db10-9dd8-41e1-a015-ed494fda532f-config-volume\") pod \"coredns-7c65d6cfc9-xjs9d\" (UID: \"54f0db10-9dd8-41e1-a015-ed494fda532f\") " pod="kube-system/coredns-7c65d6cfc9-xjs9d" Aug 13 00:33:31.687555 kubelet[2741]: I0813 00:33:31.686605 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7d00ab8-fa2b-489b-811e-3590e0a3f2d9-config\") pod \"goldmane-58fd7646b9-wrddn\" (UID: \"f7d00ab8-fa2b-489b-811e-3590e0a3f2d9\") " pod="calico-system/goldmane-58fd7646b9-wrddn" Aug 13 00:33:31.687555 kubelet[2741]: I0813 00:33:31.686616 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e88815b6-96db-45a3-8446-237a9c6bee1f-calico-apiserver-certs\") pod \"calico-apiserver-5c7f84bf9d-rgxvb\" (UID: 
\"e88815b6-96db-45a3-8446-237a9c6bee1f\") " pod="calico-apiserver/calico-apiserver-5c7f84bf9d-rgxvb" Aug 13 00:33:31.687555 kubelet[2741]: I0813 00:33:31.686628 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec8ee7ae-5bd1-4c63-8236-83729cf25c5d-tigera-ca-bundle\") pod \"calico-kube-controllers-74d8bfcddc-n9p85\" (UID: \"ec8ee7ae-5bd1-4c63-8236-83729cf25c5d\") " pod="calico-system/calico-kube-controllers-74d8bfcddc-n9p85" Aug 13 00:33:31.687555 kubelet[2741]: I0813 00:33:31.686642 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-ca-bundle\") pod \"whisker-8b96746ff-84bk8\" (UID: \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\") " pod="calico-system/whisker-8b96746ff-84bk8" Aug 13 00:33:31.687555 kubelet[2741]: I0813 00:33:31.686656 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r7ll\" (UniqueName: \"kubernetes.io/projected/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-kube-api-access-2r7ll\") pod \"whisker-8b96746ff-84bk8\" (UID: \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\") " pod="calico-system/whisker-8b96746ff-84bk8" Aug 13 00:33:31.693262 systemd[1]: Created slice kubepods-besteffort-podf7d00ab8_fa2b_489b_811e_3590e0a3f2d9.slice - libcontainer container kubepods-besteffort-podf7d00ab8_fa2b_489b_811e_3590e0a3f2d9.slice. Aug 13 00:33:31.702476 systemd[1]: Created slice kubepods-besteffort-podfa7b7a0c_b718_4ff0_b03f_b5bbd4b6a0c4.slice - libcontainer container kubepods-besteffort-podfa7b7a0c_b718_4ff0_b03f_b5bbd4b6a0c4.slice. Aug 13 00:33:31.708668 systemd[1]: Created slice kubepods-besteffort-podec8ee7ae_5bd1_4c63_8236_83729cf25c5d.slice - libcontainer container kubepods-besteffort-podec8ee7ae_5bd1_4c63_8236_83729cf25c5d.slice. 
Aug 13 00:33:31.713971 systemd[1]: Created slice kubepods-besteffort-pod0cf431c3_630f_4c35_a27a_549702718992.slice - libcontainer container kubepods-besteffort-pod0cf431c3_630f_4c35_a27a_549702718992.slice. Aug 13 00:33:31.734067 containerd[1576]: time="2025-08-13T00:33:31.733995680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 13 00:33:31.822683 containerd[1576]: time="2025-08-13T00:33:31.822594587Z" level=error msg="Failed to destroy network for sandbox \"c2939a10d92e8cde9f2094e40c2173769cdbad67c5c81a233ce4fd665d9ac54c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:31.824540 containerd[1576]: time="2025-08-13T00:33:31.824390304Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-r465b,Uid:3e889cea-019a-4296-b7d3-2bcaff0d62fa,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2939a10d92e8cde9f2094e40c2173769cdbad67c5c81a233ce4fd665d9ac54c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:31.825945 kubelet[2741]: E0813 00:33:31.825889 2741 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2939a10d92e8cde9f2094e40c2173769cdbad67c5c81a233ce4fd665d9ac54c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:31.826309 kubelet[2741]: E0813 00:33:31.826292 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c2939a10d92e8cde9f2094e40c2173769cdbad67c5c81a233ce4fd665d9ac54c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-r465b" Aug 13 00:33:31.826553 kubelet[2741]: E0813 00:33:31.826442 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2939a10d92e8cde9f2094e40c2173769cdbad67c5c81a233ce4fd665d9ac54c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-r465b" Aug 13 00:33:31.827068 kubelet[2741]: E0813 00:33:31.826715 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-r465b_calico-system(3e889cea-019a-4296-b7d3-2bcaff0d62fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-r465b_calico-system(3e889cea-019a-4296-b7d3-2bcaff0d62fa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2939a10d92e8cde9f2094e40c2173769cdbad67c5c81a233ce4fd665d9ac54c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-r465b" podUID="3e889cea-019a-4296-b7d3-2bcaff0d62fa" Aug 13 00:33:31.970052 containerd[1576]: time="2025-08-13T00:33:31.969988955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-rgxvb,Uid:e88815b6-96db-45a3-8446-237a9c6bee1f,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:33:31.984252 containerd[1576]: time="2025-08-13T00:33:31.984206363Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7c65d6cfc9-xjs9d,Uid:54f0db10-9dd8-41e1-a015-ed494fda532f,Namespace:kube-system,Attempt:0,}" Aug 13 00:33:31.989998 containerd[1576]: time="2025-08-13T00:33:31.989815811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rf9dq,Uid:d0d66f86-eb5e-4570-9700-e2acb101d000,Namespace:kube-system,Attempt:0,}" Aug 13 00:33:31.998748 containerd[1576]: time="2025-08-13T00:33:31.998722411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-wrddn,Uid:f7d00ab8-fa2b-489b-811e-3590e0a3f2d9,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:32.006562 containerd[1576]: time="2025-08-13T00:33:32.006534149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8b96746ff-84bk8,Uid:fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:32.014249 containerd[1576]: time="2025-08-13T00:33:32.013574630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74d8bfcddc-n9p85,Uid:ec8ee7ae-5bd1-4c63-8236-83729cf25c5d,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:32.028666 containerd[1576]: time="2025-08-13T00:33:32.028632894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-8qmvb,Uid:0cf431c3-630f-4c35-a27a-549702718992,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:33:32.078600 systemd[1]: run-netns-cni\x2dd5d01f1e\x2d5326\x2d4c0a\x2dcbf2\x2d044c499341cb.mount: Deactivated successfully. 
Aug 13 00:33:32.089365 containerd[1576]: time="2025-08-13T00:33:32.089196974Z" level=error msg="Failed to destroy network for sandbox \"24fa7f868763b931d030fb2e9d3d05fe69b1c75d40da6f8bbd7473205572e21c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.090874 systemd[1]: run-netns-cni\x2d226db6db\x2d5559\x2d4e67\x2d7c63\x2d6691e3d5c896.mount: Deactivated successfully. Aug 13 00:33:32.095170 containerd[1576]: time="2025-08-13T00:33:32.095137854Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-rgxvb,Uid:e88815b6-96db-45a3-8446-237a9c6bee1f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"24fa7f868763b931d030fb2e9d3d05fe69b1c75d40da6f8bbd7473205572e21c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.095663 kubelet[2741]: E0813 00:33:32.095596 2741 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24fa7f868763b931d030fb2e9d3d05fe69b1c75d40da6f8bbd7473205572e21c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.095919 kubelet[2741]: E0813 00:33:32.095832 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24fa7f868763b931d030fb2e9d3d05fe69b1c75d40da6f8bbd7473205572e21c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c7f84bf9d-rgxvb" Aug 13 00:33:32.095919 kubelet[2741]: E0813 00:33:32.095855 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"24fa7f868763b931d030fb2e9d3d05fe69b1c75d40da6f8bbd7473205572e21c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c7f84bf9d-rgxvb" Aug 13 00:33:32.096153 kubelet[2741]: E0813 00:33:32.095894 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c7f84bf9d-rgxvb_calico-apiserver(e88815b6-96db-45a3-8446-237a9c6bee1f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c7f84bf9d-rgxvb_calico-apiserver(e88815b6-96db-45a3-8446-237a9c6bee1f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"24fa7f868763b931d030fb2e9d3d05fe69b1c75d40da6f8bbd7473205572e21c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c7f84bf9d-rgxvb" podUID="e88815b6-96db-45a3-8446-237a9c6bee1f" Aug 13 00:33:32.149920 containerd[1576]: time="2025-08-13T00:33:32.149827679Z" level=error msg="Failed to destroy network for sandbox \"163c8a3007853479ecb6d7227dc0eae56c97288eb7a29fd8adda518fb8a0cab0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.153336 systemd[1]: run-netns-cni\x2d8c7f71dc\x2df538\x2d7a93\x2d2231\x2dcafb0f762ec0.mount: Deactivated successfully. 
Aug 13 00:33:32.154305 containerd[1576]: time="2025-08-13T00:33:32.154046529Z" level=error msg="Failed to destroy network for sandbox \"2b445e344d44d9252e3bf58ac8d88fcd5c3cd977b864747d7e84503326172dd1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.158244 systemd[1]: run-netns-cni\x2d77a44fcf\x2d5746\x2df7c5\x2daf1f\x2d78c7e0499dcf.mount: Deactivated successfully. Aug 13 00:33:32.159663 containerd[1576]: time="2025-08-13T00:33:32.159633045Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-wrddn,Uid:f7d00ab8-fa2b-489b-811e-3590e0a3f2d9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b445e344d44d9252e3bf58ac8d88fcd5c3cd977b864747d7e84503326172dd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.160380 kubelet[2741]: E0813 00:33:32.160071 2741 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b445e344d44d9252e3bf58ac8d88fcd5c3cd977b864747d7e84503326172dd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.160380 kubelet[2741]: E0813 00:33:32.160125 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b445e344d44d9252e3bf58ac8d88fcd5c3cd977b864747d7e84503326172dd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-58fd7646b9-wrddn" Aug 13 00:33:32.160380 kubelet[2741]: E0813 00:33:32.160144 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b445e344d44d9252e3bf58ac8d88fcd5c3cd977b864747d7e84503326172dd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-wrddn" Aug 13 00:33:32.160508 kubelet[2741]: E0813 00:33:32.160196 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-wrddn_calico-system(f7d00ab8-fa2b-489b-811e-3590e0a3f2d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-wrddn_calico-system(f7d00ab8-fa2b-489b-811e-3590e0a3f2d9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b445e344d44d9252e3bf58ac8d88fcd5c3cd977b864747d7e84503326172dd1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-wrddn" podUID="f7d00ab8-fa2b-489b-811e-3590e0a3f2d9" Aug 13 00:33:32.163610 containerd[1576]: time="2025-08-13T00:33:32.163525743Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xjs9d,Uid:54f0db10-9dd8-41e1-a015-ed494fda532f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"163c8a3007853479ecb6d7227dc0eae56c97288eb7a29fd8adda518fb8a0cab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.163827 kubelet[2741]: E0813 00:33:32.163788 2741 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"163c8a3007853479ecb6d7227dc0eae56c97288eb7a29fd8adda518fb8a0cab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.164033 kubelet[2741]: E0813 00:33:32.164019 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"163c8a3007853479ecb6d7227dc0eae56c97288eb7a29fd8adda518fb8a0cab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xjs9d" Aug 13 00:33:32.164178 kubelet[2741]: E0813 00:33:32.164113 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"163c8a3007853479ecb6d7227dc0eae56c97288eb7a29fd8adda518fb8a0cab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xjs9d" Aug 13 00:33:32.164302 kubelet[2741]: E0813 00:33:32.164244 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-xjs9d_kube-system(54f0db10-9dd8-41e1-a015-ed494fda532f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-xjs9d_kube-system(54f0db10-9dd8-41e1-a015-ed494fda532f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"163c8a3007853479ecb6d7227dc0eae56c97288eb7a29fd8adda518fb8a0cab0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-xjs9d" podUID="54f0db10-9dd8-41e1-a015-ed494fda532f" Aug 13 00:33:32.176521 containerd[1576]: time="2025-08-13T00:33:32.176491295Z" level=error msg="Failed to destroy network for sandbox \"f638c6cb58e559449f502626a92d25d85220779028d59d999e28fe1f740d507b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.177933 containerd[1576]: time="2025-08-13T00:33:32.177781303Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rf9dq,Uid:d0d66f86-eb5e-4570-9700-e2acb101d000,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f638c6cb58e559449f502626a92d25d85220779028d59d999e28fe1f740d507b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.178257 systemd[1]: run-netns-cni\x2d47c5d7b3\x2d00d5\x2d9f8c\x2de8c2\x2d0848b7e8800e.mount: Deactivated successfully. 
Aug 13 00:33:32.178369 kubelet[2741]: E0813 00:33:32.178214 2741 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f638c6cb58e559449f502626a92d25d85220779028d59d999e28fe1f740d507b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.178418 kubelet[2741]: E0813 00:33:32.178391 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f638c6cb58e559449f502626a92d25d85220779028d59d999e28fe1f740d507b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-rf9dq" Aug 13 00:33:32.178418 kubelet[2741]: E0813 00:33:32.178408 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f638c6cb58e559449f502626a92d25d85220779028d59d999e28fe1f740d507b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-rf9dq" Aug 13 00:33:32.179824 kubelet[2741]: E0813 00:33:32.178564 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-rf9dq_kube-system(d0d66f86-eb5e-4570-9700-e2acb101d000)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-rf9dq_kube-system(d0d66f86-eb5e-4570-9700-e2acb101d000)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f638c6cb58e559449f502626a92d25d85220779028d59d999e28fe1f740d507b\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-rf9dq" podUID="d0d66f86-eb5e-4570-9700-e2acb101d000" Aug 13 00:33:32.186285 containerd[1576]: time="2025-08-13T00:33:32.186251355Z" level=error msg="Failed to destroy network for sandbox \"9e63f40ab81c9e33b10dfb10943549a55720b8067ca7cda8f3c4f02db1b71339\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.187882 containerd[1576]: time="2025-08-13T00:33:32.187854822Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8b96746ff-84bk8,Uid:fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e63f40ab81c9e33b10dfb10943549a55720b8067ca7cda8f3c4f02db1b71339\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.188022 kubelet[2741]: E0813 00:33:32.187994 2741 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e63f40ab81c9e33b10dfb10943549a55720b8067ca7cda8f3c4f02db1b71339\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.188074 kubelet[2741]: E0813 00:33:32.188034 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e63f40ab81c9e33b10dfb10943549a55720b8067ca7cda8f3c4f02db1b71339\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8b96746ff-84bk8" Aug 13 00:33:32.188074 kubelet[2741]: E0813 00:33:32.188047 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e63f40ab81c9e33b10dfb10943549a55720b8067ca7cda8f3c4f02db1b71339\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8b96746ff-84bk8" Aug 13 00:33:32.188222 kubelet[2741]: E0813 00:33:32.188178 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-8b96746ff-84bk8_calico-system(fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-8b96746ff-84bk8_calico-system(fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e63f40ab81c9e33b10dfb10943549a55720b8067ca7cda8f3c4f02db1b71339\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-8b96746ff-84bk8" podUID="fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4" Aug 13 00:33:32.206504 containerd[1576]: time="2025-08-13T00:33:32.206459556Z" level=error msg="Failed to destroy network for sandbox \"0b5a5f767827ee16a6df1027476baf4bf2c166b7da8d9444f4932eaf06d37154\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.207304 containerd[1576]: time="2025-08-13T00:33:32.207262121Z" level=error msg="Failed to destroy network for sandbox 
\"795f96978848a0dafffa481060cac4ec1c6d47b7665719e3ab7a136da7595cf0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.207966 containerd[1576]: time="2025-08-13T00:33:32.207934622Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74d8bfcddc-n9p85,Uid:ec8ee7ae-5bd1-4c63-8236-83729cf25c5d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b5a5f767827ee16a6df1027476baf4bf2c166b7da8d9444f4932eaf06d37154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.208324 kubelet[2741]: E0813 00:33:32.208279 2741 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b5a5f767827ee16a6df1027476baf4bf2c166b7da8d9444f4932eaf06d37154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.208496 kubelet[2741]: E0813 00:33:32.208340 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b5a5f767827ee16a6df1027476baf4bf2c166b7da8d9444f4932eaf06d37154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-74d8bfcddc-n9p85" Aug 13 00:33:32.208496 kubelet[2741]: E0813 00:33:32.208386 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"0b5a5f767827ee16a6df1027476baf4bf2c166b7da8d9444f4932eaf06d37154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-74d8bfcddc-n9p85" Aug 13 00:33:32.208496 kubelet[2741]: E0813 00:33:32.208447 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-74d8bfcddc-n9p85_calico-system(ec8ee7ae-5bd1-4c63-8236-83729cf25c5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-74d8bfcddc-n9p85_calico-system(ec8ee7ae-5bd1-4c63-8236-83729cf25c5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b5a5f767827ee16a6df1027476baf4bf2c166b7da8d9444f4932eaf06d37154\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-74d8bfcddc-n9p85" podUID="ec8ee7ae-5bd1-4c63-8236-83729cf25c5d" Aug 13 00:33:32.208833 containerd[1576]: time="2025-08-13T00:33:32.208810263Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-8qmvb,Uid:0cf431c3-630f-4c35-a27a-549702718992,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"795f96978848a0dafffa481060cac4ec1c6d47b7665719e3ab7a136da7595cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.209234 kubelet[2741]: E0813 00:33:32.209035 2741 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"795f96978848a0dafffa481060cac4ec1c6d47b7665719e3ab7a136da7595cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:33:32.209234 kubelet[2741]: E0813 00:33:32.209062 2741 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"795f96978848a0dafffa481060cac4ec1c6d47b7665719e3ab7a136da7595cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c7f84bf9d-8qmvb" Aug 13 00:33:32.209234 kubelet[2741]: E0813 00:33:32.209075 2741 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"795f96978848a0dafffa481060cac4ec1c6d47b7665719e3ab7a136da7595cf0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c7f84bf9d-8qmvb" Aug 13 00:33:32.209310 kubelet[2741]: E0813 00:33:32.209101 2741 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c7f84bf9d-8qmvb_calico-apiserver(0cf431c3-630f-4c35-a27a-549702718992)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c7f84bf9d-8qmvb_calico-apiserver(0cf431c3-630f-4c35-a27a-549702718992)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"795f96978848a0dafffa481060cac4ec1c6d47b7665719e3ab7a136da7595cf0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-5c7f84bf9d-8qmvb" podUID="0cf431c3-630f-4c35-a27a-549702718992" Aug 13 00:33:33.055656 systemd[1]: run-netns-cni\x2d8ba7b71c\x2d8f4e\x2d4d07\x2d4019\x2d307f9d04bd2f.mount: Deactivated successfully. Aug 13 00:33:33.055753 systemd[1]: run-netns-cni\x2d18fd1b50\x2d119e\x2dc266\x2d6ebc\x2d277652b76021.mount: Deactivated successfully. Aug 13 00:33:33.055806 systemd[1]: run-netns-cni\x2d69e67b59\x2dcaf0\x2d367b\x2d144e\x2d3ba4ccfc0cb1.mount: Deactivated successfully. Aug 13 00:33:35.596003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2157259125.mount: Deactivated successfully. Aug 13 00:33:35.618685 containerd[1576]: time="2025-08-13T00:33:35.618624425Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:35.620583 containerd[1576]: time="2025-08-13T00:33:35.620540698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Aug 13 00:33:35.621185 containerd[1576]: time="2025-08-13T00:33:35.621052559Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:35.622974 containerd[1576]: time="2025-08-13T00:33:35.622936831Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:35.623332 containerd[1576]: time="2025-08-13T00:33:35.623302257Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size 
\"158500025\" in 3.889232427s" Aug 13 00:33:35.623389 containerd[1576]: time="2025-08-13T00:33:35.623334587Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Aug 13 00:33:35.635366 containerd[1576]: time="2025-08-13T00:33:35.635320683Z" level=info msg="CreateContainer within sandbox \"f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 13 00:33:35.651168 containerd[1576]: time="2025-08-13T00:33:35.648883813Z" level=info msg="Container 1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:35.654408 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3055880185.mount: Deactivated successfully. Aug 13 00:33:35.663359 containerd[1576]: time="2025-08-13T00:33:35.663297771Z" level=info msg="CreateContainer within sandbox \"f1eb894c7f84d5745a6ecb8667e246634113c816fd9d0a9323003324cb9230eb\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\"" Aug 13 00:33:35.663934 containerd[1576]: time="2025-08-13T00:33:35.663855085Z" level=info msg="StartContainer for \"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\"" Aug 13 00:33:35.665740 containerd[1576]: time="2025-08-13T00:33:35.665712058Z" level=info msg="connecting to shim 1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7" address="unix:///run/containerd/s/5bb22841eafecc86f5bd4e211cd35e1a349f68b33e815bc3efaca3918574118e" protocol=ttrpc version=3 Aug 13 00:33:35.727576 systemd[1]: Started cri-containerd-1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7.scope - libcontainer container 1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7. 
Aug 13 00:33:35.773269 containerd[1576]: time="2025-08-13T00:33:35.773217275Z" level=info msg="StartContainer for \"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" returns successfully" Aug 13 00:33:35.847590 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 13 00:33:35.849895 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Aug 13 00:33:36.117261 kubelet[2741]: I0813 00:33:36.116960 2741 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-ca-bundle\") pod \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\" (UID: \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\") " Aug 13 00:33:36.117261 kubelet[2741]: I0813 00:33:36.117002 2741 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-backend-key-pair\") pod \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\" (UID: \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\") " Aug 13 00:33:36.117261 kubelet[2741]: I0813 00:33:36.117019 2741 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r7ll\" (UniqueName: \"kubernetes.io/projected/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-kube-api-access-2r7ll\") pod \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\" (UID: \"fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4\") " Aug 13 00:33:36.123368 kubelet[2741]: I0813 00:33:36.122579 2741 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4" (UID: "fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4"). InnerVolumeSpecName "whisker-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Aug 13 00:33:36.129952 kubelet[2741]: I0813 00:33:36.129930 2741 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4" (UID: "fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 13 00:33:36.130203 kubelet[2741]: I0813 00:33:36.130182 2741 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-kube-api-access-2r7ll" (OuterVolumeSpecName: "kube-api-access-2r7ll") pod "fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4" (UID: "fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4"). InnerVolumeSpecName "kube-api-access-2r7ll". PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 13 00:33:36.218306 kubelet[2741]: I0813 00:33:36.217952 2741 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-ca-bundle\") on node \"ci-4372-1-0-4-15a6623c0c\" DevicePath \"\"" Aug 13 00:33:36.218715 kubelet[2741]: I0813 00:33:36.218664 2741 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-whisker-backend-key-pair\") on node \"ci-4372-1-0-4-15a6623c0c\" DevicePath \"\"" Aug 13 00:33:36.218715 kubelet[2741]: I0813 00:33:36.218683 2741 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r7ll\" (UniqueName: \"kubernetes.io/projected/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4-kube-api-access-2r7ll\") on node \"ci-4372-1-0-4-15a6623c0c\" DevicePath \"\"" Aug 13 00:33:36.599073 systemd[1]: 
var-lib-kubelet-pods-fa7b7a0c\x2db718\x2d4ff0\x2db03f\x2db5bbd4b6a0c4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2r7ll.mount: Deactivated successfully. Aug 13 00:33:36.599319 systemd[1]: var-lib-kubelet-pods-fa7b7a0c\x2db718\x2d4ff0\x2db03f\x2db5bbd4b6a0c4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 13 00:33:36.602681 systemd[1]: Removed slice kubepods-besteffort-podfa7b7a0c_b718_4ff0_b03f_b5bbd4b6a0c4.slice - libcontainer container kubepods-besteffort-podfa7b7a0c_b718_4ff0_b03f_b5bbd4b6a0c4.slice. Aug 13 00:33:36.777379 kubelet[2741]: I0813 00:33:36.776669 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-4jt7z" podStartSLOduration=1.802748093 podStartE2EDuration="13.776652496s" podCreationTimestamp="2025-08-13 00:33:23 +0000 UTC" firstStartedPulling="2025-08-13 00:33:23.655281837 +0000 UTC m=+19.171866804" lastFinishedPulling="2025-08-13 00:33:35.62918624 +0000 UTC m=+31.145771207" observedRunningTime="2025-08-13 00:33:36.772735771 +0000 UTC m=+32.289320749" watchObservedRunningTime="2025-08-13 00:33:36.776652496 +0000 UTC m=+32.293237463" Aug 13 00:33:36.857482 kubelet[2741]: W0813 00:33:36.857094 2741 reflector.go:561] object-"calico-system"/"whisker-backend-key-pair": failed to list *v1.Secret: secrets "whisker-backend-key-pair" is forbidden: User "system:node:ci-4372-1-0-4-15a6623c0c" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4372-1-0-4-15a6623c0c' and this object Aug 13 00:33:36.857482 kubelet[2741]: E0813 00:33:36.857395 2741 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"whisker-backend-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ci-4372-1-0-4-15a6623c0c\" cannot list resource \"secrets\" in API group \"\" in the namespace 
\"calico-system\": no relationship found between node 'ci-4372-1-0-4-15a6623c0c' and this object" logger="UnhandledError" Aug 13 00:33:36.858235 kubelet[2741]: W0813 00:33:36.858068 2741 reflector.go:561] object-"calico-system"/"whisker-ca-bundle": failed to list *v1.ConfigMap: configmaps "whisker-ca-bundle" is forbidden: User "system:node:ci-4372-1-0-4-15a6623c0c" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4372-1-0-4-15a6623c0c' and this object Aug 13 00:33:36.858235 kubelet[2741]: E0813 00:33:36.858095 2741 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"whisker-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:ci-4372-1-0-4-15a6623c0c\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4372-1-0-4-15a6623c0c' and this object" logger="UnhandledError" Aug 13 00:33:36.860406 systemd[1]: Created slice kubepods-besteffort-podc91f4e19_1004_48ef_a9d8_5b715bc0c46d.slice - libcontainer container kubepods-besteffort-podc91f4e19_1004_48ef_a9d8_5b715bc0c46d.slice. 
Aug 13 00:33:36.921697 containerd[1576]: time="2025-08-13T00:33:36.921517945Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" id:\"ff2af2d7c5ceacc2f4e41872c47be6485672abc72716fbad8d7cf5f159018dc5\" pid:3855 exit_status:1 exited_at:{seconds:1755045216 nanos:921100623}" Aug 13 00:33:37.024753 kubelet[2741]: I0813 00:33:37.024663 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8psxz\" (UniqueName: \"kubernetes.io/projected/c91f4e19-1004-48ef-a9d8-5b715bc0c46d-kube-api-access-8psxz\") pod \"whisker-788c959bb5-scz89\" (UID: \"c91f4e19-1004-48ef-a9d8-5b715bc0c46d\") " pod="calico-system/whisker-788c959bb5-scz89" Aug 13 00:33:37.024951 kubelet[2741]: I0813 00:33:37.024777 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c91f4e19-1004-48ef-a9d8-5b715bc0c46d-whisker-backend-key-pair\") pod \"whisker-788c959bb5-scz89\" (UID: \"c91f4e19-1004-48ef-a9d8-5b715bc0c46d\") " pod="calico-system/whisker-788c959bb5-scz89" Aug 13 00:33:37.024951 kubelet[2741]: I0813 00:33:37.024849 2741 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c91f4e19-1004-48ef-a9d8-5b715bc0c46d-whisker-ca-bundle\") pod \"whisker-788c959bb5-scz89\" (UID: \"c91f4e19-1004-48ef-a9d8-5b715bc0c46d\") " pod="calico-system/whisker-788c959bb5-scz89" Aug 13 00:33:37.836283 containerd[1576]: time="2025-08-13T00:33:37.836231423Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" id:\"1e46f0bb7f7ea0fb63aea1925c89ac5b88d4ce4d5a7dd4a05d6396ef12679024\" pid:3975 exit_status:1 exited_at:{seconds:1755045217 nanos:835924848}" Aug 13 00:33:38.127618 kubelet[2741]: E0813 
00:33:38.127536 2741 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Aug 13 00:33:38.128021 kubelet[2741]: E0813 00:33:38.127676 2741 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c91f4e19-1004-48ef-a9d8-5b715bc0c46d-whisker-ca-bundle podName:c91f4e19-1004-48ef-a9d8-5b715bc0c46d nodeName:}" failed. No retries permitted until 2025-08-13 00:33:38.627653585 +0000 UTC m=+34.144238572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/c91f4e19-1004-48ef-a9d8-5b715bc0c46d-whisker-ca-bundle") pod "whisker-788c959bb5-scz89" (UID: "c91f4e19-1004-48ef-a9d8-5b715bc0c46d") : failed to sync configmap cache: timed out waiting for the condition Aug 13 00:33:38.128446 kubelet[2741]: E0813 00:33:38.128409 2741 secret.go:189] Couldn't get secret calico-system/whisker-backend-key-pair: failed to sync secret cache: timed out waiting for the condition Aug 13 00:33:38.129325 kubelet[2741]: E0813 00:33:38.129287 2741 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c91f4e19-1004-48ef-a9d8-5b715bc0c46d-whisker-backend-key-pair podName:c91f4e19-1004-48ef-a9d8-5b715bc0c46d nodeName:}" failed. No retries permitted until 2025-08-13 00:33:38.629267931 +0000 UTC m=+34.145852908 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "whisker-backend-key-pair" (UniqueName: "kubernetes.io/secret/c91f4e19-1004-48ef-a9d8-5b715bc0c46d-whisker-backend-key-pair") pod "whisker-788c959bb5-scz89" (UID: "c91f4e19-1004-48ef-a9d8-5b715bc0c46d") : failed to sync secret cache: timed out waiting for the condition Aug 13 00:33:38.592636 kubelet[2741]: I0813 00:33:38.592552 2741 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4" path="/var/lib/kubelet/pods/fa7b7a0c-b718-4ff0-b03f-b5bbd4b6a0c4/volumes" Aug 13 00:33:38.666276 containerd[1576]: time="2025-08-13T00:33:38.666230276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-788c959bb5-scz89,Uid:c91f4e19-1004-48ef-a9d8-5b715bc0c46d,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:38.983869 systemd-networkd[1480]: calic6bc1dbac57: Link UP Aug 13 00:33:38.985399 systemd-networkd[1480]: calic6bc1dbac57: Gained carrier Aug 13 00:33:39.002954 containerd[1576]: 2025-08-13 00:33:38.722 [INFO][4011] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:33:39.002954 containerd[1576]: 2025-08-13 00:33:38.764 [INFO][4011] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0 whisker-788c959bb5- calico-system c91f4e19-1004-48ef-a9d8-5b715bc0c46d 886 0 2025-08-13 00:33:36 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:788c959bb5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c whisker-788c959bb5-scz89 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calic6bc1dbac57 [] [] }} ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" 
WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-" Aug 13 00:33:39.002954 containerd[1576]: 2025-08-13 00:33:38.765 [INFO][4011] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" Aug 13 00:33:39.002954 containerd[1576]: 2025-08-13 00:33:38.918 [INFO][4023] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" HandleID="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Workload="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.921 [INFO][4023] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" HandleID="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Workload="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5610), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"whisker-788c959bb5-scz89", "timestamp":"2025-08-13 00:33:38.918710481 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.921 [INFO][4023] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.922 [INFO][4023] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.922 [INFO][4023] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c' Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.935 [INFO][4023] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.947 [INFO][4023] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.954 [INFO][4023] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.956 [INFO][4023] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.003184 containerd[1576]: 2025-08-13 00:33:38.958 [INFO][4023] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.004003 containerd[1576]: 2025-08-13 00:33:38.958 [INFO][4023] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.64.128/26 handle="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.004003 containerd[1576]: 2025-08-13 00:33:38.960 [INFO][4023] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840 Aug 13 00:33:39.004003 containerd[1576]: 2025-08-13 00:33:38.966 [INFO][4023] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.64.128/26 handle="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.004003 containerd[1576]: 2025-08-13 00:33:38.972 [INFO][4023] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.64.129/26] block=192.168.64.128/26 handle="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.004003 containerd[1576]: 2025-08-13 00:33:38.972 [INFO][4023] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.129/26] handle="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:39.004003 containerd[1576]: 2025-08-13 00:33:38.972 [INFO][4023] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:33:39.004003 containerd[1576]: 2025-08-13 00:33:38.972 [INFO][4023] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.129/26] IPv6=[] ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" HandleID="k8s-pod-network.37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Workload="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" Aug 13 00:33:39.004387 containerd[1576]: 2025-08-13 00:33:38.974 [INFO][4011] cni-plugin/k8s.go 418: Populated endpoint ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0", GenerateName:"whisker-788c959bb5-", Namespace:"calico-system", SelfLink:"", UID:"c91f4e19-1004-48ef-a9d8-5b715bc0c46d", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"788c959bb5", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"whisker-788c959bb5-scz89", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.64.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic6bc1dbac57", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:39.004387 containerd[1576]: 2025-08-13 00:33:38.974 [INFO][4011] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.129/32] ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" Aug 13 00:33:39.004598 containerd[1576]: 2025-08-13 00:33:38.974 [INFO][4011] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic6bc1dbac57 ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" Aug 13 00:33:39.004598 containerd[1576]: 2025-08-13 00:33:38.986 [INFO][4011] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" Aug 13 00:33:39.004763 containerd[1576]: 2025-08-13 00:33:38.986 [INFO][4011] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0", GenerateName:"whisker-788c959bb5-", Namespace:"calico-system", SelfLink:"", UID:"c91f4e19-1004-48ef-a9d8-5b715bc0c46d", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"788c959bb5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840", Pod:"whisker-788c959bb5-scz89", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.64.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calic6bc1dbac57", MAC:"42:c0:3f:fb:4c:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:39.004961 containerd[1576]: 2025-08-13 00:33:38.996 [INFO][4011] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" Namespace="calico-system" Pod="whisker-788c959bb5-scz89" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-whisker--788c959bb5--scz89-eth0" Aug 13 00:33:39.070679 containerd[1576]: time="2025-08-13T00:33:39.070622795Z" level=info msg="connecting to shim 37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840" address="unix:///run/containerd/s/a49c9b3e1350477501e0e54865d2877fbdef00b2fd90dc9603947723c6f908ae" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:33:39.091486 systemd[1]: Started cri-containerd-37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840.scope - libcontainer container 37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840. Aug 13 00:33:39.133603 containerd[1576]: time="2025-08-13T00:33:39.133559105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-788c959bb5-scz89,Uid:c91f4e19-1004-48ef-a9d8-5b715bc0c46d,Namespace:calico-system,Attempt:0,} returns sandbox id \"37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840\"" Aug 13 00:33:39.135495 containerd[1576]: time="2025-08-13T00:33:39.135342539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 13 00:33:40.039675 systemd-networkd[1480]: calic6bc1dbac57: Gained IPv6LL Aug 13 00:33:40.735390 containerd[1576]: time="2025-08-13T00:33:40.734341950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:40.737669 containerd[1576]: time="2025-08-13T00:33:40.737634754Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 13 00:33:40.738798 containerd[1576]: time="2025-08-13T00:33:40.738742872Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 
00:33:40.744865 containerd[1576]: time="2025-08-13T00:33:40.742980086Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:40.745406 containerd[1576]: time="2025-08-13T00:33:40.744847107Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.60923874s" Aug 13 00:33:40.745677 containerd[1576]: time="2025-08-13T00:33:40.745584550Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 13 00:33:40.751038 containerd[1576]: time="2025-08-13T00:33:40.750980378Z" level=info msg="CreateContainer within sandbox \"37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 13 00:33:40.767038 containerd[1576]: time="2025-08-13T00:33:40.766994274Z" level=info msg="Container dcb593fd151ef798c8e11c544191e570f5db9aa5bc6b39607f36d62b5a74f9a5: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:40.774981 containerd[1576]: time="2025-08-13T00:33:40.774938330Z" level=info msg="CreateContainer within sandbox \"37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"dcb593fd151ef798c8e11c544191e570f5db9aa5bc6b39607f36d62b5a74f9a5\"" Aug 13 00:33:40.775410 containerd[1576]: time="2025-08-13T00:33:40.775357016Z" level=info msg="StartContainer for \"dcb593fd151ef798c8e11c544191e570f5db9aa5bc6b39607f36d62b5a74f9a5\"" Aug 13 00:33:40.776251 
containerd[1576]: time="2025-08-13T00:33:40.776208472Z" level=info msg="connecting to shim dcb593fd151ef798c8e11c544191e570f5db9aa5bc6b39607f36d62b5a74f9a5" address="unix:///run/containerd/s/a49c9b3e1350477501e0e54865d2877fbdef00b2fd90dc9603947723c6f908ae" protocol=ttrpc version=3 Aug 13 00:33:40.796664 systemd[1]: Started cri-containerd-dcb593fd151ef798c8e11c544191e570f5db9aa5bc6b39607f36d62b5a74f9a5.scope - libcontainer container dcb593fd151ef798c8e11c544191e570f5db9aa5bc6b39607f36d62b5a74f9a5. Aug 13 00:33:40.861423 containerd[1576]: time="2025-08-13T00:33:40.861337655Z" level=info msg="StartContainer for \"dcb593fd151ef798c8e11c544191e570f5db9aa5bc6b39607f36d62b5a74f9a5\" returns successfully" Aug 13 00:33:40.862589 containerd[1576]: time="2025-08-13T00:33:40.862542713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 13 00:33:42.592649 containerd[1576]: time="2025-08-13T00:33:42.592587641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-r465b,Uid:3e889cea-019a-4296-b7d3-2bcaff0d62fa,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:42.595081 containerd[1576]: time="2025-08-13T00:33:42.594585557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rf9dq,Uid:d0d66f86-eb5e-4570-9700-e2acb101d000,Namespace:kube-system,Attempt:0,}" Aug 13 00:33:42.595666 containerd[1576]: time="2025-08-13T00:33:42.595627251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xjs9d,Uid:54f0db10-9dd8-41e1-a015-ed494fda532f,Namespace:kube-system,Attempt:0,}" Aug 13 00:33:42.811872 systemd-networkd[1480]: calif1bc7a21a95: Link UP Aug 13 00:33:42.812665 systemd-networkd[1480]: calif1bc7a21a95: Gained carrier Aug 13 00:33:42.830622 containerd[1576]: 2025-08-13 00:33:42.684 [INFO][4215] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:33:42.830622 containerd[1576]: 2025-08-13 00:33:42.712 [INFO][4215] cni-plugin/plugin.go 340: Calico CNI found 
existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0 coredns-7c65d6cfc9- kube-system 54f0db10-9dd8-41e1-a015-ed494fda532f 822 0 2025-08-13 00:33:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c coredns-7c65d6cfc9-xjs9d eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif1bc7a21a95 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-" Aug 13 00:33:42.830622 containerd[1576]: 2025-08-13 00:33:42.712 [INFO][4215] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" Aug 13 00:33:42.830622 containerd[1576]: 2025-08-13 00:33:42.755 [INFO][4232] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" HandleID="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Workload="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.755 [INFO][4232] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" HandleID="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Workload="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0002d5980), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"coredns-7c65d6cfc9-xjs9d", "timestamp":"2025-08-13 00:33:42.755159554 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.755 [INFO][4232] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.755 [INFO][4232] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.755 [INFO][4232] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c' Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.761 [INFO][4232] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.767 [INFO][4232] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.781 [INFO][4232] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.783 [INFO][4232] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830795 containerd[1576]: 2025-08-13 00:33:42.786 [INFO][4232] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830949 containerd[1576]: 2025-08-13 00:33:42.786 [INFO][4232] ipam/ipam.go 1220: Attempting to assign 1 addresses from block 
block=192.168.64.128/26 handle="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830949 containerd[1576]: 2025-08-13 00:33:42.790 [INFO][4232] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f Aug 13 00:33:42.830949 containerd[1576]: 2025-08-13 00:33:42.797 [INFO][4232] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.64.128/26 handle="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830949 containerd[1576]: 2025-08-13 00:33:42.803 [INFO][4232] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.64.130/26] block=192.168.64.128/26 handle="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830949 containerd[1576]: 2025-08-13 00:33:42.803 [INFO][4232] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.130/26] handle="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.830949 containerd[1576]: 2025-08-13 00:33:42.803 [INFO][4232] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 00:33:42.830949 containerd[1576]: 2025-08-13 00:33:42.803 [INFO][4232] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.130/26] IPv6=[] ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" HandleID="k8s-pod-network.7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Workload="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" Aug 13 00:33:42.832262 containerd[1576]: 2025-08-13 00:33:42.807 [INFO][4215] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"54f0db10-9dd8-41e1-a015-ed494fda532f", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"coredns-7c65d6cfc9-xjs9d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.64.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calif1bc7a21a95", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:42.832262 containerd[1576]: 2025-08-13 00:33:42.808 [INFO][4215] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.130/32] ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" Aug 13 00:33:42.832262 containerd[1576]: 2025-08-13 00:33:42.808 [INFO][4215] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif1bc7a21a95 ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" Aug 13 00:33:42.832262 containerd[1576]: 2025-08-13 00:33:42.812 [INFO][4215] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" Aug 13 00:33:42.832262 containerd[1576]: 2025-08-13 00:33:42.812 [INFO][4215] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" 
WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"54f0db10-9dd8-41e1-a015-ed494fda532f", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f", Pod:"coredns-7c65d6cfc9-xjs9d", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.64.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif1bc7a21a95", MAC:"92:d6:63:f4:84:38", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:42.832262 
containerd[1576]: 2025-08-13 00:33:42.821 [INFO][4215] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xjs9d" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--xjs9d-eth0" Aug 13 00:33:42.860334 containerd[1576]: time="2025-08-13T00:33:42.859791867Z" level=info msg="connecting to shim 7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f" address="unix:///run/containerd/s/193fdabeb2debf25f58cde69ebd1060999d6ef754e15adf51edab839a3868c10" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:33:42.899543 systemd[1]: Started cri-containerd-7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f.scope - libcontainer container 7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f. Aug 13 00:33:42.934784 systemd-networkd[1480]: calib89d9ed065f: Link UP Aug 13 00:33:42.936362 systemd-networkd[1480]: calib89d9ed065f: Gained carrier Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.668 [INFO][4190] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.685 [INFO][4190] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0 csi-node-driver- calico-system 3e889cea-019a-4296-b7d3-2bcaff0d62fa 722 0 2025-08-13 00:33:23 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c csi-node-driver-r465b eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib89d9ed065f [] [] }} 
ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.685 [INFO][4190] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.774 [INFO][4224] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" HandleID="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Workload="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.774 [INFO][4224] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" HandleID="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Workload="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032cf30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"csi-node-driver-r465b", "timestamp":"2025-08-13 00:33:42.774849905 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.775 [INFO][4224] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.803 [INFO][4224] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.803 [INFO][4224] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c' Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.864 [INFO][4224] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.871 [INFO][4224] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.881 [INFO][4224] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.883 [INFO][4224] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.887 [INFO][4224] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.888 [INFO][4224] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.64.128/26 handle="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.890 [INFO][4224] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455 Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.897 [INFO][4224] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.64.128/26 handle="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" 
host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.910 [INFO][4224] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.64.131/26] block=192.168.64.128/26 handle="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.910 [INFO][4224] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.131/26] handle="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.910 [INFO][4224] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:33:42.960476 containerd[1576]: 2025-08-13 00:33:42.910 [INFO][4224] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.131/26] IPv6=[] ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" HandleID="k8s-pod-network.5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Workload="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" Aug 13 00:33:42.961032 containerd[1576]: 2025-08-13 00:33:42.923 [INFO][4190] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3e889cea-019a-4296-b7d3-2bcaff0d62fa", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"csi-node-driver-r465b", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.64.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib89d9ed065f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:42.961032 containerd[1576]: 2025-08-13 00:33:42.927 [INFO][4190] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.131/32] ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" Aug 13 00:33:42.961032 containerd[1576]: 2025-08-13 00:33:42.927 [INFO][4190] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib89d9ed065f ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" Aug 13 00:33:42.961032 containerd[1576]: 2025-08-13 00:33:42.935 [INFO][4190] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" 
Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0"
Aug 13 00:33:42.961032 containerd[1576]: 2025-08-13 00:33:42.936 [INFO][4190] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3e889cea-019a-4296-b7d3-2bcaff0d62fa", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 23, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455", Pod:"csi-node-driver-r465b", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.64.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib89d9ed065f", MAC:"ce:3a:3a:e3:03:7e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 13 00:33:42.961032 containerd[1576]: 2025-08-13 00:33:42.954 [INFO][4190] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" Namespace="calico-system" Pod="csi-node-driver-r465b" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-csi--node--driver--r465b-eth0"
Aug 13 00:33:43.020156 systemd-networkd[1480]: cali458f08cb254: Link UP
Aug 13 00:33:43.021835 systemd-networkd[1480]: cali458f08cb254: Gained carrier
Aug 13 00:33:43.034972 containerd[1576]: time="2025-08-13T00:33:43.034852817Z" level=info msg="connecting to shim 5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455" address="unix:///run/containerd/s/cdabbe9fcd395babcecbc324b6b47430fe11c634906e6c3dac436cf3732076fa" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.683 [INFO][4206] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.709 [INFO][4206] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0 coredns-7c65d6cfc9- kube-system d0d66f86-eb5e-4570-9700-e2acb101d000 820 0 2025-08-13 00:33:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c coredns-7c65d6cfc9-rf9dq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali458f08cb254 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.709 [INFO][4206] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.789 [INFO][4230] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" HandleID="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Workload="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.791 [INFO][4230] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" HandleID="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Workload="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fbd0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"coredns-7c65d6cfc9-rf9dq", "timestamp":"2025-08-13 00:33:42.789227053 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.792 [INFO][4230] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.911 [INFO][4230] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.911 [INFO][4230] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c'
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.965 [INFO][4230] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.977 [INFO][4230] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.984 [INFO][4230] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.988 [INFO][4230] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.991 [INFO][4230] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.991 [INFO][4230] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.64.128/26 handle="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.992 [INFO][4230] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:42.999 [INFO][4230] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.64.128/26 handle="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:43.005 [INFO][4230] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.64.132/26] block=192.168.64.128/26 handle="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:43.005 [INFO][4230] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.132/26] handle="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:43.006 [INFO][4230] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Aug 13 00:33:43.054585 containerd[1576]: 2025-08-13 00:33:43.006 [INFO][4230] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.132/26] IPv6=[] ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" HandleID="k8s-pod-network.2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Workload="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0"
Aug 13 00:33:43.055500 containerd[1576]: 2025-08-13 00:33:43.014 [INFO][4206] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d0d66f86-eb5e-4570-9700-e2acb101d000", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 10, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"coredns-7c65d6cfc9-rf9dq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.64.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali458f08cb254", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 13 00:33:43.055500 containerd[1576]: 2025-08-13 00:33:43.014 [INFO][4206] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.132/32] ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0"
Aug 13 00:33:43.055500 containerd[1576]: 2025-08-13 00:33:43.014 [INFO][4206] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali458f08cb254 ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0"
Aug 13 00:33:43.055500 containerd[1576]: 2025-08-13 00:33:43.027 [INFO][4206] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0"
Aug 13 00:33:43.055500 containerd[1576]: 2025-08-13 00:33:43.030 [INFO][4206] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"d0d66f86-eb5e-4570-9700-e2acb101d000", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 10, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e", Pod:"coredns-7c65d6cfc9-rf9dq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.64.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali458f08cb254", MAC:"ea:03:dd:aa:da:d1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 13 00:33:43.055500 containerd[1576]: 2025-08-13 00:33:43.046 [INFO][4206] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rf9dq" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-coredns--7c65d6cfc9--rf9dq-eth0"
Aug 13 00:33:43.063504 containerd[1576]: time="2025-08-13T00:33:43.063407691Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xjs9d,Uid:54f0db10-9dd8-41e1-a015-ed494fda532f,Namespace:kube-system,Attempt:0,} returns sandbox id \"7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f\""
Aug 13 00:33:43.073863 containerd[1576]: time="2025-08-13T00:33:43.073436765Z" level=info msg="CreateContainer within sandbox \"7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Aug 13 00:33:43.092576 systemd[1]: Started cri-containerd-5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455.scope - libcontainer container 5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455.
Aug 13 00:33:43.120739 containerd[1576]: time="2025-08-13T00:33:43.120702923Z" level=info msg="Container e9ff97ca99a64be712586d2aff31342c913fee3fa09ce835b21f88d987ec4a3c: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:43.147890 containerd[1576]: time="2025-08-13T00:33:43.147861417Z" level=info msg="CreateContainer within sandbox \"7f91e97338f39515e12cbb07b5cbd20cfee434ef048ceff0bfde71161160a62f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e9ff97ca99a64be712586d2aff31342c913fee3fa09ce835b21f88d987ec4a3c\""
Aug 13 00:33:43.149564 containerd[1576]: time="2025-08-13T00:33:43.149519626Z" level=info msg="StartContainer for \"e9ff97ca99a64be712586d2aff31342c913fee3fa09ce835b21f88d987ec4a3c\""
Aug 13 00:33:43.152598 containerd[1576]: time="2025-08-13T00:33:43.152579113Z" level=info msg="connecting to shim 2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e" address="unix:///run/containerd/s/552f8427da99e21868e69a4638810aa65b0cd927a527191f2ae4afd94f8d301b" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:43.154489 containerd[1576]: time="2025-08-13T00:33:43.154469448Z" level=info msg="connecting to shim e9ff97ca99a64be712586d2aff31342c913fee3fa09ce835b21f88d987ec4a3c" address="unix:///run/containerd/s/193fdabeb2debf25f58cde69ebd1060999d6ef754e15adf51edab839a3868c10" protocol=ttrpc version=3
Aug 13 00:33:43.176372 containerd[1576]: time="2025-08-13T00:33:43.176320840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-r465b,Uid:3e889cea-019a-4296-b7d3-2bcaff0d62fa,Namespace:calico-system,Attempt:0,} returns sandbox id \"5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455\""
Aug 13 00:33:43.204505 systemd[1]: Started cri-containerd-e9ff97ca99a64be712586d2aff31342c913fee3fa09ce835b21f88d987ec4a3c.scope - libcontainer container e9ff97ca99a64be712586d2aff31342c913fee3fa09ce835b21f88d987ec4a3c.
Aug 13 00:33:43.208004 systemd[1]: Started cri-containerd-2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e.scope - libcontainer container 2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e.
Aug 13 00:33:43.246340 containerd[1576]: time="2025-08-13T00:33:43.246308252Z" level=info msg="StartContainer for \"e9ff97ca99a64be712586d2aff31342c913fee3fa09ce835b21f88d987ec4a3c\" returns successfully"
Aug 13 00:33:43.281594 containerd[1576]: time="2025-08-13T00:33:43.281509496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rf9dq,Uid:d0d66f86-eb5e-4570-9700-e2acb101d000,Namespace:kube-system,Attempt:0,} returns sandbox id \"2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e\""
Aug 13 00:33:43.286117 containerd[1576]: time="2025-08-13T00:33:43.286081028Z" level=info msg="CreateContainer within sandbox \"2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Aug 13 00:33:43.297090 containerd[1576]: time="2025-08-13T00:33:43.297011333Z" level=info msg="Container 02053735605f09014e12b6a011b7448e9b63f4ee0f70b8e5d18db6055f3f4f78: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:43.319091 containerd[1576]: time="2025-08-13T00:33:43.319046370Z" level=info msg="CreateContainer within sandbox \"2ba9a060933cc5c4f41a516a126fa069b8ecaf623f78964fcf096e7b1585b72e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"02053735605f09014e12b6a011b7448e9b63f4ee0f70b8e5d18db6055f3f4f78\""
Aug 13 00:33:43.320874 containerd[1576]: time="2025-08-13T00:33:43.320662761Z" level=info msg="StartContainer for \"02053735605f09014e12b6a011b7448e9b63f4ee0f70b8e5d18db6055f3f4f78\""
Aug 13 00:33:43.322003 containerd[1576]: time="2025-08-13T00:33:43.321975002Z" level=info msg="connecting to shim 02053735605f09014e12b6a011b7448e9b63f4ee0f70b8e5d18db6055f3f4f78" address="unix:///run/containerd/s/552f8427da99e21868e69a4638810aa65b0cd927a527191f2ae4afd94f8d301b" protocol=ttrpc version=3
Aug 13 00:33:43.350499 systemd[1]: Started cri-containerd-02053735605f09014e12b6a011b7448e9b63f4ee0f70b8e5d18db6055f3f4f78.scope - libcontainer container 02053735605f09014e12b6a011b7448e9b63f4ee0f70b8e5d18db6055f3f4f78.
Aug 13 00:33:43.362583 containerd[1576]: time="2025-08-13T00:33:43.362527130Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:43.364505 containerd[1576]: time="2025-08-13T00:33:43.364484851Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477"
Aug 13 00:33:43.367229 containerd[1576]: time="2025-08-13T00:33:43.367105425Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:43.369789 containerd[1576]: time="2025-08-13T00:33:43.369622545Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:33:43.371931 containerd[1576]: time="2025-08-13T00:33:43.371458317Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.508740205s"
Aug 13 00:33:43.371931 containerd[1576]: time="2025-08-13T00:33:43.371480398Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\""
Aug 13 00:33:43.372535 containerd[1576]: time="2025-08-13T00:33:43.372490983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\""
Aug 13 00:33:43.374082 containerd[1576]: time="2025-08-13T00:33:43.374064874Z" level=info msg="CreateContainer within sandbox \"37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}"
Aug 13 00:33:43.381508 containerd[1576]: time="2025-08-13T00:33:43.381479156Z" level=info msg="Container da3de3b925b3eab6dab3296e1056678d9057f0ce6136b3c5340f69379e0b9f70: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:33:43.395649 containerd[1576]: time="2025-08-13T00:33:43.395570769Z" level=info msg="StartContainer for \"02053735605f09014e12b6a011b7448e9b63f4ee0f70b8e5d18db6055f3f4f78\" returns successfully"
Aug 13 00:33:43.399178 containerd[1576]: time="2025-08-13T00:33:43.399127729Z" level=info msg="CreateContainer within sandbox \"37d663437a1331d60919af1874ead6030773fd2689bbf51647ee1a3c5bf13840\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"da3de3b925b3eab6dab3296e1056678d9057f0ce6136b3c5340f69379e0b9f70\""
Aug 13 00:33:43.399476 containerd[1576]: time="2025-08-13T00:33:43.399441137Z" level=info msg="StartContainer for \"da3de3b925b3eab6dab3296e1056678d9057f0ce6136b3c5340f69379e0b9f70\""
Aug 13 00:33:43.400200 containerd[1576]: time="2025-08-13T00:33:43.400175774Z" level=info msg="connecting to shim da3de3b925b3eab6dab3296e1056678d9057f0ce6136b3c5340f69379e0b9f70" address="unix:///run/containerd/s/a49c9b3e1350477501e0e54865d2877fbdef00b2fd90dc9603947723c6f908ae" protocol=ttrpc version=3
Aug 13 00:33:43.421517 systemd[1]: Started cri-containerd-da3de3b925b3eab6dab3296e1056678d9057f0ce6136b3c5340f69379e0b9f70.scope - libcontainer container da3de3b925b3eab6dab3296e1056678d9057f0ce6136b3c5340f69379e0b9f70.
Aug 13 00:33:43.464881 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2896161049.mount: Deactivated successfully.
Aug 13 00:33:43.492063 containerd[1576]: time="2025-08-13T00:33:43.492029867Z" level=info msg="StartContainer for \"da3de3b925b3eab6dab3296e1056678d9057f0ce6136b3c5340f69379e0b9f70\" returns successfully"
Aug 13 00:33:43.648991 containerd[1576]: time="2025-08-13T00:33:43.648812124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-rgxvb,Uid:e88815b6-96db-45a3-8446-237a9c6bee1f,Namespace:calico-apiserver,Attempt:0,}"
Aug 13 00:33:43.895208 systemd-networkd[1480]: cali54be761b51d: Link UP
Aug 13 00:33:43.896337 systemd-networkd[1480]: cali54be761b51d: Gained carrier
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.749 [INFO][4524] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.769 [INFO][4524] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0 calico-apiserver-5c7f84bf9d- calico-apiserver e88815b6-96db-45a3-8446-237a9c6bee1f 817 0 2025-08-13 00:33:19 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c7f84bf9d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c calico-apiserver-5c7f84bf9d-rgxvb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali54be761b51d [] [] }} ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.769 [INFO][4524] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.813 [INFO][4537] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" HandleID="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.813 [INFO][4537] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" HandleID="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"calico-apiserver-5c7f84bf9d-rgxvb", "timestamp":"2025-08-13 00:33:43.813169516 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.813 [INFO][4537] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.813 [INFO][4537] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.813 [INFO][4537] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c'
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.829 [INFO][4537] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.841 [INFO][4537] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.855 [INFO][4537] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.859 [INFO][4537] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.864 [INFO][4537] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.864 [INFO][4537] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.64.128/26 handle="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.867 [INFO][4537] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.874 [INFO][4537] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.64.128/26 handle="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.883 [INFO][4537] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.64.133/26] block=192.168.64.128/26 handle="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.884 [INFO][4537] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.133/26] handle="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" host="ci-4372-1-0-4-15a6623c0c"
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.884 [INFO][4537] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Aug 13 00:33:43.922652 containerd[1576]: 2025-08-13 00:33:43.884 [INFO][4537] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.133/26] IPv6=[] ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" HandleID="k8s-pod-network.0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0"
Aug 13 00:33:43.925620 containerd[1576]: 2025-08-13 00:33:43.888 [INFO][4524] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0", GenerateName:"calico-apiserver-5c7f84bf9d-", Namespace:"calico-apiserver", SelfLink:"", UID:"e88815b6-96db-45a3-8446-237a9c6bee1f", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 19, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f84bf9d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"calico-apiserver-5c7f84bf9d-rgxvb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.64.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali54be761b51d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 13 00:33:43.925620 containerd[1576]: 2025-08-13 00:33:43.888 [INFO][4524] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.133/32] ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0"
Aug 13 00:33:43.925620 containerd[1576]: 2025-08-13 00:33:43.888 [INFO][4524] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali54be761b51d ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0"
Aug 13 00:33:43.925620 containerd[1576]: 2025-08-13 00:33:43.896 [INFO][4524] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0"
Aug 13 00:33:43.925620 containerd[1576]: 2025-08-13 00:33:43.897 [INFO][4524] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0", GenerateName:"calico-apiserver-5c7f84bf9d-", Namespace:"calico-apiserver", SelfLink:"", UID:"e88815b6-96db-45a3-8446-237a9c6bee1f", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 19, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f84bf9d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4", Pod:"calico-apiserver-5c7f84bf9d-rgxvb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.64.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali54be761b51d", MAC:"06:78:b9:97:79:bb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 13 00:33:43.925620 containerd[1576]: 2025-08-13 00:33:43.916 [INFO][4524] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-rgxvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--rgxvb-eth0"
Aug 13 00:33:43.927109 kubelet[2741]: I0813 00:33:43.927072 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-xjs9d" podStartSLOduration=33.92705581 podStartE2EDuration="33.92705581s" podCreationTimestamp="2025-08-13 00:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:33:43.904660097 +0000 UTC m=+39.421245063" watchObservedRunningTime="2025-08-13 00:33:43.92705581 +0000 UTC m=+39.443640777"
Aug 13 00:33:43.934819 kubelet[2741]: I0813 00:33:43.934607 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-rf9dq" podStartSLOduration=33.934592582 podStartE2EDuration="33.934592582s" podCreationTimestamp="2025-08-13 00:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:33:43.934198083 +0000 UTC m=+39.450783070" watchObservedRunningTime="2025-08-13 00:33:43.934592582 +0000 UTC m=+39.451177549"
Aug 13 00:33:43.973974 containerd[1576]: time="2025-08-13T00:33:43.973938168Z" level=info msg="connecting to shim 0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4" address="unix:///run/containerd/s/9604d955b2be2a42d3bc9cbb52f854656ccc60fb08c44f11d3d21462e64dddc1" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:33:43.995596 kubelet[2741]: I0813 00:33:43.994115 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-788c959bb5-scz89" podStartSLOduration=3.757051586 podStartE2EDuration="7.994097628s" podCreationTimestamp="2025-08-13 00:33:36 +0000 UTC" firstStartedPulling="2025-08-13 00:33:39.1349486 +0000 UTC m=+34.651533568" lastFinishedPulling="2025-08-13 00:33:43.371994643 +0000 UTC m=+38.888579610" observedRunningTime="2025-08-13 00:33:43.990835722 +0000 UTC m=+39.507420688" watchObservedRunningTime="2025-08-13 00:33:43.994097628 +0000 UTC m=+39.510682595" Aug 13 00:33:44.007536 systemd[1]: Started cri-containerd-0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4.scope - libcontainer container 0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4. Aug 13 00:33:44.007738 systemd-networkd[1480]: calib89d9ed065f: Gained IPv6LL Aug 13 00:33:44.050736 containerd[1576]: time="2025-08-13T00:33:44.050612618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-rgxvb,Uid:e88815b6-96db-45a3-8446-237a9c6bee1f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4\"" Aug 13 00:33:44.583818 systemd-networkd[1480]: calif1bc7a21a95: Gained IPv6LL Aug 13 00:33:45.028900 containerd[1576]: time="2025-08-13T00:33:45.028848039Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:45.030439 containerd[1576]: time="2025-08-13T00:33:45.030401411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Aug 13 00:33:45.031517 containerd[1576]: time="2025-08-13T00:33:45.031320665Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 
00:33:45.032123 systemd-networkd[1480]: cali458f08cb254: Gained IPv6LL Aug 13 00:33:45.033799 containerd[1576]: time="2025-08-13T00:33:45.033760510Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:45.034244 containerd[1576]: time="2025-08-13T00:33:45.034218910Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.661708329s" Aug 13 00:33:45.034284 containerd[1576]: time="2025-08-13T00:33:45.034245209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Aug 13 00:33:45.035226 containerd[1576]: time="2025-08-13T00:33:45.035144064Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:33:45.036524 containerd[1576]: time="2025-08-13T00:33:45.036495819Z" level=info msg="CreateContainer within sandbox \"5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 13 00:33:45.067449 containerd[1576]: time="2025-08-13T00:33:45.067417730Z" level=info msg="Container 881c8398c7b6b8536560335e5f6178afb5df43c71a7b7997fbcfcc3c2d84cab3: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:45.070181 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3543893751.mount: Deactivated successfully. 
Aug 13 00:33:45.100569 containerd[1576]: time="2025-08-13T00:33:45.100526070Z" level=info msg="CreateContainer within sandbox \"5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"881c8398c7b6b8536560335e5f6178afb5df43c71a7b7997fbcfcc3c2d84cab3\"" Aug 13 00:33:45.102268 containerd[1576]: time="2025-08-13T00:33:45.101173073Z" level=info msg="StartContainer for \"881c8398c7b6b8536560335e5f6178afb5df43c71a7b7997fbcfcc3c2d84cab3\"" Aug 13 00:33:45.102413 containerd[1576]: time="2025-08-13T00:33:45.102394353Z" level=info msg="connecting to shim 881c8398c7b6b8536560335e5f6178afb5df43c71a7b7997fbcfcc3c2d84cab3" address="unix:///run/containerd/s/cdabbe9fcd395babcecbc324b6b47430fe11c634906e6c3dac436cf3732076fa" protocol=ttrpc version=3 Aug 13 00:33:45.122468 systemd[1]: Started cri-containerd-881c8398c7b6b8536560335e5f6178afb5df43c71a7b7997fbcfcc3c2d84cab3.scope - libcontainer container 881c8398c7b6b8536560335e5f6178afb5df43c71a7b7997fbcfcc3c2d84cab3. 
Aug 13 00:33:45.157526 containerd[1576]: time="2025-08-13T00:33:45.157491574Z" level=info msg="StartContainer for \"881c8398c7b6b8536560335e5f6178afb5df43c71a7b7997fbcfcc3c2d84cab3\" returns successfully" Aug 13 00:33:45.863567 systemd-networkd[1480]: cali54be761b51d: Gained IPv6LL Aug 13 00:33:46.590321 containerd[1576]: time="2025-08-13T00:33:46.590274477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-8qmvb,Uid:0cf431c3-630f-4c35-a27a-549702718992,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:33:46.792295 systemd-networkd[1480]: cali276c39a94eb: Link UP Aug 13 00:33:46.797535 systemd-networkd[1480]: cali276c39a94eb: Gained carrier Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.616 [INFO][4708] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.627 [INFO][4708] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0 calico-apiserver-5c7f84bf9d- calico-apiserver 0cf431c3-630f-4c35-a27a-549702718992 825 0 2025-08-13 00:33:19 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c7f84bf9d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c calico-apiserver-5c7f84bf9d-8qmvb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali276c39a94eb [] [] }} ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.627 [INFO][4708] cni-plugin/k8s.go 74: Extracted 
identifiers for CmdAddK8s ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.660 [INFO][4719] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" HandleID="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.661 [INFO][4719] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" HandleID="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf270), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"calico-apiserver-5c7f84bf9d-8qmvb", "timestamp":"2025-08-13 00:33:46.660924802 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.661 [INFO][4719] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.661 [INFO][4719] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.661 [INFO][4719] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c' Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.667 [INFO][4719] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.752 [INFO][4719] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.757 [INFO][4719] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.759 [INFO][4719] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.761 [INFO][4719] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.761 [INFO][4719] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.64.128/26 handle="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.763 [INFO][4719] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6 Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.768 [INFO][4719] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.64.128/26 handle="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.779 [INFO][4719] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.64.134/26] block=192.168.64.128/26 handle="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.779 [INFO][4719] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.134/26] handle="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.779 [INFO][4719] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:33:46.828599 containerd[1576]: 2025-08-13 00:33:46.779 [INFO][4719] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.134/26] IPv6=[] ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" HandleID="k8s-pod-network.e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" Aug 13 00:33:46.831583 containerd[1576]: 2025-08-13 00:33:46.785 [INFO][4708] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0", GenerateName:"calico-apiserver-5c7f84bf9d-", Namespace:"calico-apiserver", SelfLink:"", UID:"0cf431c3-630f-4c35-a27a-549702718992", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f84bf9d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"calico-apiserver-5c7f84bf9d-8qmvb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.64.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali276c39a94eb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:46.831583 containerd[1576]: 2025-08-13 00:33:46.786 [INFO][4708] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.134/32] ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" Aug 13 00:33:46.831583 containerd[1576]: 2025-08-13 00:33:46.786 [INFO][4708] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali276c39a94eb ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" Aug 13 00:33:46.831583 containerd[1576]: 2025-08-13 00:33:46.799 [INFO][4708] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" 
Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" Aug 13 00:33:46.831583 containerd[1576]: 2025-08-13 00:33:46.801 [INFO][4708] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0", GenerateName:"calico-apiserver-5c7f84bf9d-", Namespace:"calico-apiserver", SelfLink:"", UID:"0cf431c3-630f-4c35-a27a-549702718992", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f84bf9d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6", Pod:"calico-apiserver-5c7f84bf9d-8qmvb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.64.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali276c39a94eb", MAC:"36:d4:ee:76:33:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:46.831583 containerd[1576]: 2025-08-13 00:33:46.817 [INFO][4708] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f84bf9d-8qmvb" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--apiserver--5c7f84bf9d--8qmvb-eth0" Aug 13 00:33:46.866807 containerd[1576]: time="2025-08-13T00:33:46.866723671Z" level=info msg="connecting to shim e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6" address="unix:///run/containerd/s/c6f9d9f7460a318ae94ada51e1bf92d6ddf3d25a436cbd2a928aa1bdc2d8ec46" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:33:46.899563 systemd[1]: Started cri-containerd-e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6.scope - libcontainer container e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6. 
Aug 13 00:33:46.945915 containerd[1576]: time="2025-08-13T00:33:46.945795075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f84bf9d-8qmvb,Uid:0cf431c3-630f-4c35-a27a-549702718992,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6\"" Aug 13 00:33:47.590740 containerd[1576]: time="2025-08-13T00:33:47.590529471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74d8bfcddc-n9p85,Uid:ec8ee7ae-5bd1-4c63-8236-83729cf25c5d,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:47.591297 containerd[1576]: time="2025-08-13T00:33:47.591257106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-wrddn,Uid:f7d00ab8-fa2b-489b-811e-3590e0a3f2d9,Namespace:calico-system,Attempt:0,}" Aug 13 00:33:47.719863 systemd-networkd[1480]: calib3e8f178733: Link UP Aug 13 00:33:47.720060 systemd-networkd[1480]: calib3e8f178733: Gained carrier Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.621 [INFO][4799] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.639 [INFO][4799] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0 calico-kube-controllers-74d8bfcddc- calico-system ec8ee7ae-5bd1-4c63-8236-83729cf25c5d 826 0 2025-08-13 00:33:23 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:74d8bfcddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c calico-kube-controllers-74d8bfcddc-n9p85 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calib3e8f178733 [] [] }} 
ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.639 [INFO][4799] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.668 [INFO][4829] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" HandleID="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.668 [INFO][4829] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" HandleID="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"calico-kube-controllers-74d8bfcddc-n9p85", "timestamp":"2025-08-13 00:33:47.668584791 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:33:47.735679 containerd[1576]: 
2025-08-13 00:33:47.669 [INFO][4829] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.669 [INFO][4829] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.669 [INFO][4829] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c' Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.675 [INFO][4829] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.680 [INFO][4829] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.685 [INFO][4829] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.687 [INFO][4829] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.690 [INFO][4829] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.690 [INFO][4829] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.64.128/26 handle="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.692 [INFO][4829] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0 Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.696 [INFO][4829] ipam/ipam.go 1243: Writing block in order to claim IPs 
block=192.168.64.128/26 handle="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.713 [INFO][4829] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.64.135/26] block=192.168.64.128/26 handle="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.713 [INFO][4829] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.135/26] handle="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.713 [INFO][4829] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:33:47.735679 containerd[1576]: 2025-08-13 00:33:47.713 [INFO][4829] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.135/26] IPv6=[] ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" HandleID="k8s-pod-network.2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Workload="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" Aug 13 00:33:47.736295 containerd[1576]: 2025-08-13 00:33:47.715 [INFO][4799] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0", GenerateName:"calico-kube-controllers-74d8bfcddc-", Namespace:"calico-system", SelfLink:"", 
UID:"ec8ee7ae-5bd1-4c63-8236-83729cf25c5d", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"74d8bfcddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"calico-kube-controllers-74d8bfcddc-n9p85", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.64.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib3e8f178733", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:47.736295 containerd[1576]: 2025-08-13 00:33:47.716 [INFO][4799] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.135/32] ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" Aug 13 00:33:47.736295 containerd[1576]: 2025-08-13 00:33:47.716 [INFO][4799] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib3e8f178733 ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" 
WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" Aug 13 00:33:47.736295 containerd[1576]: 2025-08-13 00:33:47.720 [INFO][4799] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" Aug 13 00:33:47.736295 containerd[1576]: 2025-08-13 00:33:47.720 [INFO][4799] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0", GenerateName:"calico-kube-controllers-74d8bfcddc-", Namespace:"calico-system", SelfLink:"", UID:"ec8ee7ae-5bd1-4c63-8236-83729cf25c5d", ResourceVersion:"826", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"74d8bfcddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", 
ContainerID:"2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0", Pod:"calico-kube-controllers-74d8bfcddc-n9p85", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.64.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calib3e8f178733", MAC:"72:eb:20:84:27:db", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:47.736295 containerd[1576]: 2025-08-13 00:33:47.733 [INFO][4799] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" Namespace="calico-system" Pod="calico-kube-controllers-74d8bfcddc-n9p85" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-calico--kube--controllers--74d8bfcddc--n9p85-eth0" Aug 13 00:33:47.760910 containerd[1576]: time="2025-08-13T00:33:47.760872187Z" level=info msg="connecting to shim 2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0" address="unix:///run/containerd/s/6bfdd3cff5eda9b5dd17e4f979b988516c1059b3aad6ea1edd049d3f0989113f" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:33:47.791600 systemd[1]: Started cri-containerd-2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0.scope - libcontainer container 2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0. 
Aug 13 00:33:47.824841 systemd-networkd[1480]: calidbac8a1dd83: Link UP Aug 13 00:33:47.826030 systemd-networkd[1480]: calidbac8a1dd83: Gained carrier Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.624 [INFO][4811] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.636 [INFO][4811] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0 goldmane-58fd7646b9- calico-system f7d00ab8-fa2b-489b-811e-3590e0a3f2d9 821 0 2025-08-13 00:33:22 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372-1-0-4-15a6623c0c goldmane-58fd7646b9-wrddn eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calidbac8a1dd83 [] [] }} ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.636 [INFO][4811] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.671 [INFO][4824] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" HandleID="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Workload="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" Aug 13 
00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.671 [INFO][4824] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" HandleID="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Workload="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf220), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-4-15a6623c0c", "pod":"goldmane-58fd7646b9-wrddn", "timestamp":"2025-08-13 00:33:47.671574637 +0000 UTC"}, Hostname:"ci-4372-1-0-4-15a6623c0c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.671 [INFO][4824] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.713 [INFO][4824] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.713 [INFO][4824] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-4-15a6623c0c' Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.778 [INFO][4824] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.787 [INFO][4824] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.795 [INFO][4824] ipam/ipam.go 511: Trying affinity for 192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.797 [INFO][4824] ipam/ipam.go 158: Attempting to load block cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.800 [INFO][4824] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.64.128/26 host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.800 [INFO][4824] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.64.128/26 handle="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.802 [INFO][4824] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9 Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.809 [INFO][4824] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.64.128/26 handle="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.817 [INFO][4824] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.64.136/26] block=192.168.64.128/26 handle="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.817 [INFO][4824] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.64.136/26] handle="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" host="ci-4372-1-0-4-15a6623c0c" Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.817 [INFO][4824] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:33:47.844726 containerd[1576]: 2025-08-13 00:33:47.817 [INFO][4824] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.64.136/26] IPv6=[] ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" HandleID="k8s-pod-network.c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Workload="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" Aug 13 00:33:47.846860 containerd[1576]: 2025-08-13 00:33:47.819 [INFO][4811] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"f7d00ab8-fa2b-489b-811e-3590e0a3f2d9", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"", Pod:"goldmane-58fd7646b9-wrddn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.64.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidbac8a1dd83", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:47.846860 containerd[1576]: 2025-08-13 00:33:47.820 [INFO][4811] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.64.136/32] ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" Aug 13 00:33:47.846860 containerd[1576]: 2025-08-13 00:33:47.820 [INFO][4811] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidbac8a1dd83 ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" Aug 13 00:33:47.846860 containerd[1576]: 2025-08-13 00:33:47.825 [INFO][4811] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" Aug 13 00:33:47.846860 containerd[1576]: 2025-08-13 00:33:47.826 [INFO][4811] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"f7d00ab8-fa2b-489b-811e-3590e0a3f2d9", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 33, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-4-15a6623c0c", ContainerID:"c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9", Pod:"goldmane-58fd7646b9-wrddn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.64.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calidbac8a1dd83", MAC:"0e:8e:10:f2:8a:35", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:33:47.846860 containerd[1576]: 2025-08-13 00:33:47.841 [INFO][4811] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" Namespace="calico-system" Pod="goldmane-58fd7646b9-wrddn" WorkloadEndpoint="ci--4372--1--0--4--15a6623c0c-k8s-goldmane--58fd7646b9--wrddn-eth0" Aug 13 00:33:47.862175 containerd[1576]: time="2025-08-13T00:33:47.862096781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74d8bfcddc-n9p85,Uid:ec8ee7ae-5bd1-4c63-8236-83729cf25c5d,Namespace:calico-system,Attempt:0,} returns sandbox id \"2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0\"" Aug 13 00:33:47.878894 containerd[1576]: time="2025-08-13T00:33:47.878820048Z" level=info msg="connecting to shim c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9" address="unix:///run/containerd/s/aaef7e9ee0d17c66b2efe82787209f44deb9d1fad020ce399e23a0a30b9309a7" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:33:47.899488 systemd[1]: Started cri-containerd-c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9.scope - libcontainer container c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9. 
Aug 13 00:33:47.946924 containerd[1576]: time="2025-08-13T00:33:47.946880907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-wrddn,Uid:f7d00ab8-fa2b-489b-811e-3590e0a3f2d9,Namespace:calico-system,Attempt:0,} returns sandbox id \"c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9\"" Aug 13 00:33:48.807499 systemd-networkd[1480]: cali276c39a94eb: Gained IPv6LL Aug 13 00:33:49.221665 containerd[1576]: time="2025-08-13T00:33:49.221606367Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:49.222559 containerd[1576]: time="2025-08-13T00:33:49.222509781Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 13 00:33:49.223445 containerd[1576]: time="2025-08-13T00:33:49.223398768Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:49.225028 containerd[1576]: time="2025-08-13T00:33:49.224985773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:49.225706 containerd[1576]: time="2025-08-13T00:33:49.225325791Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 4.190130941s" Aug 13 00:33:49.225706 containerd[1576]: time="2025-08-13T00:33:49.225378720Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference 
\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 00:33:49.226610 containerd[1576]: time="2025-08-13T00:33:49.226581506Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 13 00:33:49.228107 containerd[1576]: time="2025-08-13T00:33:49.228077390Z" level=info msg="CreateContainer within sandbox \"0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:33:49.241367 containerd[1576]: time="2025-08-13T00:33:49.240708225Z" level=info msg="Container 1690efec22f167240167f851b4a5401bd8eb8a6a690830dd5b01c9603249f0a5: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:49.247530 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3657645060.mount: Deactivated successfully. Aug 13 00:33:49.255932 systemd-networkd[1480]: calib3e8f178733: Gained IPv6LL Aug 13 00:33:49.261280 containerd[1576]: time="2025-08-13T00:33:49.261241355Z" level=info msg="CreateContainer within sandbox \"0a082316060e2002e337b827d512c6fb0b08356c702bb66a4c733170c338b7f4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1690efec22f167240167f851b4a5401bd8eb8a6a690830dd5b01c9603249f0a5\"" Aug 13 00:33:49.261945 containerd[1576]: time="2025-08-13T00:33:49.261914798Z" level=info msg="StartContainer for \"1690efec22f167240167f851b4a5401bd8eb8a6a690830dd5b01c9603249f0a5\"" Aug 13 00:33:49.263252 containerd[1576]: time="2025-08-13T00:33:49.263196220Z" level=info msg="connecting to shim 1690efec22f167240167f851b4a5401bd8eb8a6a690830dd5b01c9603249f0a5" address="unix:///run/containerd/s/9604d955b2be2a42d3bc9cbb52f854656ccc60fb08c44f11d3d21462e64dddc1" protocol=ttrpc version=3 Aug 13 00:33:49.285477 systemd[1]: Started cri-containerd-1690efec22f167240167f851b4a5401bd8eb8a6a690830dd5b01c9603249f0a5.scope - libcontainer container 1690efec22f167240167f851b4a5401bd8eb8a6a690830dd5b01c9603249f0a5. 
Aug 13 00:33:49.328919 containerd[1576]: time="2025-08-13T00:33:49.328870656Z" level=info msg="StartContainer for \"1690efec22f167240167f851b4a5401bd8eb8a6a690830dd5b01c9603249f0a5\" returns successfully" Aug 13 00:33:49.703972 systemd-networkd[1480]: calidbac8a1dd83: Gained IPv6LL Aug 13 00:33:50.988983 kubelet[2741]: I0813 00:33:50.988910 2741 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:33:51.504428 containerd[1576]: time="2025-08-13T00:33:51.504373790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:51.505401 containerd[1576]: time="2025-08-13T00:33:51.505371090Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Aug 13 00:33:51.506438 containerd[1576]: time="2025-08-13T00:33:51.506416320Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:51.508277 containerd[1576]: time="2025-08-13T00:33:51.508236303Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:51.508941 containerd[1576]: time="2025-08-13T00:33:51.508702627Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.282091987s" Aug 13 00:33:51.508941 containerd[1576]: time="2025-08-13T00:33:51.508731381Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Aug 13 00:33:51.510669 containerd[1576]: time="2025-08-13T00:33:51.510555732Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:33:51.511504 containerd[1576]: time="2025-08-13T00:33:51.511458475Z" level=info msg="CreateContainer within sandbox \"5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 13 00:33:51.521284 containerd[1576]: time="2025-08-13T00:33:51.520511320Z" level=info msg="Container ea827d27434ef32115ca27e7652a1b546b352bdcc649cfdf95f3d974dca2d6f5: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:51.524933 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2746467655.mount: Deactivated successfully. Aug 13 00:33:51.543636 containerd[1576]: time="2025-08-13T00:33:51.543600192Z" level=info msg="CreateContainer within sandbox \"5d42afa26b8b9b8b5ef27fc30eb8e7dfb567687469290d3e087962328853b455\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"ea827d27434ef32115ca27e7652a1b546b352bdcc649cfdf95f3d974dca2d6f5\"" Aug 13 00:33:51.544379 containerd[1576]: time="2025-08-13T00:33:51.544268295Z" level=info msg="StartContainer for \"ea827d27434ef32115ca27e7652a1b546b352bdcc649cfdf95f3d974dca2d6f5\"" Aug 13 00:33:51.545763 containerd[1576]: time="2025-08-13T00:33:51.545732661Z" level=info msg="connecting to shim ea827d27434ef32115ca27e7652a1b546b352bdcc649cfdf95f3d974dca2d6f5" address="unix:///run/containerd/s/cdabbe9fcd395babcecbc324b6b47430fe11c634906e6c3dac436cf3732076fa" protocol=ttrpc version=3 Aug 13 00:33:51.566496 systemd[1]: Started cri-containerd-ea827d27434ef32115ca27e7652a1b546b352bdcc649cfdf95f3d974dca2d6f5.scope - libcontainer container ea827d27434ef32115ca27e7652a1b546b352bdcc649cfdf95f3d974dca2d6f5. 
Aug 13 00:33:51.600280 containerd[1576]: time="2025-08-13T00:33:51.600234407Z" level=info msg="StartContainer for \"ea827d27434ef32115ca27e7652a1b546b352bdcc649cfdf95f3d974dca2d6f5\" returns successfully" Aug 13 00:33:51.851413 kubelet[2741]: I0813 00:33:51.851191 2741 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 13 00:33:51.854441 kubelet[2741]: I0813 00:33:51.854410 2741 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 13 00:33:52.047300 kubelet[2741]: I0813 00:33:52.041827 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c7f84bf9d-rgxvb" podStartSLOduration=27.867572234 podStartE2EDuration="33.041780571s" podCreationTimestamp="2025-08-13 00:33:19 +0000 UTC" firstStartedPulling="2025-08-13 00:33:44.051783433 +0000 UTC m=+39.568368400" lastFinishedPulling="2025-08-13 00:33:49.22599177 +0000 UTC m=+44.742576737" observedRunningTime="2025-08-13 00:33:49.964112185 +0000 UTC m=+45.480697151" watchObservedRunningTime="2025-08-13 00:33:52.041780571 +0000 UTC m=+47.558365559" Aug 13 00:33:52.050188 kubelet[2741]: I0813 00:33:52.048625 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-r465b" podStartSLOduration=20.718237729 podStartE2EDuration="29.048606742s" podCreationTimestamp="2025-08-13 00:33:23 +0000 UTC" firstStartedPulling="2025-08-13 00:33:43.179050999 +0000 UTC m=+38.695635976" lastFinishedPulling="2025-08-13 00:33:51.509419992 +0000 UTC m=+47.026004989" observedRunningTime="2025-08-13 00:33:52.047037649 +0000 UTC m=+47.563622636" watchObservedRunningTime="2025-08-13 00:33:52.048606742 +0000 UTC m=+47.565191728" Aug 13 00:33:52.069629 containerd[1576]: time="2025-08-13T00:33:52.069567545Z" level=info 
msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:52.071840 containerd[1576]: time="2025-08-13T00:33:52.071773291Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 13 00:33:52.073576 containerd[1576]: time="2025-08-13T00:33:52.073441058Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 562.671065ms" Aug 13 00:33:52.073576 containerd[1576]: time="2025-08-13T00:33:52.073494849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 00:33:52.075167 containerd[1576]: time="2025-08-13T00:33:52.074944707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 13 00:33:52.079168 containerd[1576]: time="2025-08-13T00:33:52.079120387Z" level=info msg="CreateContainer within sandbox \"e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:33:52.093028 containerd[1576]: time="2025-08-13T00:33:52.092115365Z" level=info msg="Container fcb8be85c9a7872ee9848395c3220fa8cd518b2a89a1746481b232b6c9ccfaf1: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:52.113752 containerd[1576]: time="2025-08-13T00:33:52.113528054Z" level=info msg="CreateContainer within sandbox \"e95f3ead89f279da072ec73acd9aea5c25feea969d407c08967d78c6c22d35e6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"fcb8be85c9a7872ee9848395c3220fa8cd518b2a89a1746481b232b6c9ccfaf1\"" Aug 13 00:33:52.116300 containerd[1576]: time="2025-08-13T00:33:52.116216135Z" level=info msg="StartContainer for \"fcb8be85c9a7872ee9848395c3220fa8cd518b2a89a1746481b232b6c9ccfaf1\"" Aug 13 00:33:52.118391 containerd[1576]: time="2025-08-13T00:33:52.118318698Z" level=info msg="connecting to shim fcb8be85c9a7872ee9848395c3220fa8cd518b2a89a1746481b232b6c9ccfaf1" address="unix:///run/containerd/s/c6f9d9f7460a318ae94ada51e1bf92d6ddf3d25a436cbd2a928aa1bdc2d8ec46" protocol=ttrpc version=3 Aug 13 00:33:52.150526 systemd[1]: Started cri-containerd-fcb8be85c9a7872ee9848395c3220fa8cd518b2a89a1746481b232b6c9ccfaf1.scope - libcontainer container fcb8be85c9a7872ee9848395c3220fa8cd518b2a89a1746481b232b6c9ccfaf1. Aug 13 00:33:52.209581 containerd[1576]: time="2025-08-13T00:33:52.209541037Z" level=info msg="StartContainer for \"fcb8be85c9a7872ee9848395c3220fa8cd518b2a89a1746481b232b6c9ccfaf1\" returns successfully" Aug 13 00:33:52.826374 kubelet[2741]: I0813 00:33:52.824844 2741 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:33:53.037266 kubelet[2741]: I0813 00:33:53.037009 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c7f84bf9d-8qmvb" podStartSLOduration=28.910244559 podStartE2EDuration="34.036992327s" podCreationTimestamp="2025-08-13 00:33:19 +0000 UTC" firstStartedPulling="2025-08-13 00:33:46.947683797 +0000 UTC m=+42.464268764" lastFinishedPulling="2025-08-13 00:33:52.074431555 +0000 UTC m=+47.591016532" observedRunningTime="2025-08-13 00:33:53.035573677 +0000 UTC m=+48.552158644" watchObservedRunningTime="2025-08-13 00:33:53.036992327 +0000 UTC m=+48.553577294" Aug 13 00:33:54.026409 kubelet[2741]: I0813 00:33:54.026374 2741 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:33:54.219576 systemd-networkd[1480]: vxlan.calico: Link UP Aug 13 00:33:54.219581 
systemd-networkd[1480]: vxlan.calico: Gained carrier Aug 13 00:33:55.592470 systemd-networkd[1480]: vxlan.calico: Gained IPv6LL Aug 13 00:33:56.981447 containerd[1576]: time="2025-08-13T00:33:56.981400450Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:56.985920 containerd[1576]: time="2025-08-13T00:33:56.985773410Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 13 00:33:56.990551 containerd[1576]: time="2025-08-13T00:33:56.990523566Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:56.992396 containerd[1576]: time="2025-08-13T00:33:56.992375639Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:33:56.994588 containerd[1576]: time="2025-08-13T00:33:56.993030247Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 4.917784986s" Aug 13 00:33:56.994588 containerd[1576]: time="2025-08-13T00:33:56.993057859Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 13 00:33:57.007217 containerd[1576]: time="2025-08-13T00:33:57.007187422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 
13 00:33:57.079665 containerd[1576]: time="2025-08-13T00:33:57.078085644Z" level=info msg="CreateContainer within sandbox \"2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 13 00:33:57.100153 containerd[1576]: time="2025-08-13T00:33:57.100120790Z" level=info msg="Container 1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:33:57.105043 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount105194477.mount: Deactivated successfully. Aug 13 00:33:57.116365 containerd[1576]: time="2025-08-13T00:33:57.116292394Z" level=info msg="CreateContainer within sandbox \"2301927d278528845cc8678964f45bd035b4ea62d7b90170520b4df574654ec0\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\"" Aug 13 00:33:57.117324 containerd[1576]: time="2025-08-13T00:33:57.117217178Z" level=info msg="StartContainer for \"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\"" Aug 13 00:33:57.118233 containerd[1576]: time="2025-08-13T00:33:57.118206914Z" level=info msg="connecting to shim 1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184" address="unix:///run/containerd/s/6bfdd3cff5eda9b5dd17e4f979b988516c1059b3aad6ea1edd049d3f0989113f" protocol=ttrpc version=3 Aug 13 00:33:57.208500 systemd[1]: Started cri-containerd-1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184.scope - libcontainer container 1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184. 
Aug 13 00:33:57.315678 containerd[1576]: time="2025-08-13T00:33:57.315534911Z" level=info msg="StartContainer for \"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" returns successfully" Aug 13 00:33:58.354583 containerd[1576]: time="2025-08-13T00:33:58.354531823Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" id:\"c5d9d36f8c6e4c488c3a8b4bbe1c5253d31846e0a76aedc5754e20c6ed04dec2\" pid:5347 exited_at:{seconds:1755045238 nanos:242669481}" Aug 13 00:33:58.360387 kubelet[2741]: I0813 00:33:58.360255 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-74d8bfcddc-n9p85" podStartSLOduration=26.217871387 podStartE2EDuration="35.36023694s" podCreationTimestamp="2025-08-13 00:33:23 +0000 UTC" firstStartedPulling="2025-08-13 00:33:47.864606046 +0000 UTC m=+43.381191012" lastFinishedPulling="2025-08-13 00:33:57.006971597 +0000 UTC m=+52.523556565" observedRunningTime="2025-08-13 00:33:58.173814852 +0000 UTC m=+53.690399839" watchObservedRunningTime="2025-08-13 00:33:58.36023694 +0000 UTC m=+53.876821908" Aug 13 00:34:00.062374 containerd[1576]: time="2025-08-13T00:34:00.062166752Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" id:\"e59e53825f3906a59fe7b2213246b0ee18557ad7a140ad9b3b4e23dd44c82eb2\" pid:5380 exited_at:{seconds:1755045240 nanos:61721307}" Aug 13 00:34:00.203192 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3508927800.mount: Deactivated successfully. 
Aug 13 00:34:00.606250 containerd[1576]: time="2025-08-13T00:34:00.606199252Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:00.607447 containerd[1576]: time="2025-08-13T00:34:00.607421854Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 13 00:34:00.608435 containerd[1576]: time="2025-08-13T00:34:00.608405569Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:00.610553 containerd[1576]: time="2025-08-13T00:34:00.610518200Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:00.613154 containerd[1576]: time="2025-08-13T00:34:00.613120730Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.603721121s" Aug 13 00:34:00.613154 containerd[1576]: time="2025-08-13T00:34:00.613149164Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 13 00:34:00.661283 containerd[1576]: time="2025-08-13T00:34:00.660904610Z" level=info msg="CreateContainer within sandbox \"c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 13 00:34:00.739135 containerd[1576]: time="2025-08-13T00:34:00.739090205Z" 
level=info msg="Container ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:34:00.752297 containerd[1576]: time="2025-08-13T00:34:00.752264598Z" level=info msg="CreateContainer within sandbox \"c35e50aeb0450a9ac7418c36c7c4ee1b2018561a51e9fa0a98854793561682d9\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\"" Aug 13 00:34:00.756384 containerd[1576]: time="2025-08-13T00:34:00.756341011Z" level=info msg="StartContainer for \"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\"" Aug 13 00:34:00.758315 containerd[1576]: time="2025-08-13T00:34:00.758282161Z" level=info msg="connecting to shim ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124" address="unix:///run/containerd/s/aaef7e9ee0d17c66b2efe82787209f44deb9d1fad020ce399e23a0a30b9309a7" protocol=ttrpc version=3 Aug 13 00:34:00.850322 systemd[1]: Started cri-containerd-ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124.scope - libcontainer container ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124. 
Aug 13 00:34:00.909590 containerd[1576]: time="2025-08-13T00:34:00.909558488Z" level=info msg="StartContainer for \"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" returns successfully"
Aug 13 00:34:01.151618 kubelet[2741]: I0813 00:34:01.151545 2741 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-wrddn" podStartSLOduration=26.445339173 podStartE2EDuration="39.117285125s" podCreationTimestamp="2025-08-13 00:33:22 +0000 UTC" firstStartedPulling="2025-08-13 00:33:47.950022378 +0000 UTC m=+43.466607345" lastFinishedPulling="2025-08-13 00:34:00.62196832 +0000 UTC m=+56.138553297" observedRunningTime="2025-08-13 00:34:01.116686593 +0000 UTC m=+56.633271570" watchObservedRunningTime="2025-08-13 00:34:01.117285125 +0000 UTC m=+56.633870092"
Aug 13 00:34:01.274220 containerd[1576]: time="2025-08-13T00:34:01.273914348Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" id:\"c602031a1eb8877dcfd666d26f2313b118714fd77ae9de7d657f64524719f535\" pid:5443 exited_at:{seconds:1755045241 nanos:244664903}"
Aug 13 00:34:12.079237 containerd[1576]: time="2025-08-13T00:34:12.079118115Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" id:\"7002d15ae29ee5b0bfabfd87fa2623019c94d6ecba0520d01aa52c0de0530c1a\" pid:5482 exited_at:{seconds:1755045252 nanos:78443139}"
Aug 13 00:34:23.698701 kubelet[2741]: I0813 00:34:23.698638 2741 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 13 00:34:26.119763 kubelet[2741]: I0813 00:34:26.119562 2741 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 13 00:34:26.888254 containerd[1576]: time="2025-08-13T00:34:26.880140003Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" id:\"bff3d71f004d362c3afaee091cc1c317a04cf0bcd0eb2fbb5ef0d89e431ad93d\" pid:5520 exited_at:{seconds:1755045266 nanos:879813747}"
Aug 13 00:34:27.996855 containerd[1576]: time="2025-08-13T00:34:27.996815775Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" id:\"401b48cb03230bc6755a13695decaca7ef6fe645faa7c9b5104fbaee728b145b\" pid:5542 exited_at:{seconds:1755045267 nanos:996437901}"
Aug 13 00:34:30.102948 containerd[1576]: time="2025-08-13T00:34:30.102782290Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" id:\"75b701eb8df2660016da290e5bab5ef3e31502a2457c848c5fe7c497bc3ae17a\" pid:5563 exited_at:{seconds:1755045270 nanos:101429218}"
Aug 13 00:34:40.289651 systemd[1]: Started sshd@8-95.217.135.102:22-139.178.89.65:51030.service - OpenSSH per-connection server daemon (139.178.89.65:51030).
Aug 13 00:34:41.346427 sshd[5588]: Accepted publickey for core from 139.178.89.65 port 51030 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:34:41.350935 sshd-session[5588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:34:41.366336 systemd-logind[1532]: New session 8 of user core.
Aug 13 00:34:41.371489 systemd[1]: Started session-8.scope - Session 8 of User core.
Aug 13 00:34:41.880636 containerd[1576]: time="2025-08-13T00:34:41.880591678Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" id:\"36b3a03718b25e277af0dbcc2cf45753d0c701e6b980c295a28547dbc75099c0\" pid:5604 exited_at:{seconds:1755045281 nanos:880164013}"
Aug 13 00:34:42.559258 sshd[5592]: Connection closed by 139.178.89.65 port 51030
Aug 13 00:34:42.560726 sshd-session[5588]: pam_unix(sshd:session): session closed for user core
Aug 13 00:34:42.566427 systemd[1]: sshd@8-95.217.135.102:22-139.178.89.65:51030.service: Deactivated successfully.
Aug 13 00:34:42.569848 systemd[1]: session-8.scope: Deactivated successfully.
Aug 13 00:34:42.570862 systemd-logind[1532]: Session 8 logged out. Waiting for processes to exit.
Aug 13 00:34:42.572605 systemd-logind[1532]: Removed session 8.
Aug 13 00:34:43.016538 containerd[1576]: time="2025-08-13T00:34:43.016495356Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" id:\"27b855c4faf6f3f4d297ef372f1e5b60a5eb1cde3efd194ec5a96fb5d061f72e\" pid:5637 exited_at:{seconds:1755045283 nanos:16238633}"
Aug 13 00:34:47.737003 systemd[1]: Started sshd@9-95.217.135.102:22-139.178.89.65:51046.service - OpenSSH per-connection server daemon (139.178.89.65:51046).
Aug 13 00:34:48.814482 sshd[5649]: Accepted publickey for core from 139.178.89.65 port 51046 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:34:48.819193 sshd-session[5649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:34:48.837953 systemd-logind[1532]: New session 9 of user core.
Aug 13 00:34:48.844939 systemd[1]: Started session-9.scope - Session 9 of User core.
Aug 13 00:34:49.828957 sshd[5651]: Connection closed by 139.178.89.65 port 51046
Aug 13 00:34:49.833502 sshd-session[5649]: pam_unix(sshd:session): session closed for user core
Aug 13 00:34:49.843157 systemd[1]: sshd@9-95.217.135.102:22-139.178.89.65:51046.service: Deactivated successfully.
Aug 13 00:34:49.843648 systemd-logind[1532]: Session 9 logged out. Waiting for processes to exit.
Aug 13 00:34:49.849160 systemd[1]: session-9.scope: Deactivated successfully.
Aug 13 00:34:49.854592 systemd-logind[1532]: Removed session 9.
Aug 13 00:34:49.997080 systemd[1]: Started sshd@10-95.217.135.102:22-139.178.89.65:57530.service - OpenSSH per-connection server daemon (139.178.89.65:57530).
Aug 13 00:34:50.996969 sshd[5664]: Accepted publickey for core from 139.178.89.65 port 57530 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:34:50.998676 sshd-session[5664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:34:51.004232 systemd-logind[1532]: New session 10 of user core.
Aug 13 00:34:51.013462 systemd[1]: Started session-10.scope - Session 10 of User core.
Aug 13 00:34:51.770275 sshd[5666]: Connection closed by 139.178.89.65 port 57530
Aug 13 00:34:51.772476 sshd-session[5664]: pam_unix(sshd:session): session closed for user core
Aug 13 00:34:51.775030 systemd-logind[1532]: Session 10 logged out. Waiting for processes to exit.
Aug 13 00:34:51.777829 systemd[1]: sshd@10-95.217.135.102:22-139.178.89.65:57530.service: Deactivated successfully.
Aug 13 00:34:51.779162 systemd[1]: session-10.scope: Deactivated successfully.
Aug 13 00:34:51.781177 systemd-logind[1532]: Removed session 10.
Aug 13 00:34:51.936384 systemd[1]: Started sshd@11-95.217.135.102:22-139.178.89.65:57546.service - OpenSSH per-connection server daemon (139.178.89.65:57546).
Aug 13 00:34:52.918807 sshd[5676]: Accepted publickey for core from 139.178.89.65 port 57546 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:34:52.920156 sshd-session[5676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:34:52.925205 systemd-logind[1532]: New session 11 of user core.
Aug 13 00:34:52.932509 systemd[1]: Started session-11.scope - Session 11 of User core.
Aug 13 00:34:53.683816 sshd[5678]: Connection closed by 139.178.89.65 port 57546
Aug 13 00:34:53.684716 sshd-session[5676]: pam_unix(sshd:session): session closed for user core
Aug 13 00:34:53.690452 systemd[1]: sshd@11-95.217.135.102:22-139.178.89.65:57546.service: Deactivated successfully.
Aug 13 00:34:53.690728 systemd-logind[1532]: Session 11 logged out. Waiting for processes to exit.
Aug 13 00:34:53.696161 systemd[1]: session-11.scope: Deactivated successfully.
Aug 13 00:34:53.700751 systemd-logind[1532]: Removed session 11.
Aug 13 00:34:58.009541 containerd[1576]: time="2025-08-13T00:34:58.009493392Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" id:\"4ec0b0bb4c7fc9c1c5fdcd980ed6b8c1c9958c45061c2b62f78fb2eeb35d7fb0\" pid:5705 exited_at:{seconds:1755045298 nanos:2753558}"
Aug 13 00:34:58.855607 systemd[1]: Started sshd@12-95.217.135.102:22-139.178.89.65:57550.service - OpenSSH per-connection server daemon (139.178.89.65:57550).
Aug 13 00:34:59.850055 sshd[5716]: Accepted publickey for core from 139.178.89.65 port 57550 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:34:59.851709 sshd-session[5716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:34:59.856566 systemd-logind[1532]: New session 12 of user core.
Aug 13 00:34:59.859495 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 13 00:35:00.036769 containerd[1576]: time="2025-08-13T00:35:00.036581276Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" id:\"dbebb2831eaa053f0bcf5515de336706aac6d4f6eb38ce0ef42863b7e87d1942\" pid:5732 exited_at:{seconds:1755045300 nanos:36230276}"
Aug 13 00:35:00.624482 sshd[5718]: Connection closed by 139.178.89.65 port 57550
Aug 13 00:35:00.625731 sshd-session[5716]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:00.631239 systemd[1]: sshd@12-95.217.135.102:22-139.178.89.65:57550.service: Deactivated successfully.
Aug 13 00:35:00.633120 systemd[1]: session-12.scope: Deactivated successfully.
Aug 13 00:35:00.634219 systemd-logind[1532]: Session 12 logged out. Waiting for processes to exit.
Aug 13 00:35:00.635415 systemd-logind[1532]: Removed session 12.
Aug 13 00:35:05.793435 systemd[1]: Started sshd@13-95.217.135.102:22-139.178.89.65:44618.service - OpenSSH per-connection server daemon (139.178.89.65:44618).
Aug 13 00:35:06.820466 sshd[5757]: Accepted publickey for core from 139.178.89.65 port 44618 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:06.822029 sshd-session[5757]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:06.827952 systemd-logind[1532]: New session 13 of user core.
Aug 13 00:35:06.835772 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 13 00:35:07.630190 sshd[5759]: Connection closed by 139.178.89.65 port 44618
Aug 13 00:35:07.633700 sshd-session[5757]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:07.641535 systemd[1]: sshd@13-95.217.135.102:22-139.178.89.65:44618.service: Deactivated successfully.
Aug 13 00:35:07.645156 systemd[1]: session-13.scope: Deactivated successfully.
Aug 13 00:35:07.647074 systemd-logind[1532]: Session 13 logged out. Waiting for processes to exit.
Aug 13 00:35:07.648781 systemd-logind[1532]: Removed session 13.
Aug 13 00:35:11.868167 containerd[1576]: time="2025-08-13T00:35:11.868128104Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" id:\"f4ce7c345650ad2878ae81768594ee3de744c8f47b861f778ed72cb6df9f94b3\" pid:5784 exited_at:{seconds:1755045311 nanos:867478042}"
Aug 13 00:35:12.797683 systemd[1]: Started sshd@14-95.217.135.102:22-139.178.89.65:57836.service - OpenSSH per-connection server daemon (139.178.89.65:57836).
Aug 13 00:35:13.807630 sshd[5796]: Accepted publickey for core from 139.178.89.65 port 57836 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:13.810274 sshd-session[5796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:13.820476 systemd-logind[1532]: New session 14 of user core.
Aug 13 00:35:13.828598 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 13 00:35:14.633755 sshd[5798]: Connection closed by 139.178.89.65 port 57836
Aug 13 00:35:14.635547 sshd-session[5796]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:14.639538 systemd-logind[1532]: Session 14 logged out. Waiting for processes to exit.
Aug 13 00:35:14.640246 systemd[1]: sshd@14-95.217.135.102:22-139.178.89.65:57836.service: Deactivated successfully.
Aug 13 00:35:14.643284 systemd[1]: session-14.scope: Deactivated successfully.
Aug 13 00:35:14.646593 systemd-logind[1532]: Removed session 14.
Aug 13 00:35:14.804041 systemd[1]: Started sshd@15-95.217.135.102:22-139.178.89.65:57850.service - OpenSSH per-connection server daemon (139.178.89.65:57850).
Aug 13 00:35:15.781494 sshd[5818]: Accepted publickey for core from 139.178.89.65 port 57850 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:15.785121 sshd-session[5818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:15.796688 systemd-logind[1532]: New session 15 of user core.
Aug 13 00:35:15.805685 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 13 00:35:16.753684 sshd[5820]: Connection closed by 139.178.89.65 port 57850
Aug 13 00:35:16.758283 sshd-session[5818]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:16.762115 systemd-logind[1532]: Session 15 logged out. Waiting for processes to exit.
Aug 13 00:35:16.762990 systemd[1]: sshd@15-95.217.135.102:22-139.178.89.65:57850.service: Deactivated successfully.
Aug 13 00:35:16.765276 systemd[1]: session-15.scope: Deactivated successfully.
Aug 13 00:35:16.769150 systemd-logind[1532]: Removed session 15.
Aug 13 00:35:16.920024 systemd[1]: Started sshd@16-95.217.135.102:22-139.178.89.65:57854.service - OpenSSH per-connection server daemon (139.178.89.65:57854).
Aug 13 00:35:17.937842 sshd[5830]: Accepted publickey for core from 139.178.89.65 port 57854 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:17.939162 sshd-session[5830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:17.945395 systemd-logind[1532]: New session 16 of user core.
Aug 13 00:35:17.951724 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 13 00:35:20.477174 sshd[5832]: Connection closed by 139.178.89.65 port 57854
Aug 13 00:35:20.493327 sshd-session[5830]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:20.520684 systemd[1]: sshd@16-95.217.135.102:22-139.178.89.65:57854.service: Deactivated successfully.
Aug 13 00:35:20.522239 systemd[1]: session-16.scope: Deactivated successfully.
Aug 13 00:35:20.526205 systemd[1]: session-16.scope: Consumed 503ms CPU time, 69.2M memory peak.
Aug 13 00:35:20.529213 systemd-logind[1532]: Session 16 logged out. Waiting for processes to exit.
Aug 13 00:35:20.535283 systemd-logind[1532]: Removed session 16.
Aug 13 00:35:20.646981 systemd[1]: Started sshd@17-95.217.135.102:22-139.178.89.65:50332.service - OpenSSH per-connection server daemon (139.178.89.65:50332).
Aug 13 00:35:21.674018 sshd[5850]: Accepted publickey for core from 139.178.89.65 port 50332 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:21.678137 sshd-session[5850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:21.685076 systemd-logind[1532]: New session 17 of user core.
Aug 13 00:35:21.690495 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 13 00:35:23.061473 sshd[5852]: Connection closed by 139.178.89.65 port 50332
Aug 13 00:35:23.062041 sshd-session[5850]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:23.066015 systemd[1]: sshd@17-95.217.135.102:22-139.178.89.65:50332.service: Deactivated successfully.
Aug 13 00:35:23.068028 systemd[1]: session-17.scope: Deactivated successfully.
Aug 13 00:35:23.069304 systemd-logind[1532]: Session 17 logged out. Waiting for processes to exit.
Aug 13 00:35:23.071174 systemd-logind[1532]: Removed session 17.
Aug 13 00:35:23.232715 systemd[1]: Started sshd@18-95.217.135.102:22-139.178.89.65:50334.service - OpenSSH per-connection server daemon (139.178.89.65:50334).
Aug 13 00:35:24.245849 sshd[5862]: Accepted publickey for core from 139.178.89.65 port 50334 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:24.248397 sshd-session[5862]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:24.256403 systemd-logind[1532]: New session 18 of user core.
Aug 13 00:35:24.264581 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 13 00:35:25.523759 sshd[5864]: Connection closed by 139.178.89.65 port 50334
Aug 13 00:35:25.531196 sshd-session[5862]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:25.545959 systemd[1]: sshd@18-95.217.135.102:22-139.178.89.65:50334.service: Deactivated successfully.
Aug 13 00:35:25.548425 systemd[1]: session-18.scope: Deactivated successfully.
Aug 13 00:35:25.556667 systemd-logind[1532]: Session 18 logged out. Waiting for processes to exit.
Aug 13 00:35:25.560267 systemd-logind[1532]: Removed session 18.
Aug 13 00:35:27.112989 containerd[1576]: time="2025-08-13T00:35:27.108105905Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" id:\"1ea74352e7864a189a6d6c29ff6646463766f70e47cf8ef6a984e45df2c9adbb\" pid:5906 exited_at:{seconds:1755045327 nanos:66428358}"
Aug 13 00:35:27.973464 containerd[1576]: time="2025-08-13T00:35:27.973410950Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" id:\"6bbdbd004dc0f08c3e1f534c3daf47aa259b02a6a8d81953fd33040c994baf02\" pid:5928 exited_at:{seconds:1755045327 nanos:973186498}"
Aug 13 00:35:30.268517 containerd[1576]: time="2025-08-13T00:35:30.268439414Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" id:\"6f514d786d195fab3c883806d49546fb3672dffc1fd5b3508cb12ee3910ec983\" pid:5949 exited_at:{seconds:1755045330 nanos:268034123}"
Aug 13 00:35:30.726085 systemd[1]: Started sshd@19-95.217.135.102:22-139.178.89.65:53274.service - OpenSSH per-connection server daemon (139.178.89.65:53274).
Aug 13 00:35:31.752864 sshd[5967]: Accepted publickey for core from 139.178.89.65 port 53274 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:31.755180 sshd-session[5967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:31.761210 systemd-logind[1532]: New session 19 of user core.
Aug 13 00:35:31.765488 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 13 00:35:33.016212 sshd[5976]: Connection closed by 139.178.89.65 port 53274
Aug 13 00:35:33.016835 sshd-session[5967]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:33.020627 systemd-logind[1532]: Session 19 logged out. Waiting for processes to exit.
Aug 13 00:35:33.024640 systemd[1]: sshd@19-95.217.135.102:22-139.178.89.65:53274.service: Deactivated successfully.
Aug 13 00:35:33.027256 systemd[1]: session-19.scope: Deactivated successfully.
Aug 13 00:35:33.028739 systemd-logind[1532]: Removed session 19.
Aug 13 00:35:38.190454 systemd[1]: Started sshd@20-95.217.135.102:22-139.178.89.65:53280.service - OpenSSH per-connection server daemon (139.178.89.65:53280).
Aug 13 00:35:39.260384 sshd[5988]: Accepted publickey for core from 139.178.89.65 port 53280 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:35:39.262408 sshd-session[5988]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:35:39.268928 systemd-logind[1532]: New session 20 of user core.
Aug 13 00:35:39.272475 systemd[1]: Started session-20.scope - Session 20 of User core.
Aug 13 00:35:40.359587 sshd[5990]: Connection closed by 139.178.89.65 port 53280
Aug 13 00:35:40.360187 sshd-session[5988]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:40.363247 systemd[1]: sshd@20-95.217.135.102:22-139.178.89.65:53280.service: Deactivated successfully.
Aug 13 00:35:40.365242 systemd[1]: session-20.scope: Deactivated successfully.
Aug 13 00:35:40.367324 systemd-logind[1532]: Session 20 logged out. Waiting for processes to exit.
Aug 13 00:35:40.368743 systemd-logind[1532]: Removed session 20.
Aug 13 00:35:42.201202 containerd[1576]: time="2025-08-13T00:35:42.198050058Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" id:\"93e3fe0a5a1af03bc2a54bf63e53e1a0fe1c0f28f7bec32f685be78ceb6fe721\" pid:6016 exited_at:{seconds:1755045342 nanos:112917127}"
Aug 13 00:35:43.043625 containerd[1576]: time="2025-08-13T00:35:43.043534359Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ddd5249fed2e9358e755e18ee665ed37cd476143c91694464c0297fb8b29b124\" id:\"6620ca8c464df3c0e89ec9fdb81f2746494cb4c7cac51f393ba908d36bc4eaa8\" pid:6041 exited_at:{seconds:1755045343 nanos:43091606}"
Aug 13 00:35:55.658754 systemd[1]: cri-containerd-6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c.scope: Deactivated successfully.
Aug 13 00:35:55.659041 systemd[1]: cri-containerd-6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c.scope: Consumed 15.214s CPU time, 111.5M memory peak, 80.2M read from disk.
Aug 13 00:35:55.710424 containerd[1576]: time="2025-08-13T00:35:55.710386622Z" level=info msg="received exit event container_id:\"6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c\" id:\"6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c\" pid:3074 exit_status:1 exited_at:{seconds:1755045355 nanos:694578116}"
Aug 13 00:35:55.713589 containerd[1576]: time="2025-08-13T00:35:55.713532417Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c\" id:\"6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c\" pid:3074 exit_status:1 exited_at:{seconds:1755045355 nanos:694578116}"
Aug 13 00:35:55.776765 kubelet[2741]: E0813 00:35:55.766521 2741 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:48008->10.0.0.2:2379: read: connection timed out"
Aug 13 00:35:55.814878 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c-rootfs.mount: Deactivated successfully.
Aug 13 00:35:56.140618 kubelet[2741]: I0813 00:35:56.137259 2741 scope.go:117] "RemoveContainer" containerID="6b234230b7db09100b74c90a8c48c326fbbdc64ffa4520b91eb183b206210a1c"
Aug 13 00:35:56.237363 containerd[1576]: time="2025-08-13T00:35:56.237313027Z" level=info msg="CreateContainer within sandbox \"14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Aug 13 00:35:56.310305 containerd[1576]: time="2025-08-13T00:35:56.310257829Z" level=info msg="Container 9d1af040200a6780d437bfa1f119139df205e3ccb7ff09d321c10d0f5eedcbbc: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:56.316094 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2778772715.mount: Deactivated successfully.
Aug 13 00:35:56.325787 containerd[1576]: time="2025-08-13T00:35:56.325736876Z" level=info msg="CreateContainer within sandbox \"14dfb0563cfb7323b71775626cee118113f676bec1b312c4eaa2d4005ec3ebde\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"9d1af040200a6780d437bfa1f119139df205e3ccb7ff09d321c10d0f5eedcbbc\""
Aug 13 00:35:56.326301 containerd[1576]: time="2025-08-13T00:35:56.326262352Z" level=info msg="StartContainer for \"9d1af040200a6780d437bfa1f119139df205e3ccb7ff09d321c10d0f5eedcbbc\""
Aug 13 00:35:56.327095 containerd[1576]: time="2025-08-13T00:35:56.327064038Z" level=info msg="connecting to shim 9d1af040200a6780d437bfa1f119139df205e3ccb7ff09d321c10d0f5eedcbbc" address="unix:///run/containerd/s/fdf26b845c4c3c7999b60337cc6102bb73ce512396db6e7807bc5f6a7d8d15da" protocol=ttrpc version=3
Aug 13 00:35:56.386489 systemd[1]: Started cri-containerd-9d1af040200a6780d437bfa1f119139df205e3ccb7ff09d321c10d0f5eedcbbc.scope - libcontainer container 9d1af040200a6780d437bfa1f119139df205e3ccb7ff09d321c10d0f5eedcbbc.
Aug 13 00:35:56.439869 containerd[1576]: time="2025-08-13T00:35:56.439754490Z" level=info msg="StartContainer for \"9d1af040200a6780d437bfa1f119139df205e3ccb7ff09d321c10d0f5eedcbbc\" returns successfully"
Aug 13 00:35:57.044483 systemd[1]: cri-containerd-4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152.scope: Deactivated successfully.
Aug 13 00:35:57.044791 systemd[1]: cri-containerd-4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152.scope: Consumed 2.701s CPU time, 86.7M memory peak, 119M read from disk.
Aug 13 00:35:57.052758 containerd[1576]: time="2025-08-13T00:35:57.052720293Z" level=info msg="received exit event container_id:\"4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152\" id:\"4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152\" pid:2598 exit_status:1 exited_at:{seconds:1755045357 nanos:51573600}"
Aug 13 00:35:57.054110 containerd[1576]: time="2025-08-13T00:35:57.054060147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152\" id:\"4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152\" pid:2598 exit_status:1 exited_at:{seconds:1755045357 nanos:51573600}"
Aug 13 00:35:57.080299 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152-rootfs.mount: Deactivated successfully.
Aug 13 00:35:57.125362 kubelet[2741]: I0813 00:35:57.125324 2741 scope.go:117] "RemoveContainer" containerID="4c26b458f896f9bc497186ee9dedaa7d6ce6911f2a98985d6bc7eb62edbf7152"
Aug 13 00:35:57.127746 containerd[1576]: time="2025-08-13T00:35:57.127703891Z" level=info msg="CreateContainer within sandbox \"b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Aug 13 00:35:57.149276 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3221110782.mount: Deactivated successfully.
Aug 13 00:35:57.170780 containerd[1576]: time="2025-08-13T00:35:57.169980412Z" level=info msg="Container 55af687ba29811b2f2b920384a82b317eb31c4c65fbaf5c7300206d506fb36b2: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:57.174824 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount131792499.mount: Deactivated successfully.
Aug 13 00:35:57.181703 containerd[1576]: time="2025-08-13T00:35:57.181671950Z" level=info msg="CreateContainer within sandbox \"b0284c1386e1a553d65ed112a51cbce2bc0f2e20045d358f3937ac8f19e4afad\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"55af687ba29811b2f2b920384a82b317eb31c4c65fbaf5c7300206d506fb36b2\""
Aug 13 00:35:57.182116 containerd[1576]: time="2025-08-13T00:35:57.182089523Z" level=info msg="StartContainer for \"55af687ba29811b2f2b920384a82b317eb31c4c65fbaf5c7300206d506fb36b2\""
Aug 13 00:35:57.184776 containerd[1576]: time="2025-08-13T00:35:57.184746410Z" level=info msg="connecting to shim 55af687ba29811b2f2b920384a82b317eb31c4c65fbaf5c7300206d506fb36b2" address="unix:///run/containerd/s/99e57e2f43d68cfb666dd0bfa9664b8d816dd2b2ef224f65b48073e8d3ffb3cf" protocol=ttrpc version=3
Aug 13 00:35:57.201684 systemd[1]: Started cri-containerd-55af687ba29811b2f2b920384a82b317eb31c4c65fbaf5c7300206d506fb36b2.scope - libcontainer container 55af687ba29811b2f2b920384a82b317eb31c4c65fbaf5c7300206d506fb36b2.
Aug 13 00:35:57.255171 containerd[1576]: time="2025-08-13T00:35:57.255084388Z" level=info msg="StartContainer for \"55af687ba29811b2f2b920384a82b317eb31c4c65fbaf5c7300206d506fb36b2\" returns successfully"
Aug 13 00:35:57.994609 containerd[1576]: time="2025-08-13T00:35:57.994567475Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e20b32f91d5a452d5d5d9680e7c28ac8a629bd4fb8304d86b1a185148c99184\" id:\"042ac6b2c7cfba8750bf94c8e39be9396a5f67ff4ca495c5060bcb2efc2c64b2\" pid:6153 exit_status:1 exited_at:{seconds:1755045357 nanos:994185408}"
Aug 13 00:36:00.278221 containerd[1576]: time="2025-08-13T00:36:00.278164651Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1f05d4c5adb8d10fc3a5874e8491698fa876fabd593cbb51426f76d1c9a733d7\" id:\"635a3929420de43a5f911a3581bbdbf46971f38b8b3f959f244c5d32cb882d45\" pid:6175 exited_at:{seconds:1755045360 nanos:277869416}"
Aug 13 00:36:00.589694 kubelet[2741]: E0813 00:36:00.525110 2741 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:47814->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4372-1-0-4-15a6623c0c.185b2c72796bebb3 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4372-1-0-4-15a6623c0c,UID:7083a26ff933664f33567a3bb2fe0187,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-4-15a6623c0c,},FirstTimestamp:2025-08-13 00:35:49.965949875 +0000 UTC m=+165.482534912,LastTimestamp:2025-08-13 00:35:49.965949875 +0000 UTC m=+165.482534912,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-4-15a6623c0c,}"