Aug 13 00:34:09.799182 kernel: Linux version 6.12.40-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Aug 12 21:42:48 -00 2025
Aug 13 00:34:09.799206 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 00:34:09.799215 kernel: BIOS-provided physical RAM map:
Aug 13 00:34:09.799222 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Aug 13 00:34:09.799228 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Aug 13 00:34:09.799234 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Aug 13 00:34:09.799242 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Aug 13 00:34:09.799248 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Aug 13 00:34:09.799254 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Aug 13 00:34:09.799260 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Aug 13 00:34:09.799266 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Aug 13 00:34:09.799272 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Aug 13 00:34:09.799278 kernel: NX (Execute Disable) protection: active
Aug 13 00:34:09.799284 kernel: APIC: Static calls initialized
Aug 13 00:34:09.799293 kernel: SMBIOS 2.8 present.
Aug 13 00:34:09.799300 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Aug 13 00:34:09.799306 kernel: DMI: Memory slots populated: 1/1
Aug 13 00:34:09.799312 kernel: Hypervisor detected: KVM
Aug 13 00:34:09.799319 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Aug 13 00:34:09.799325 kernel: kvm-clock: using sched offset of 4146244491 cycles
Aug 13 00:34:09.799332 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Aug 13 00:34:09.799339 kernel: tsc: Detected 2445.404 MHz processor
Aug 13 00:34:09.799347 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Aug 13 00:34:09.799354 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Aug 13 00:34:09.799361 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Aug 13 00:34:09.799368 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Aug 13 00:34:09.799374 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Aug 13 00:34:09.799381 kernel: Using GB pages for direct mapping
Aug 13 00:34:09.799387 kernel: ACPI: Early table checksum verification disabled
Aug 13 00:34:09.799394 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Aug 13 00:34:09.799401 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:34:09.799409 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:34:09.799415 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:34:09.799422 kernel: ACPI: FACS 0x000000007CFE0000 000040
Aug 13 00:34:09.799430 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:34:09.799442 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:34:09.799450 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:34:09.799457 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 13 00:34:09.799464 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Aug 13 00:34:09.799470 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Aug 13 00:34:09.799481 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Aug 13 00:34:09.799488 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Aug 13 00:34:09.799495 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Aug 13 00:34:09.799502 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Aug 13 00:34:09.799509 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Aug 13 00:34:09.799517 kernel: No NUMA configuration found
Aug 13 00:34:09.799524 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Aug 13 00:34:09.799531 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Aug 13 00:34:09.799538 kernel: Zone ranges:
Aug 13 00:34:09.799545 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Aug 13 00:34:09.799552 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Aug 13 00:34:09.799559 kernel: Normal empty
Aug 13 00:34:09.799565 kernel: Device empty
Aug 13 00:34:09.799572 kernel: Movable zone start for each node
Aug 13 00:34:09.799579 kernel: Early memory node ranges
Aug 13 00:34:09.799587 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Aug 13 00:34:09.799594 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Aug 13 00:34:09.799601 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Aug 13 00:34:09.799607 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Aug 13 00:34:09.799614 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Aug 13 00:34:09.799621 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Aug 13 00:34:09.799628 kernel: ACPI: PM-Timer IO Port: 0x608
Aug 13 00:34:09.799635 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Aug 13 00:34:09.799642 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Aug 13 00:34:09.799650 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Aug 13 00:34:09.799657 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Aug 13 00:34:09.799664 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Aug 13 00:34:09.799670 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Aug 13 00:34:09.799677 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Aug 13 00:34:09.799684 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Aug 13 00:34:09.799691 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Aug 13 00:34:09.799715 kernel: CPU topo: Max. logical packages: 1
Aug 13 00:34:09.799722 kernel: CPU topo: Max. logical dies: 1
Aug 13 00:34:09.799730 kernel: CPU topo: Max. dies per package: 1
Aug 13 00:34:09.799737 kernel: CPU topo: Max. threads per core: 1
Aug 13 00:34:09.799744 kernel: CPU topo: Num. cores per package: 2
Aug 13 00:34:09.799750 kernel: CPU topo: Num. threads per package: 2
Aug 13 00:34:09.799757 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Aug 13 00:34:09.799764 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Aug 13 00:34:09.799771 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Aug 13 00:34:09.799778 kernel: Booting paravirtualized kernel on KVM
Aug 13 00:34:09.799794 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Aug 13 00:34:09.799801 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Aug 13 00:34:09.799810 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Aug 13 00:34:09.799817 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Aug 13 00:34:09.799824 kernel: pcpu-alloc: [0] 0 1
Aug 13 00:34:09.799830 kernel: kvm-guest: PV spinlocks disabled, no host support
Aug 13 00:34:09.799839 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21
Aug 13 00:34:09.799846 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 13 00:34:09.799853 kernel: random: crng init done
Aug 13 00:34:09.799860 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 13 00:34:09.799868 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Aug 13 00:34:09.799875 kernel: Fallback order for Node 0: 0
Aug 13 00:34:09.799882 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Aug 13 00:34:09.799889 kernel: Policy zone: DMA32
Aug 13 00:34:09.799896 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 13 00:34:09.799903 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 13 00:34:09.799909 kernel: ftrace: allocating 40098 entries in 157 pages
Aug 13 00:34:09.799916 kernel: ftrace: allocated 157 pages with 5 groups
Aug 13 00:34:09.799923 kernel: Dynamic Preempt: voluntary
Aug 13 00:34:09.799931 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 13 00:34:09.799939 kernel: rcu: RCU event tracing is enabled.
Aug 13 00:34:09.799946 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 13 00:34:09.799953 kernel: Trampoline variant of Tasks RCU enabled.
Aug 13 00:34:09.799960 kernel: Rude variant of Tasks RCU enabled.
Aug 13 00:34:09.799967 kernel: Tracing variant of Tasks RCU enabled.
Aug 13 00:34:09.799974 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 13 00:34:09.799981 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 13 00:34:09.799988 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 13 00:34:09.799996 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 13 00:34:09.800003 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 13 00:34:09.800010 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Aug 13 00:34:09.800017 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 13 00:34:09.800024 kernel: Console: colour VGA+ 80x25
Aug 13 00:34:09.800032 kernel: printk: legacy console [tty0] enabled
Aug 13 00:34:09.800038 kernel: printk: legacy console [ttyS0] enabled
Aug 13 00:34:09.800045 kernel: ACPI: Core revision 20240827
Aug 13 00:34:09.800053 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Aug 13 00:34:09.800065 kernel: APIC: Switch to symmetric I/O mode setup
Aug 13 00:34:09.800072 kernel: x2apic enabled
Aug 13 00:34:09.800079 kernel: APIC: Switched APIC routing to: physical x2apic
Aug 13 00:34:09.800088 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Aug 13 00:34:09.800095 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns
Aug 13 00:34:09.800103 kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Aug 13 00:34:09.800110 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Aug 13 00:34:09.800117 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Aug 13 00:34:09.800125 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Aug 13 00:34:09.800134 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Aug 13 00:34:09.800141 kernel: Spectre V2 : Mitigation: Retpolines
Aug 13 00:34:09.800148 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Aug 13 00:34:09.800156 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Aug 13 00:34:09.800163 kernel: RETBleed: Mitigation: untrained return thunk
Aug 13 00:34:09.800170 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Aug 13 00:34:09.800178 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Aug 13 00:34:09.800185 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Aug 13 00:34:09.800193 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Aug 13 00:34:09.800201 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Aug 13 00:34:09.800208 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Aug 13 00:34:09.800215 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Aug 13 00:34:09.800223 kernel: Freeing SMP alternatives memory: 32K
Aug 13 00:34:09.800230 kernel: pid_max: default: 32768 minimum: 301
Aug 13 00:34:09.800237 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 13 00:34:09.800244 kernel: landlock: Up and running.
Aug 13 00:34:09.800253 kernel: SELinux: Initializing.
Aug 13 00:34:09.800260 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 13 00:34:09.800267 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Aug 13 00:34:09.800275 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Aug 13 00:34:09.800282 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Aug 13 00:34:09.800289 kernel: ... version: 0
Aug 13 00:34:09.800296 kernel: ... bit width: 48
Aug 13 00:34:09.800304 kernel: ... generic registers: 6
Aug 13 00:34:09.800311 kernel: ... value mask: 0000ffffffffffff
Aug 13 00:34:09.800318 kernel: ... max period: 00007fffffffffff
Aug 13 00:34:09.800327 kernel: ... fixed-purpose events: 0
Aug 13 00:34:09.800334 kernel: ... event mask: 000000000000003f
Aug 13 00:34:09.800341 kernel: signal: max sigframe size: 1776
Aug 13 00:34:09.800348 kernel: rcu: Hierarchical SRCU implementation.
Aug 13 00:34:09.800355 kernel: rcu: Max phase no-delay instances is 400.
Aug 13 00:34:09.800363 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 13 00:34:09.800370 kernel: smp: Bringing up secondary CPUs ...
Aug 13 00:34:09.800377 kernel: smpboot: x86: Booting SMP configuration:
Aug 13 00:34:09.800384 kernel: .... node #0, CPUs: #1
Aug 13 00:34:09.800393 kernel: smp: Brought up 1 node, 2 CPUs
Aug 13 00:34:09.800400 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS)
Aug 13 00:34:09.800408 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2430K rwdata, 9960K rodata, 54444K init, 2524K bss, 125140K reserved, 0K cma-reserved)
Aug 13 00:34:09.800415 kernel: devtmpfs: initialized
Aug 13 00:34:09.800422 kernel: x86/mm: Memory block size: 128MB
Aug 13 00:34:09.800430 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 13 00:34:09.800437 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 13 00:34:09.800444 kernel: pinctrl core: initialized pinctrl subsystem
Aug 13 00:34:09.800451 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 13 00:34:09.800460 kernel: audit: initializing netlink subsys (disabled)
Aug 13 00:34:09.800467 kernel: audit: type=2000 audit(1755045246.865:1): state=initialized audit_enabled=0 res=1
Aug 13 00:34:09.800474 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 13 00:34:09.800481 kernel: thermal_sys: Registered thermal governor 'user_space'
Aug 13 00:34:09.800489 kernel: cpuidle: using governor menu
Aug 13 00:34:09.800496 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 13 00:34:09.800503 kernel: dca service started, version 1.12.1
Aug 13 00:34:09.800510 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Aug 13 00:34:09.800518 kernel: PCI: Using configuration type 1 for base access
Aug 13 00:34:09.800526 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Aug 13 00:34:09.800534 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 13 00:34:09.800541 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Aug 13 00:34:09.800548 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 13 00:34:09.800555 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Aug 13 00:34:09.800562 kernel: ACPI: Added _OSI(Module Device)
Aug 13 00:34:09.800570 kernel: ACPI: Added _OSI(Processor Device)
Aug 13 00:34:09.800577 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 13 00:34:09.800584 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Aug 13 00:34:09.800592 kernel: ACPI: Interpreter enabled
Aug 13 00:34:09.800599 kernel: ACPI: PM: (supports S0 S5)
Aug 13 00:34:09.800607 kernel: ACPI: Using IOAPIC for interrupt routing
Aug 13 00:34:09.800614 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Aug 13 00:34:09.800621 kernel: PCI: Using E820 reservations for host bridge windows
Aug 13 00:34:09.800628 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Aug 13 00:34:09.800636 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Aug 13 00:34:09.800796 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Aug 13 00:34:09.800892 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Aug 13 00:34:09.800964 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Aug 13 00:34:09.800975 kernel: PCI host bridge to bus 0000:00
Aug 13 00:34:09.801050 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Aug 13 00:34:09.801123 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Aug 13 00:34:09.801202 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Aug 13 00:34:09.801265 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Aug 13 00:34:09.801330 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Aug 13 00:34:09.801391 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Aug 13 00:34:09.801453 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Aug 13 00:34:09.801560 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Aug 13 00:34:09.801650 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Aug 13 00:34:09.801753 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Aug 13 00:34:09.801851 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Aug 13 00:34:09.801924 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Aug 13 00:34:09.801993 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Aug 13 00:34:09.802063 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Aug 13 00:34:09.802141 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.802213 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Aug 13 00:34:09.802283 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 13 00:34:09.802357 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Aug 13 00:34:09.802427 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Aug 13 00:34:09.802504 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.802576 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Aug 13 00:34:09.802646 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 13 00:34:09.802738 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Aug 13 00:34:09.802828 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Aug 13 00:34:09.802916 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.803029 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Aug 13 00:34:09.803218 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 13 00:34:09.803323 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Aug 13 00:34:09.803423 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Aug 13 00:34:09.803541 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.803648 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Aug 13 00:34:09.803845 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 13 00:34:09.803950 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Aug 13 00:34:09.804059 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Aug 13 00:34:09.804225 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.804310 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Aug 13 00:34:09.804400 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 13 00:34:09.804493 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Aug 13 00:34:09.804570 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Aug 13 00:34:09.804649 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.806114 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Aug 13 00:34:09.806204 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 13 00:34:09.806279 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Aug 13 00:34:09.806350 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Aug 13 00:34:09.806428 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.806506 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Aug 13 00:34:09.806578 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 13 00:34:09.806648 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Aug 13 00:34:09.806743 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Aug 13 00:34:09.806841 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.806913 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Aug 13 00:34:09.806988 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 13 00:34:09.807059 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Aug 13 00:34:09.807129 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Aug 13 00:34:09.807205 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 13 00:34:09.807275 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Aug 13 00:34:09.807343 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 13 00:34:09.807446 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Aug 13 00:34:09.807529 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Aug 13 00:34:09.807606 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Aug 13 00:34:09.807676 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Aug 13 00:34:09.808876 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Aug 13 00:34:09.808961 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Aug 13 00:34:09.809033 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Aug 13 00:34:09.809115 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Aug 13 00:34:09.809193 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Aug 13 00:34:09.809275 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Aug 13 00:34:09.809906 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Aug 13 00:34:09.810011 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Aug 13 00:34:09.810090 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Aug 13 00:34:09.810162 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 13 00:34:09.810243 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Aug 13 00:34:09.810322 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Aug 13 00:34:09.810393 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 13 00:34:09.810473 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Aug 13 00:34:09.810547 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Aug 13 00:34:09.810620 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Aug 13 00:34:09.810690 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 13 00:34:09.810817 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Aug 13 00:34:09.810895 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Aug 13 00:34:09.810966 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 13 00:34:09.811045 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Aug 13 00:34:09.811119 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Aug 13 00:34:09.811188 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 13 00:34:09.811267 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Aug 13 00:34:09.811345 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Aug 13 00:34:09.811417 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Aug 13 00:34:09.811486 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 13 00:34:09.811496 kernel: acpiphp: Slot [0] registered
Aug 13 00:34:09.811574 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Aug 13 00:34:09.811649 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Aug 13 00:34:09.812851 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Aug 13 00:34:09.812923 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Aug 13 00:34:09.812985 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 13 00:34:09.812993 kernel: acpiphp: Slot [0-2] registered
Aug 13 00:34:09.813050 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 13 00:34:09.813059 kernel: acpiphp: Slot [0-3] registered
Aug 13 00:34:09.813116 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 13 00:34:09.813124 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Aug 13 00:34:09.813134 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Aug 13 00:34:09.813140 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Aug 13 00:34:09.813145 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Aug 13 00:34:09.813151 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Aug 13 00:34:09.813157 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Aug 13 00:34:09.813163 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Aug 13 00:34:09.813169 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Aug 13 00:34:09.813175 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Aug 13 00:34:09.813181 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Aug 13 00:34:09.813188 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Aug 13 00:34:09.813193 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Aug 13 00:34:09.813199 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Aug 13 00:34:09.813205 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Aug 13 00:34:09.813211 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Aug 13 00:34:09.813216 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Aug 13 00:34:09.813222 kernel: iommu: Default domain type: Translated
Aug 13 00:34:09.813228 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Aug 13 00:34:09.813234 kernel: PCI: Using ACPI for IRQ routing
Aug 13 00:34:09.813241 kernel: PCI: pci_cache_line_size set to 64 bytes
Aug 13 00:34:09.813246 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Aug 13 00:34:09.813252 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Aug 13 00:34:09.813311 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Aug 13 00:34:09.813370 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Aug 13 00:34:09.813428 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Aug 13 00:34:09.813436 kernel: vgaarb: loaded
Aug 13 00:34:09.813442 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Aug 13 00:34:09.813448 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Aug 13 00:34:09.813456 kernel: clocksource: Switched to clocksource kvm-clock
Aug 13 00:34:09.813462 kernel: VFS: Disk quotas dquot_6.6.0
Aug 13 00:34:09.813468 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Aug 13 00:34:09.813474 kernel: pnp: PnP ACPI init
Aug 13 00:34:09.813540 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Aug 13 00:34:09.813550 kernel: pnp: PnP ACPI: found 5 devices
Aug 13 00:34:09.813556 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Aug 13 00:34:09.813562 kernel: NET: Registered PF_INET protocol family
Aug 13 00:34:09.813570 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Aug 13 00:34:09.813576 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Aug 13 00:34:09.813582 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Aug 13 00:34:09.813588 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Aug 13 00:34:09.813594 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Aug 13 00:34:09.813600 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Aug 13 00:34:09.813605 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 13 00:34:09.813611 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Aug 13 00:34:09.813617 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Aug 13 00:34:09.813624 kernel: NET: Registered PF_XDP protocol family
Aug 13 00:34:09.813683 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Aug 13 00:34:09.814845 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Aug 13 00:34:09.814913 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Aug 13 00:34:09.814972 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Aug 13 00:34:09.815030 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Aug 13 00:34:09.815087 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Aug 13 00:34:09.815144 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 13 00:34:09.815205 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Aug 13 00:34:09.815272 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Aug 13 00:34:09.815330 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 13 00:34:09.815386 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Aug 13 00:34:09.815442 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Aug 13 00:34:09.815498 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 13 00:34:09.815554 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Aug 13 00:34:09.815610 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Aug 13 00:34:09.815666 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 13 00:34:09.815740 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Aug 13 00:34:09.816769 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Aug 13 00:34:09.816849 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 13 00:34:09.816909 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Aug 13 00:34:09.816966 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Aug 13 00:34:09.817028 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 13 00:34:09.817085 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Aug 13 00:34:09.817145 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Aug 13 00:34:09.817202 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 13 00:34:09.817258 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Aug 13 00:34:09.817314 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Aug 13 00:34:09.817370 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Aug 13 00:34:09.817430 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 13 00:34:09.817486 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Aug 13 00:34:09.817543 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Aug 13 00:34:09.817602 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Aug 13 00:34:09.817666 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 13 00:34:09.818758 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Aug 13 00:34:09.818835 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Aug 13 00:34:09.818894 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Aug 13 00:34:09.818948 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Aug 13 00:34:09.819004 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Aug 13 00:34:09.819054 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Aug 13 00:34:09.819103 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Aug 13 00:34:09.819481 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Aug 13 00:34:09.819538 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Aug 13 00:34:09.819599 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Aug 13 00:34:09.819653 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Aug 13 00:34:09.820720 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Aug 13 00:34:09.820804 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Aug 13 00:34:09.820870 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Aug 13 00:34:09.820926 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Aug 13 00:34:09.820985 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Aug 13 00:34:09.821038 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Aug 13 00:34:09.821096 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Aug 13 00:34:09.821153 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Aug 13 00:34:09.821213 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Aug 13 00:34:09.821270 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Aug 13 00:34:09.821330 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Aug 13 00:34:09.821383 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Aug 13 00:34:09.821436 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Aug 13 00:34:09.821500 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Aug 13 00:34:09.821553 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Aug 13 00:34:09.821606 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Aug 13 00:34:09.821663 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Aug 13 00:34:09.822309 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Aug 13 00:34:09.822367 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Aug 13 00:34:09.822376 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Aug 13 00:34:09.822386 kernel: PCI: CLS
0 bytes, default 64 Aug 13 00:34:09.822393 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns Aug 13 00:34:09.822399 kernel: Initialise system trusted keyrings Aug 13 00:34:09.822406 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Aug 13 00:34:09.822413 kernel: Key type asymmetric registered Aug 13 00:34:09.822420 kernel: Asymmetric key parser 'x509' registered Aug 13 00:34:09.822426 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Aug 13 00:34:09.822432 kernel: io scheduler mq-deadline registered Aug 13 00:34:09.822438 kernel: io scheduler kyber registered Aug 13 00:34:09.822446 kernel: io scheduler bfq registered Aug 13 00:34:09.822515 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Aug 13 00:34:09.822577 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Aug 13 00:34:09.822635 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Aug 13 00:34:09.824706 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Aug 13 00:34:09.824800 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Aug 13 00:34:09.824862 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Aug 13 00:34:09.824921 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Aug 13 00:34:09.824983 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Aug 13 00:34:09.825040 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Aug 13 00:34:09.825098 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Aug 13 00:34:09.825156 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Aug 13 00:34:09.825212 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Aug 13 00:34:09.825269 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Aug 13 00:34:09.825326 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Aug 13 00:34:09.825382 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Aug 13 00:34:09.825443 kernel: pcieport 0000:00:02.7: AER: enabled with 
IRQ 31 Aug 13 00:34:09.825452 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Aug 13 00:34:09.825507 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Aug 13 00:34:09.825564 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Aug 13 00:34:09.825573 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Aug 13 00:34:09.825580 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Aug 13 00:34:09.825588 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 13 00:34:09.825594 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Aug 13 00:34:09.825601 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Aug 13 00:34:09.825607 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Aug 13 00:34:09.825613 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Aug 13 00:34:09.825674 kernel: rtc_cmos 00:03: RTC can wake from S4 Aug 13 00:34:09.825684 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Aug 13 00:34:09.825761 kernel: rtc_cmos 00:03: registered as rtc0 Aug 13 00:34:09.825832 kernel: rtc_cmos 00:03: setting system clock to 2025-08-13T00:34:09 UTC (1755045249) Aug 13 00:34:09.825885 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Aug 13 00:34:09.825893 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Aug 13 00:34:09.825900 kernel: NET: Registered PF_INET6 protocol family Aug 13 00:34:09.825906 kernel: Segment Routing with IPv6 Aug 13 00:34:09.825913 kernel: In-situ OAM (IOAM) with IPv6 Aug 13 00:34:09.825919 kernel: NET: Registered PF_PACKET protocol family Aug 13 00:34:09.825925 kernel: Key type dns_resolver registered Aug 13 00:34:09.825931 kernel: IPI shorthand broadcast: enabled Aug 13 00:34:09.825939 kernel: sched_clock: Marking stable (2961006837, 153826795)->(3119778339, -4944707) Aug 13 00:34:09.825946 kernel: registered taskstats version 1 Aug 13 00:34:09.825952 kernel: Loading compiled-in X.509 
certificates Aug 13 00:34:09.825958 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.40-flatcar: dee0b464d3f7f8d09744a2392f69dde258bc95c0' Aug 13 00:34:09.825964 kernel: Demotion targets for Node 0: null Aug 13 00:34:09.825970 kernel: Key type .fscrypt registered Aug 13 00:34:09.825976 kernel: Key type fscrypt-provisioning registered Aug 13 00:34:09.825982 kernel: ima: No TPM chip found, activating TPM-bypass! Aug 13 00:34:09.825989 kernel: ima: Allocated hash algorithm: sha1 Aug 13 00:34:09.825996 kernel: ima: No architecture policies found Aug 13 00:34:09.826002 kernel: clk: Disabling unused clocks Aug 13 00:34:09.826008 kernel: Warning: unable to open an initial console. Aug 13 00:34:09.826014 kernel: Freeing unused kernel image (initmem) memory: 54444K Aug 13 00:34:09.826021 kernel: Write protecting the kernel read-only data: 24576k Aug 13 00:34:09.826027 kernel: Freeing unused kernel image (rodata/data gap) memory: 280K Aug 13 00:34:09.826034 kernel: Run /init as init process Aug 13 00:34:09.826040 kernel: with arguments: Aug 13 00:34:09.826046 kernel: /init Aug 13 00:34:09.826054 kernel: with environment: Aug 13 00:34:09.826060 kernel: HOME=/ Aug 13 00:34:09.826066 kernel: TERM=linux Aug 13 00:34:09.826072 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 13 00:34:09.826079 systemd[1]: Successfully made /usr/ read-only. Aug 13 00:34:09.826088 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 13 00:34:09.826096 systemd[1]: Detected virtualization kvm. Aug 13 00:34:09.826103 systemd[1]: Detected architecture x86-64. Aug 13 00:34:09.826109 systemd[1]: Running in initrd. 
Aug 13 00:34:09.826116 systemd[1]: No hostname configured, using default hostname. Aug 13 00:34:09.826123 systemd[1]: Hostname set to . Aug 13 00:34:09.826129 systemd[1]: Initializing machine ID from VM UUID. Aug 13 00:34:09.826136 systemd[1]: Queued start job for default target initrd.target. Aug 13 00:34:09.826142 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 13 00:34:09.826149 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 13 00:34:09.826157 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Aug 13 00:34:09.826164 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 13 00:34:09.826171 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Aug 13 00:34:09.826178 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Aug 13 00:34:09.826185 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Aug 13 00:34:09.826192 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Aug 13 00:34:09.826199 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 13 00:34:09.826206 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 13 00:34:09.826213 systemd[1]: Reached target paths.target - Path Units. Aug 13 00:34:09.826219 systemd[1]: Reached target slices.target - Slice Units. Aug 13 00:34:09.826226 systemd[1]: Reached target swap.target - Swaps. Aug 13 00:34:09.826233 systemd[1]: Reached target timers.target - Timer Units. Aug 13 00:34:09.826239 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Aug 13 00:34:09.826246 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 13 00:34:09.826252 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Aug 13 00:34:09.826259 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Aug 13 00:34:09.826266 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 13 00:34:09.826273 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 13 00:34:09.826279 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 13 00:34:09.826286 systemd[1]: Reached target sockets.target - Socket Units. Aug 13 00:34:09.826293 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Aug 13 00:34:09.826299 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 13 00:34:09.826306 systemd[1]: Finished network-cleanup.service - Network Cleanup. Aug 13 00:34:09.826313 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Aug 13 00:34:09.826321 systemd[1]: Starting systemd-fsck-usr.service... Aug 13 00:34:09.826327 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 13 00:34:09.826334 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 13 00:34:09.826340 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:34:09.826360 systemd-journald[217]: Collecting audit messages is disabled. Aug 13 00:34:09.826378 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Aug 13 00:34:09.826385 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 13 00:34:09.826392 systemd[1]: Finished systemd-fsck-usr.service. 
Aug 13 00:34:09.826399 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Aug 13 00:34:09.826407 systemd-journald[217]: Journal started Aug 13 00:34:09.826423 systemd-journald[217]: Runtime Journal (/run/log/journal/344cc42c351c4fb4add447db996adf12) is 4.8M, max 38.6M, 33.7M free. Aug 13 00:34:09.804207 systemd-modules-load[218]: Inserted module 'overlay' Aug 13 00:34:09.863280 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Aug 13 00:34:09.863296 kernel: Bridge firewalling registered Aug 13 00:34:09.834652 systemd-modules-load[218]: Inserted module 'br_netfilter' Aug 13 00:34:09.865439 systemd[1]: Started systemd-journald.service - Journal Service. Aug 13 00:34:09.866090 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 13 00:34:09.867216 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:34:09.870816 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Aug 13 00:34:09.872813 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 13 00:34:09.884345 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 13 00:34:09.887971 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 13 00:34:09.891177 systemd-tmpfiles[235]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Aug 13 00:34:09.895295 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 13 00:34:09.898920 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 13 00:34:09.899666 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Aug 13 00:34:09.901685 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Aug 13 00:34:09.903777 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 13 00:34:09.911769 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 13 00:34:09.916820 dracut-cmdline[251]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=215bdedb8de38f6b96ec4f9db80853e25015f60454b867e319fdcb9244320a21 Aug 13 00:34:09.927322 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 13 00:34:09.933362 systemd-resolved[252]: Positive Trust Anchors: Aug 13 00:34:09.933918 systemd-resolved[252]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 13 00:34:09.934545 systemd-resolved[252]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 13 00:34:09.938266 systemd-resolved[252]: Defaulting to hostname 'linux'. Aug 13 00:34:09.938987 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 13 00:34:09.940769 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Aug 13 00:34:09.972733 kernel: SCSI subsystem initialized Aug 13 00:34:09.979713 kernel: Loading iSCSI transport class v2.0-870. Aug 13 00:34:09.990725 kernel: iscsi: registered transport (tcp) Aug 13 00:34:10.006730 kernel: iscsi: registered transport (qla4xxx) Aug 13 00:34:10.006756 kernel: QLogic iSCSI HBA Driver Aug 13 00:34:10.019969 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 13 00:34:10.030376 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 13 00:34:10.032677 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 13 00:34:10.061219 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Aug 13 00:34:10.063382 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Aug 13 00:34:10.100720 kernel: raid6: avx2x4 gen() 32648 MB/s Aug 13 00:34:10.117713 kernel: raid6: avx2x2 gen() 31612 MB/s Aug 13 00:34:10.135012 kernel: raid6: avx2x1 gen() 23222 MB/s Aug 13 00:34:10.135039 kernel: raid6: using algorithm avx2x4 gen() 32648 MB/s Aug 13 00:34:10.153936 kernel: raid6: .... xor() 4667 MB/s, rmw enabled Aug 13 00:34:10.153967 kernel: raid6: using avx2x2 recovery algorithm Aug 13 00:34:10.174741 kernel: xor: automatically using best checksumming function avx Aug 13 00:34:10.298738 kernel: Btrfs loaded, zoned=no, fsverity=no Aug 13 00:34:10.303526 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Aug 13 00:34:10.305229 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 13 00:34:10.327316 systemd-udevd[465]: Using default interface naming scheme 'v255'. Aug 13 00:34:10.331233 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 13 00:34:10.334171 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Aug 13 00:34:10.362726 dracut-pre-trigger[474]: rd.md=0: removing MD RAID activation Aug 13 00:34:10.379227 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Aug 13 00:34:10.380862 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 13 00:34:10.441193 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 13 00:34:10.445266 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Aug 13 00:34:10.528735 kernel: ACPI: bus type USB registered Aug 13 00:34:10.528794 kernel: usbcore: registered new interface driver usbfs Aug 13 00:34:10.528804 kernel: cryptd: max_cpu_qlen set to 1000 Aug 13 00:34:10.532800 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Aug 13 00:34:10.538742 kernel: usbcore: registered new interface driver hub Aug 13 00:34:10.540678 kernel: libata version 3.00 loaded. Aug 13 00:34:10.540722 kernel: scsi host0: Virtio SCSI HBA Aug 13 00:34:10.542066 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 13 00:34:10.545165 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Aug 13 00:34:10.545194 kernel: usbcore: registered new device driver usb Aug 13 00:34:10.542245 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:34:10.546475 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:34:10.550084 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Aug 13 00:34:10.554756 kernel: AES CTR mode by8 optimization enabled Aug 13 00:34:10.604506 kernel: ahci 0000:00:1f.2: version 3.0 Aug 13 00:34:10.604686 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Aug 13 00:34:10.604722 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Aug 13 00:34:10.604831 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Aug 13 00:34:10.604956 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Aug 13 00:34:10.611793 kernel: scsi host1: ahci Aug 13 00:34:10.611899 kernel: scsi host2: ahci Aug 13 00:34:10.611974 kernel: scsi host3: ahci Aug 13 00:34:10.614721 kernel: scsi host4: ahci Aug 13 00:34:10.614907 kernel: scsi host5: ahci Aug 13 00:34:10.614988 kernel: scsi host6: ahci Aug 13 00:34:10.615090 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46 lpm-pol 0 Aug 13 00:34:10.615099 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46 lpm-pol 0 Aug 13 00:34:10.615107 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 46 lpm-pol 0 Aug 13 00:34:10.615114 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46 lpm-pol 0 Aug 13 00:34:10.615121 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46 lpm-pol 0 Aug 13 00:34:10.615128 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46 lpm-pol 0 Aug 13 00:34:10.618716 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Aug 13 00:34:10.632716 kernel: sd 0:0:0:0: Power-on or device reset occurred Aug 13 00:34:10.633743 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Aug 13 00:34:10.633857 kernel: sd 0:0:0:0: [sda] Write Protect is off Aug 13 00:34:10.633937 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Aug 13 00:34:10.634010 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Aug 13 00:34:10.640715 
kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 13 00:34:10.640735 kernel: GPT:17805311 != 80003071 Aug 13 00:34:10.640748 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 13 00:34:10.640756 kernel: GPT:17805311 != 80003071 Aug 13 00:34:10.640763 kernel: GPT: Use GNU Parted to correct GPT errors. Aug 13 00:34:10.640770 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:34:10.640777 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Aug 13 00:34:10.677423 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:34:10.927409 kernel: ata3: SATA link down (SStatus 0 SControl 300) Aug 13 00:34:10.927498 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Aug 13 00:34:10.927531 kernel: ata4: SATA link down (SStatus 0 SControl 300) Aug 13 00:34:10.927541 kernel: ata2: SATA link down (SStatus 0 SControl 300) Aug 13 00:34:10.927549 kernel: ata6: SATA link down (SStatus 0 SControl 300) Aug 13 00:34:10.927556 kernel: ata5: SATA link down (SStatus 0 SControl 300) Aug 13 00:34:10.929723 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Aug 13 00:34:10.930978 kernel: ata1.00: applying bridge limits Aug 13 00:34:10.931035 kernel: ata1.00: configured for UDMA/100 Aug 13 00:34:10.932724 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Aug 13 00:34:10.962451 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 13 00:34:10.962674 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Aug 13 00:34:10.967728 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Aug 13 00:34:10.971911 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 13 00:34:10.972167 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Aug 13 00:34:10.975947 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Aug 13 00:34:10.983421 kernel: hub 1-0:1.0: USB hub found Aug 13 00:34:10.983891 kernel: 
hub 1-0:1.0: 4 ports detected Aug 13 00:34:10.993731 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Aug 13 00:34:10.994116 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Aug 13 00:34:10.998509 kernel: hub 2-0:1.0: USB hub found Aug 13 00:34:10.998996 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Aug 13 00:34:11.011728 kernel: hub 2-0:1.0: 4 ports detected Aug 13 00:34:11.030735 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Aug 13 00:34:11.047409 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Aug 13 00:34:11.056929 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Aug 13 00:34:11.063902 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Aug 13 00:34:11.064410 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Aug 13 00:34:11.082581 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Aug 13 00:34:11.084230 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Aug 13 00:34:11.102836 disk-uuid[630]: Primary Header is updated. Aug 13 00:34:11.102836 disk-uuid[630]: Secondary Entries is updated. Aug 13 00:34:11.102836 disk-uuid[630]: Secondary Header is updated. Aug 13 00:34:11.119113 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:34:11.230661 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Aug 13 00:34:11.257031 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 13 00:34:11.257755 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Aug 13 00:34:11.258590 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 13 00:34:11.259866 systemd[1]: Reached target remote-fs.target - Remote File Systems. 
Aug 13 00:34:11.261561 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 13 00:34:11.273492 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Aug 13 00:34:11.377738 kernel: hid: raw HID events driver (C) Jiri Kosina Aug 13 00:34:11.384350 kernel: usbcore: registered new interface driver usbhid Aug 13 00:34:11.384398 kernel: usbhid: USB HID core driver Aug 13 00:34:11.392069 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Aug 13 00:34:11.392095 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Aug 13 00:34:12.138822 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 13 00:34:12.139263 disk-uuid[631]: The operation has completed successfully. Aug 13 00:34:12.181530 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 13 00:34:12.181623 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 13 00:34:12.217424 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 13 00:34:12.232799 sh[663]: Success Aug 13 00:34:12.252540 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 13 00:34:12.252629 kernel: device-mapper: uevent: version 1.0.3 Aug 13 00:34:12.255899 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 13 00:34:12.264746 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Aug 13 00:34:12.308842 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 13 00:34:12.312374 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 13 00:34:12.320526 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Aug 13 00:34:12.333391 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Aug 13 00:34:12.333427 kernel: BTRFS: device fsid 0c0338fb-9434-41c1-99a2-737cbe2351c4 devid 1 transid 44 /dev/mapper/usr (254:0) scanned by mount (675) Aug 13 00:34:12.338942 kernel: BTRFS info (device dm-0): first mount of filesystem 0c0338fb-9434-41c1-99a2-737cbe2351c4 Aug 13 00:34:12.338975 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:34:12.341602 kernel: BTRFS info (device dm-0): using free-space-tree Aug 13 00:34:12.350098 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Aug 13 00:34:12.350911 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Aug 13 00:34:12.351769 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 13 00:34:12.353526 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 13 00:34:12.355289 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Aug 13 00:34:12.376753 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (708) Aug 13 00:34:12.382463 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:34:12.382503 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Aug 13 00:34:12.382517 kernel: BTRFS info (device sda6): using free-space-tree Aug 13 00:34:12.390716 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f Aug 13 00:34:12.391798 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 13 00:34:12.394825 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Aug 13 00:34:12.426060 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Aug 13 00:34:12.428808 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 13 00:34:12.474608 systemd-networkd[844]: lo: Link UP Aug 13 00:34:12.475208 systemd-networkd[844]: lo: Gained carrier Aug 13 00:34:12.477846 systemd-networkd[844]: Enumeration completed Aug 13 00:34:12.478424 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 00:34:12.479194 systemd-networkd[844]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:34:12.479198 systemd-networkd[844]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:34:12.480164 systemd-networkd[844]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:34:12.480167 systemd-networkd[844]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 13 00:34:12.481009 systemd-networkd[844]: eth0: Link UP Aug 13 00:34:12.481120 systemd-networkd[844]: eth0: Gained carrier Aug 13 00:34:12.485485 ignition[783]: Ignition 2.21.0 Aug 13 00:34:12.481127 systemd-networkd[844]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:34:12.485490 ignition[783]: Stage: fetch-offline Aug 13 00:34:12.481244 systemd[1]: Reached target network.target - Network. Aug 13 00:34:12.485509 ignition[783]: no configs at "/usr/lib/ignition/base.d" Aug 13 00:34:12.486005 systemd-networkd[844]: eth1: Link UP Aug 13 00:34:12.485514 ignition[783]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 13 00:34:12.486757 systemd-networkd[844]: eth1: Gained carrier Aug 13 00:34:12.485569 ignition[783]: parsed url from cmdline: "" Aug 13 00:34:12.486765 systemd-networkd[844]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Aug 13 00:34:12.485572 ignition[783]: no config URL provided
Aug 13 00:34:12.487030 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 00:34:12.485575 ignition[783]: reading system config file "/usr/lib/ignition/user.ign"
Aug 13 00:34:12.490821 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Aug 13 00:34:12.485579 ignition[783]: no config at "/usr/lib/ignition/user.ign"
Aug 13 00:34:12.485583 ignition[783]: failed to fetch config: resource requires networking
Aug 13 00:34:12.485689 ignition[783]: Ignition finished successfully
Aug 13 00:34:12.506200 ignition[853]: Ignition 2.21.0
Aug 13 00:34:12.506212 ignition[853]: Stage: fetch
Aug 13 00:34:12.506327 ignition[853]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:34:12.506333 ignition[853]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:34:12.506393 ignition[853]: parsed url from cmdline: ""
Aug 13 00:34:12.506395 ignition[853]: no config URL provided
Aug 13 00:34:12.506398 ignition[853]: reading system config file "/usr/lib/ignition/user.ign"
Aug 13 00:34:12.506404 ignition[853]: no config at "/usr/lib/ignition/user.ign"
Aug 13 00:34:12.510742 systemd-networkd[844]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Aug 13 00:34:12.506429 ignition[853]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Aug 13 00:34:12.506532 ignition[853]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Aug 13 00:34:12.538759 systemd-networkd[844]: eth0: DHCPv4 address 46.62.157.78/32, gateway 172.31.1.1 acquired from 172.31.1.1
Aug 13 00:34:12.707622 ignition[853]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Aug 13 00:34:12.712848 ignition[853]: GET result: OK
Aug 13 00:34:12.712933 ignition[853]: parsing config with SHA512: 0ddaf5377202c4787ac0b34e5723a136f15cb59b8a1e8ea4e3260270e59ae94dfebc341e7b4cc12bda68be53139adfe253e86cdf77055873dc5bc49f9d058c0a
Aug 13 00:34:12.718888 unknown[853]: fetched base config from "system"
Aug 13 00:34:12.718899 unknown[853]: fetched base config from "system"
Aug 13 00:34:12.719201 ignition[853]: fetch: fetch complete
Aug 13 00:34:12.718905 unknown[853]: fetched user config from "hetzner"
Aug 13 00:34:12.719206 ignition[853]: fetch: fetch passed
Aug 13 00:34:12.722321 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Aug 13 00:34:12.719243 ignition[853]: Ignition finished successfully
Aug 13 00:34:12.724750 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 13 00:34:12.749000 ignition[860]: Ignition 2.21.0
Aug 13 00:34:12.749014 ignition[860]: Stage: kargs
Aug 13 00:34:12.749140 ignition[860]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:34:12.749148 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:34:12.751013 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 13 00:34:12.749814 ignition[860]: kargs: kargs passed
Aug 13 00:34:12.749854 ignition[860]: Ignition finished successfully
Aug 13 00:34:12.754622 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 13 00:34:12.772159 ignition[867]: Ignition 2.21.0
Aug 13 00:34:12.772684 ignition[867]: Stage: disks
Aug 13 00:34:12.772900 ignition[867]: no configs at "/usr/lib/ignition/base.d"
Aug 13 00:34:12.772914 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:34:12.773852 ignition[867]: disks: disks passed
Aug 13 00:34:12.774816 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 13 00:34:12.773912 ignition[867]: Ignition finished successfully
Aug 13 00:34:12.776116 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 13 00:34:12.776984 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 13 00:34:12.778076 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:34:12.779367 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 13 00:34:12.780636 systemd[1]: Reached target basic.target - Basic System.
Aug 13 00:34:12.782474 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 13 00:34:12.805584 systemd-fsck[875]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Aug 13 00:34:12.807298 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 13 00:34:12.810147 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 13 00:34:12.919762 kernel: EXT4-fs (sda9): mounted filesystem 069caac6-7833-4acd-8940-01a7ff7d1281 r/w with ordered data mode. Quota mode: none.
Aug 13 00:34:12.920333 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 13 00:34:12.921482 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:34:12.925065 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 13 00:34:12.928808 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 13 00:34:12.939893 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Aug 13 00:34:12.941971 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 13 00:34:12.942013 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 00:34:12.950166 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 13 00:34:12.955895 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 13 00:34:12.962214 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (883)
Aug 13 00:34:12.962249 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:34:12.962271 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Aug 13 00:34:12.971476 kernel: BTRFS info (device sda6): using free-space-tree
Aug 13 00:34:12.983478 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 13 00:34:13.012990 coreos-metadata[885]: Aug 13 00:34:13.012 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Aug 13 00:34:13.014997 initrd-setup-root[911]: cut: /sysroot/etc/passwd: No such file or directory
Aug 13 00:34:13.015863 coreos-metadata[885]: Aug 13 00:34:13.015 INFO Fetch successful
Aug 13 00:34:13.015863 coreos-metadata[885]: Aug 13 00:34:13.015 INFO wrote hostname ci-4372-1-0-b-5ba4a9a74b to /sysroot/etc/hostname
Aug 13 00:34:13.016950 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 13 00:34:13.020980 initrd-setup-root[919]: cut: /sysroot/etc/group: No such file or directory
Aug 13 00:34:13.024812 initrd-setup-root[926]: cut: /sysroot/etc/shadow: No such file or directory
Aug 13 00:34:13.028187 initrd-setup-root[933]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 13 00:34:13.100292 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 13 00:34:13.101880 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 13 00:34:13.103538 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 13 00:34:13.120726 kernel: BTRFS info (device sda6): last unmount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:34:13.134202 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 13 00:34:13.143560 ignition[1000]: INFO : Ignition 2.21.0
Aug 13 00:34:13.143560 ignition[1000]: INFO : Stage: mount
Aug 13 00:34:13.144661 ignition[1000]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:34:13.144661 ignition[1000]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:34:13.146638 ignition[1000]: INFO : mount: mount passed
Aug 13 00:34:13.147169 ignition[1000]: INFO : Ignition finished successfully
Aug 13 00:34:13.147798 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 13 00:34:13.149433 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 13 00:34:13.331901 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 13 00:34:13.333393 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 13 00:34:13.360725 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1012)
Aug 13 00:34:13.360768 kernel: BTRFS info (device sda6): first mount of filesystem 900bf3f4-cc50-4925-b275-d85854bb916f
Aug 13 00:34:13.362933 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Aug 13 00:34:13.365493 kernel: BTRFS info (device sda6): using free-space-tree
Aug 13 00:34:13.370334 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 13 00:34:13.397679 ignition[1028]: INFO : Ignition 2.21.0
Aug 13 00:34:13.397679 ignition[1028]: INFO : Stage: files
Aug 13 00:34:13.399288 ignition[1028]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:34:13.399288 ignition[1028]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:34:13.399288 ignition[1028]: DEBUG : files: compiled without relabeling support, skipping
Aug 13 00:34:13.402135 ignition[1028]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 13 00:34:13.402135 ignition[1028]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 13 00:34:13.403937 ignition[1028]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 13 00:34:13.404967 ignition[1028]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 13 00:34:13.404967 ignition[1028]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 13 00:34:13.404348 unknown[1028]: wrote ssh authorized keys file for user: core
Aug 13 00:34:13.407291 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Aug 13 00:34:13.407291 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Aug 13 00:34:13.571399 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 13 00:34:13.778932 systemd-networkd[844]: eth1: Gained IPv6LL
Aug 13 00:34:13.842947 systemd-networkd[844]: eth0: Gained IPv6LL
Aug 13 00:34:14.111457 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Aug 13 00:34:14.111457 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Aug 13 00:34:14.119572 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Aug 13 00:34:14.286196 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 13 00:34:14.460013 ignition[1028]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Aug 13 00:34:14.460013 ignition[1028]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 13 00:34:14.462921 ignition[1028]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 13 00:34:14.464147 ignition[1028]: INFO : files: files passed
Aug 13 00:34:14.464147 ignition[1028]: INFO : Ignition finished successfully
Aug 13 00:34:14.464993 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 13 00:34:14.468064 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 13 00:34:14.477523 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 13 00:34:14.480414 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 13 00:34:14.481189 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 13 00:34:14.491421 initrd-setup-root-after-ignition[1058]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:34:14.491421 initrd-setup-root-after-ignition[1058]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:34:14.493861 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 13 00:34:14.494302 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:34:14.495719 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 13 00:34:14.497226 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 13 00:34:14.533303 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 13 00:34:14.533405 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 13 00:34:14.534894 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 13 00:34:14.535988 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 13 00:34:14.537161 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 13 00:34:14.537932 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 13 00:34:14.556187 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:34:14.558970 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 13 00:34:14.575116 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 13 00:34:14.575841 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:34:14.577086 systemd[1]: Stopped target timers.target - Timer Units.
Aug 13 00:34:14.578206 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 13 00:34:14.578310 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 13 00:34:14.579665 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 13 00:34:14.580366 systemd[1]: Stopped target basic.target - Basic System.
Aug 13 00:34:14.581774 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 13 00:34:14.582755 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 13 00:34:14.583722 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 13 00:34:14.585466 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Aug 13 00:34:14.586876 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 13 00:34:14.588258 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 13 00:34:14.589660 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 13 00:34:14.591210 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 13 00:34:14.592480 systemd[1]: Stopped target swap.target - Swaps.
Aug 13 00:34:14.594528 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 13 00:34:14.594856 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 13 00:34:14.597192 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:34:14.598292 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:34:14.599721 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 13 00:34:14.599905 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:34:14.601218 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 13 00:34:14.601396 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 13 00:34:14.603260 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 13 00:34:14.603508 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 13 00:34:14.605019 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 13 00:34:14.605217 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 13 00:34:14.606517 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Aug 13 00:34:14.606783 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 13 00:34:14.614986 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 13 00:34:14.618838 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 13 00:34:14.619907 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 13 00:34:14.620012 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:34:14.622204 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 13 00:34:14.622321 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 13 00:34:14.626083 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 13 00:34:14.628092 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 13 00:34:14.641202 ignition[1082]: INFO : Ignition 2.21.0
Aug 13 00:34:14.641914 ignition[1082]: INFO : Stage: umount
Aug 13 00:34:14.643362 ignition[1082]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 13 00:34:14.643362 ignition[1082]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 13 00:34:14.643362 ignition[1082]: INFO : umount: umount passed
Aug 13 00:34:14.643362 ignition[1082]: INFO : Ignition finished successfully
Aug 13 00:34:14.644831 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 13 00:34:14.646737 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 13 00:34:14.646833 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 13 00:34:14.647665 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 13 00:34:14.647881 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 13 00:34:14.648368 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 13 00:34:14.648400 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 13 00:34:14.648971 systemd[1]: ignition-fetch.service: Deactivated successfully.
Aug 13 00:34:14.649004 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Aug 13 00:34:14.649895 systemd[1]: Stopped target network.target - Network.
Aug 13 00:34:14.650819 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 13 00:34:14.650858 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 13 00:34:14.651758 systemd[1]: Stopped target paths.target - Path Units.
Aug 13 00:34:14.652582 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 13 00:34:14.656025 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:34:14.656720 systemd[1]: Stopped target slices.target - Slice Units.
Aug 13 00:34:14.657552 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 13 00:34:14.658570 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 13 00:34:14.658600 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 13 00:34:14.659917 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 13 00:34:14.659942 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 13 00:34:14.660980 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 13 00:34:14.661021 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 13 00:34:14.662089 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 13 00:34:14.662121 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 13 00:34:14.663345 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 13 00:34:14.664305 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 13 00:34:14.665753 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 13 00:34:14.665832 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 13 00:34:14.666877 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 13 00:34:14.666936 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 13 00:34:14.672490 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 13 00:34:14.672636 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 13 00:34:14.675068 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Aug 13 00:34:14.675233 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 13 00:34:14.675263 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:34:14.676663 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Aug 13 00:34:14.677628 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 13 00:34:14.677913 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 13 00:34:14.680134 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Aug 13 00:34:14.680353 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Aug 13 00:34:14.681411 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 13 00:34:14.681437 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:34:14.683152 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 13 00:34:14.685857 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 13 00:34:14.685894 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 13 00:34:14.686587 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 13 00:34:14.686617 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:34:14.687572 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 13 00:34:14.687612 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:34:14.688517 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:34:14.692179 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Aug 13 00:34:14.700104 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 13 00:34:14.708850 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:34:14.710101 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 13 00:34:14.710170 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:34:14.711501 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 13 00:34:14.711545 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:34:14.712070 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 13 00:34:14.712128 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 13 00:34:14.713844 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 13 00:34:14.713881 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 13 00:34:14.715158 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 13 00:34:14.715196 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 13 00:34:14.717783 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 13 00:34:14.718475 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Aug 13 00:34:14.718532 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:34:14.721238 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 13 00:34:14.721276 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:34:14.722831 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Aug 13 00:34:14.722865 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 13 00:34:14.728412 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 13 00:34:14.728513 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:34:14.729458 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 13 00:34:14.729517 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 13 00:34:14.731894 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 13 00:34:14.731991 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 13 00:34:14.736199 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 13 00:34:14.736262 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 13 00:34:14.738014 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 13 00:34:14.740761 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 13 00:34:14.775255 systemd[1]: Switching root.
Aug 13 00:34:14.812718 systemd-journald[217]: Received SIGTERM from PID 1 (systemd).
Aug 13 00:34:14.812789 systemd-journald[217]: Journal stopped
Aug 13 00:34:15.689025 kernel: SELinux: policy capability network_peer_controls=1
Aug 13 00:34:15.689069 kernel: SELinux: policy capability open_perms=1
Aug 13 00:34:15.689080 kernel: SELinux: policy capability extended_socket_class=1
Aug 13 00:34:15.689090 kernel: SELinux: policy capability always_check_network=0
Aug 13 00:34:15.689099 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 13 00:34:15.689107 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 13 00:34:15.689115 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 13 00:34:15.689122 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 13 00:34:15.689129 kernel: SELinux: policy capability userspace_initial_context=0
Aug 13 00:34:15.689138 kernel: audit: type=1403 audit(1755045254.959:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 13 00:34:15.689151 systemd[1]: Successfully loaded SELinux policy in 48.472ms.
Aug 13 00:34:15.689168 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.300ms.
Aug 13 00:34:15.689178 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 13 00:34:15.689187 systemd[1]: Detected virtualization kvm.
Aug 13 00:34:15.689195 systemd[1]: Detected architecture x86-64.
Aug 13 00:34:15.689203 systemd[1]: Detected first boot.
Aug 13 00:34:15.689212 systemd[1]: Hostname set to .
Aug 13 00:34:15.689220 systemd[1]: Initializing machine ID from VM UUID.
Aug 13 00:34:15.689228 zram_generator::config[1126]: No configuration found.
Aug 13 00:34:15.689237 kernel: Guest personality initialized and is inactive
Aug 13 00:34:15.689246 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Aug 13 00:34:15.689253 kernel: Initialized host personality
Aug 13 00:34:15.689261 kernel: NET: Registered PF_VSOCK protocol family
Aug 13 00:34:15.689268 systemd[1]: Populated /etc with preset unit settings.
Aug 13 00:34:15.689277 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Aug 13 00:34:15.689285 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Aug 13 00:34:15.689293 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Aug 13 00:34:15.689301 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Aug 13 00:34:15.689309 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Aug 13 00:34:15.689319 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Aug 13 00:34:15.689327 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Aug 13 00:34:15.689335 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Aug 13 00:34:15.689344 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Aug 13 00:34:15.689352 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Aug 13 00:34:15.689360 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Aug 13 00:34:15.689368 systemd[1]: Created slice user.slice - User and Session Slice.
Aug 13 00:34:15.689376 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 13 00:34:15.689385 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 13 00:34:15.689394 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Aug 13 00:34:15.689402 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Aug 13 00:34:15.691736 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Aug 13 00:34:15.691752 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 13 00:34:15.691768 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Aug 13 00:34:15.691776 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 13 00:34:15.691785 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 13 00:34:15.691806 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Aug 13 00:34:15.691815 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Aug 13 00:34:15.691823 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Aug 13 00:34:15.691832 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Aug 13 00:34:15.691840 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 13 00:34:15.691850 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 13 00:34:15.691858 systemd[1]: Reached target slices.target - Slice Units.
Aug 13 00:34:15.691866 systemd[1]: Reached target swap.target - Swaps.
Aug 13 00:34:15.691874 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Aug 13 00:34:15.691883 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Aug 13 00:34:15.691891 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Aug 13 00:34:15.691900 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 13 00:34:15.691908 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 13 00:34:15.691917 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 13 00:34:15.691926 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Aug 13 00:34:15.691934 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Aug 13 00:34:15.691942 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Aug 13 00:34:15.691951 systemd[1]: Mounting media.mount - External Media Directory...
Aug 13 00:34:15.691961 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:15.691969 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Aug 13 00:34:15.691977 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Aug 13 00:34:15.691986 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Aug 13 00:34:15.691994 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Aug 13 00:34:15.692004 systemd[1]: Reached target machines.target - Containers.
Aug 13 00:34:15.692012 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Aug 13 00:34:15.692020 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:34:15.692028 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 13 00:34:15.692036 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Aug 13 00:34:15.692044 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:34:15.692052 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:34:15.692061 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:34:15.692070 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Aug 13 00:34:15.692079 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:34:15.692087 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Aug 13 00:34:15.692095 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Aug 13 00:34:15.692103 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Aug 13 00:34:15.692111 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Aug 13 00:34:15.692119 systemd[1]: Stopped systemd-fsck-usr.service.
Aug 13 00:34:15.692128 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:34:15.692138 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 13 00:34:15.692147 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 13 00:34:15.692155 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 13 00:34:15.692163 kernel: loop: module loaded
Aug 13 00:34:15.692171 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Aug 13 00:34:15.692179 kernel: fuse: init (API version 7.41)
Aug 13 00:34:15.692188 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Aug 13 00:34:15.692214 systemd-journald[1217]: Collecting audit messages is disabled.
Aug 13 00:34:15.692236 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 13 00:34:15.692247 systemd[1]: verity-setup.service: Deactivated successfully.
Aug 13 00:34:15.692256 systemd-journald[1217]: Journal started
Aug 13 00:34:15.692273 systemd-journald[1217]: Runtime Journal (/run/log/journal/344cc42c351c4fb4add447db996adf12) is 4.8M, max 38.6M, 33.7M free.
Aug 13 00:34:15.413642 systemd[1]: Queued start job for default target multi-user.target.
Aug 13 00:34:15.427487 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Aug 13 00:34:15.428043 systemd[1]: systemd-journald.service: Deactivated successfully.
Aug 13 00:34:15.694201 systemd[1]: Stopped verity-setup.service.
Aug 13 00:34:15.698837 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:15.713157 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 13 00:34:15.706235 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Aug 13 00:34:15.706765 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Aug 13 00:34:15.708606 systemd[1]: Mounted media.mount - External Media Directory.
Aug 13 00:34:15.709115 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Aug 13 00:34:15.709602 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Aug 13 00:34:15.710133 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Aug 13 00:34:15.711463 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Aug 13 00:34:15.712226 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 13 00:34:15.713016 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Aug 13 00:34:15.713548 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Aug 13 00:34:15.714397 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:34:15.714501 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:34:15.715836 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:34:15.716083 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:34:15.717607 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Aug 13 00:34:15.720231 kernel: ACPI: bus type drm_connector registered
Aug 13 00:34:15.719764 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Aug 13 00:34:15.721149 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 00:34:15.721332 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 00:34:15.722134 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:34:15.722300 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:34:15.723242 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 13 00:34:15.724000 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 13 00:34:15.724681 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Aug 13 00:34:15.725471 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Aug 13 00:34:15.734366 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 13 00:34:15.736774 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Aug 13 00:34:15.739864 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Aug 13 00:34:15.740463 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Aug 13 00:34:15.740563 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 13 00:34:15.742906 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Aug 13 00:34:15.751839 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Aug 13 00:34:15.753873 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:34:15.755735 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Aug 13 00:34:15.758111 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Aug 13 00:34:15.758814 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:34:15.759781 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Aug 13 00:34:15.762045 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 00:34:15.763162 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 13 00:34:15.766117 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Aug 13 00:34:15.770922 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 13 00:34:15.774233 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Aug 13 00:34:15.775865 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Aug 13 00:34:15.779580 systemd-journald[1217]: Time spent on flushing to /var/log/journal/344cc42c351c4fb4add447db996adf12 is 56.978ms for 1157 entries.
Aug 13 00:34:15.779580 systemd-journald[1217]: System Journal (/var/log/journal/344cc42c351c4fb4add447db996adf12) is 8M, max 584.8M, 576.8M free.
Aug 13 00:34:15.853375 systemd-journald[1217]: Received client request to flush runtime journal.
Aug 13 00:34:15.853415 kernel: loop0: detected capacity change from 0 to 229808
Aug 13 00:34:15.788002 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Aug 13 00:34:15.789940 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Aug 13 00:34:15.800273 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Aug 13 00:34:15.839822 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 13 00:34:15.844340 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 13 00:34:15.846981 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Aug 13 00:34:15.855651 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Aug 13 00:34:15.859901 systemd-tmpfiles[1253]: ACLs are not supported, ignoring.
Aug 13 00:34:15.859910 systemd-tmpfiles[1253]: ACLs are not supported, ignoring.
Aug 13 00:34:15.872604 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Aug 13 00:34:15.871225 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 13 00:34:15.873056 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Aug 13 00:34:15.897730 kernel: loop1: detected capacity change from 0 to 146240
Aug 13 00:34:15.916272 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Aug 13 00:34:15.920836 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 13 00:34:15.928050 kernel: loop2: detected capacity change from 0 to 113872
Aug 13 00:34:15.947877 systemd-tmpfiles[1273]: ACLs are not supported, ignoring.
Aug 13 00:34:15.948131 systemd-tmpfiles[1273]: ACLs are not supported, ignoring.
Aug 13 00:34:15.951125 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 13 00:34:15.969970 kernel: loop3: detected capacity change from 0 to 8
Aug 13 00:34:15.985000 kernel: loop4: detected capacity change from 0 to 229808
Aug 13 00:34:16.007746 kernel: loop5: detected capacity change from 0 to 146240
Aug 13 00:34:16.026735 kernel: loop6: detected capacity change from 0 to 113872
Aug 13 00:34:16.046866 kernel: loop7: detected capacity change from 0 to 8
Aug 13 00:34:16.047828 (sd-merge)[1278]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Aug 13 00:34:16.048395 (sd-merge)[1278]: Merged extensions into '/usr'.
Aug 13 00:34:16.052848 systemd[1]: Reload requested from client PID 1252 ('systemd-sysext') (unit systemd-sysext.service)...
Aug 13 00:34:16.052936 systemd[1]: Reloading...
Aug 13 00:34:16.101723 zram_generator::config[1300]: No configuration found.
Aug 13 00:34:16.191660 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:34:16.248578 ldconfig[1247]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Aug 13 00:34:16.268716 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Aug 13 00:34:16.268812 systemd[1]: Reloading finished in 215 ms.
Aug 13 00:34:16.299392 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Aug 13 00:34:16.300361 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Aug 13 00:34:16.312840 systemd[1]: Starting ensure-sysext.service...
Aug 13 00:34:16.314915 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 13 00:34:16.331673 systemd[1]: Reload requested from client PID 1347 ('systemctl') (unit ensure-sysext.service)...
Aug 13 00:34:16.331820 systemd[1]: Reloading...
Aug 13 00:34:16.339466 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Aug 13 00:34:16.339494 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Aug 13 00:34:16.339673 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 13 00:34:16.341298 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 13 00:34:16.341922 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 13 00:34:16.342420 systemd-tmpfiles[1348]: ACLs are not supported, ignoring.
Aug 13 00:34:16.342937 systemd-tmpfiles[1348]: ACLs are not supported, ignoring.
Aug 13 00:34:16.347474 systemd-tmpfiles[1348]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:34:16.347484 systemd-tmpfiles[1348]: Skipping /boot
Aug 13 00:34:16.355377 systemd-tmpfiles[1348]: Detected autofs mount point /boot during canonicalization of boot.
Aug 13 00:34:16.355486 systemd-tmpfiles[1348]: Skipping /boot
Aug 13 00:34:16.390721 zram_generator::config[1374]: No configuration found.
Aug 13 00:34:16.456289 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:34:16.520912 systemd[1]: Reloading finished in 188 ms.
Aug 13 00:34:16.539058 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Aug 13 00:34:16.543414 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 13 00:34:16.549815 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Aug 13 00:34:16.552314 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Aug 13 00:34:16.554814 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Aug 13 00:34:16.557245 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 13 00:34:16.560865 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 13 00:34:16.564880 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Aug 13 00:34:16.572308 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:16.572438 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:34:16.573574 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Aug 13 00:34:16.577285 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Aug 13 00:34:16.578482 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Aug 13 00:34:16.579508 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:34:16.579593 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:34:16.582813 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Aug 13 00:34:16.583376 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:16.585547 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:16.585909 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:34:16.586118 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:34:16.586234 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:34:16.586360 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:16.590772 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:16.590957 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Aug 13 00:34:16.592040 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Aug 13 00:34:16.592851 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Aug 13 00:34:16.592927 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Aug 13 00:34:16.593029 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Aug 13 00:34:16.597731 systemd[1]: Finished ensure-sysext.service.
Aug 13 00:34:16.601993 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Aug 13 00:34:16.613066 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Aug 13 00:34:16.622936 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Aug 13 00:34:16.624929 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Aug 13 00:34:16.635432 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Aug 13 00:34:16.638034 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Aug 13 00:34:16.640122 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Aug 13 00:34:16.640255 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Aug 13 00:34:16.641028 systemd[1]: modprobe@loop.service: Deactivated successfully.
Aug 13 00:34:16.641147 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Aug 13 00:34:16.641909 systemd[1]: modprobe@drm.service: Deactivated successfully.
Aug 13 00:34:16.642304 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Aug 13 00:34:16.648230 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Aug 13 00:34:16.648283 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Aug 13 00:34:16.650889 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Aug 13 00:34:16.657643 systemd-udevd[1424]: Using default interface naming scheme 'v255'.
Aug 13 00:34:16.667939 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Aug 13 00:34:16.669014 augenrules[1462]: No rules
Aug 13 00:34:16.671872 systemd[1]: audit-rules.service: Deactivated successfully.
Aug 13 00:34:16.672037 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Aug 13 00:34:16.681754 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Aug 13 00:34:16.682485 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Aug 13 00:34:16.695062 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 13 00:34:16.702049 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 13 00:34:16.765676 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Aug 13 00:34:16.776900 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Aug 13 00:34:16.777911 systemd[1]: Reached target time-set.target - System Time Set.
Aug 13 00:34:16.820600 systemd-resolved[1423]: Positive Trust Anchors:
Aug 13 00:34:16.820618 systemd-resolved[1423]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 13 00:34:16.820645 systemd-resolved[1423]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 13 00:34:16.826723 kernel: mousedev: PS/2 mouse device common for all mice
Aug 13 00:34:16.826582 systemd-resolved[1423]: Using system hostname 'ci-4372-1-0-b-5ba4a9a74b'.
Aug 13 00:34:16.827880 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 13 00:34:16.828395 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 13 00:34:16.829208 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 13 00:34:16.829857 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Aug 13 00:34:16.831226 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Aug 13 00:34:16.831683 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Aug 13 00:34:16.832254 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Aug 13 00:34:16.832835 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Aug 13 00:34:16.833319 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Aug 13 00:34:16.834287 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Aug 13 00:34:16.834310 systemd[1]: Reached target paths.target - Path Units.
Aug 13 00:34:16.834762 systemd[1]: Reached target timers.target - Timer Units.
Aug 13 00:34:16.836440 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Aug 13 00:34:16.838105 systemd[1]: Starting docker.socket - Docker Socket for the API...
Aug 13 00:34:16.841670 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Aug 13 00:34:16.842271 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Aug 13 00:34:16.842782 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Aug 13 00:34:16.845848 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Aug 13 00:34:16.852032 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Aug 13 00:34:16.848069 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Aug 13 00:34:16.853901 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Aug 13 00:34:16.856382 systemd[1]: Reached target sockets.target - Socket Units.
Aug 13 00:34:16.856828 systemd[1]: Reached target basic.target - Basic System.
Aug 13 00:34:16.857259 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Aug 13 00:34:16.857286 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Aug 13 00:34:16.861197 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Aug 13 00:34:16.863725 kernel: ACPI: button: Power Button [PWRF]
Aug 13 00:34:16.862381 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Aug 13 00:34:16.867036 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Aug 13 00:34:16.873377 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Aug 13 00:34:16.876775 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Aug 13 00:34:16.877240 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Aug 13 00:34:16.879742 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Aug 13 00:34:16.888618 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Aug 13 00:34:16.894675 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Aug 13 00:34:16.897600 jq[1519]: false
Aug 13 00:34:16.903666 systemd-networkd[1481]: lo: Link UP
Aug 13 00:34:16.903680 systemd-networkd[1481]: lo: Gained carrier
Aug 13 00:34:16.908923 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Aug 13 00:34:16.912146 systemd-networkd[1481]: Enumeration completed
Aug 13 00:34:16.912627 systemd-networkd[1481]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:34:16.912637 systemd-networkd[1481]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 13 00:34:16.913298 systemd-networkd[1481]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:34:16.913305 systemd-networkd[1481]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 13 00:34:16.914328 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Aug 13 00:34:16.915791 systemd-networkd[1481]: eth0: Link UP
Aug 13 00:34:16.919120 extend-filesystems[1522]: Found /dev/sda6
Aug 13 00:34:16.919275 systemd-networkd[1481]: eth0: Gained carrier
Aug 13 00:34:16.919291 systemd-networkd[1481]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 13 00:34:16.919986 systemd[1]: Starting systemd-logind.service - User Login Management...
Aug 13 00:34:16.921409 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Aug 13 00:34:16.922418 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Aug 13 00:34:16.922869 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Aug 13 00:34:16.923027 oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Aug 13 00:34:16.924339 extend-filesystems[1522]: Found /dev/sda9
Aug 13 00:34:16.925463 coreos-metadata[1516]: Aug 13 00:34:16.924 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Aug 13 00:34:16.925463 coreos-metadata[1516]: Aug 13 00:34:16.924 INFO Failed to fetch: error sending request for url (http://169.254.169.254/hetzner/v1/metadata)
Aug 13 00:34:16.925927 systemd[1]: Starting update-engine.service - Update Engine...
Aug 13 00:34:16.927363 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting users, quitting
Aug 13 00:34:16.930578 extend-filesystems[1522]: Checking size of /dev/sda9
Aug 13 00:34:16.931552 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Aug 13 00:34:16.931552 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing group entry cache
Aug 13 00:34:16.931552 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting groups, quitting
Aug 13 00:34:16.931552 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Aug 13 00:34:16.929062 oslogin_cache_refresh[1523]: Failure getting users, quitting Aug 13 00:34:16.929010 systemd-networkd[1481]: eth1: Link UP Aug 13 00:34:16.929079 oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Aug 13 00:34:16.929109 oslogin_cache_refresh[1523]: Refreshing group entry cache Aug 13 00:34:16.931937 systemd-networkd[1481]: eth1: Gained carrier Aug 13 00:34:16.929463 oslogin_cache_refresh[1523]: Failure getting groups, quitting Aug 13 00:34:16.931959 systemd-networkd[1481]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 13 00:34:16.929469 oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Aug 13 00:34:16.932195 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 13 00:34:16.933730 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 13 00:34:16.935489 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 13 00:34:16.936985 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 13 00:34:16.937160 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 13 00:34:16.937398 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Aug 13 00:34:16.937532 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Aug 13 00:34:16.945492 systemd[1]: motdgen.service: Deactivated successfully. Aug 13 00:34:16.946858 extend-filesystems[1522]: Resized partition /dev/sda9 Aug 13 00:34:16.946875 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Aug 13 00:34:16.949185 extend-filesystems[1548]: resize2fs 1.47.2 (1-Jan-2025) Aug 13 00:34:16.955717 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Aug 13 00:34:16.954383 systemd[1]: Reached target network.target - Network. Aug 13 00:34:16.960488 systemd[1]: Starting containerd.service - containerd container runtime... Aug 13 00:34:16.969640 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 13 00:34:16.971576 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 13 00:34:16.973113 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 13 00:34:16.973276 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 13 00:34:16.980028 jq[1538]: true Aug 13 00:34:16.979769 systemd-networkd[1481]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Aug 13 00:34:16.980341 systemd-timesyncd[1438]: Network configuration changed, trying to establish connection. Aug 13 00:34:16.998823 update_engine[1535]: I20250813 00:34:16.992688 1535 main.cc:92] Flatcar Update Engine starting Aug 13 00:34:16.997056 systemd-networkd[1481]: eth0: DHCPv4 address 46.62.157.78/32, gateway 172.31.1.1 acquired from 172.31.1.1 Aug 13 00:34:17.044086 tar[1545]: linux-amd64/LICENSE Aug 13 00:34:17.044086 tar[1545]: linux-amd64/helm Aug 13 00:34:17.001284 systemd-timesyncd[1438]: Network configuration changed, trying to establish connection. Aug 13 00:34:17.023330 dbus-daemon[1517]: [system] SELinux support is enabled Aug 13 00:34:17.022966 (ntainerd)[1566]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 13 00:34:17.023470 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
Aug 13 00:34:17.046098 jq[1559]: true Aug 13 00:34:17.031423 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 13 00:34:17.031446 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 13 00:34:17.033175 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 13 00:34:17.052139 update_engine[1535]: I20250813 00:34:17.048275 1535 update_check_scheduler.cc:74] Next update check in 5m42s Aug 13 00:34:17.033190 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Aug 13 00:34:17.042751 systemd[1]: Started update-engine.service - Update Engine. Aug 13 00:34:17.084889 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 13 00:34:17.086077 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 13 00:34:17.143315 bash[1586]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:34:17.145374 systemd-logind[1533]: New seat seat0. Aug 13 00:34:17.145969 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 13 00:34:17.151147 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Aug 13 00:34:17.150389 systemd[1]: Starting sshkeys.service... Aug 13 00:34:17.156151 systemd[1]: Started systemd-logind.service - User Login Management. Aug 13 00:34:17.182729 extend-filesystems[1548]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Aug 13 00:34:17.182729 extend-filesystems[1548]: old_desc_blocks = 1, new_desc_blocks = 5 Aug 13 00:34:17.182729 extend-filesystems[1548]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. 
Aug 13 00:34:17.193032 extend-filesystems[1522]: Resized filesystem in /dev/sda9 Aug 13 00:34:17.184421 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 13 00:34:17.184586 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 13 00:34:17.195372 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Aug 13 00:34:17.199312 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Aug 13 00:34:17.252824 sshd_keygen[1564]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 13 00:34:17.302212 coreos-metadata[1596]: Aug 13 00:34:17.301 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Aug 13 00:34:17.305662 coreos-metadata[1596]: Aug 13 00:34:17.305 INFO Fetch successful Aug 13 00:34:17.308039 containerd[1566]: time="2025-08-13T00:34:17Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 13 00:34:17.307539 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Aug 13 00:34:17.309225 containerd[1566]: time="2025-08-13T00:34:17.308907542Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Aug 13 00:34:17.309901 unknown[1596]: wrote ssh authorized keys file for user: core Aug 13 00:34:17.318583 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326187327Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.529µs" Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326216782Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326233524Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326352146Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326365020Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326385228Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326430583Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326439760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326604209Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326617634Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326625869Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 13 00:34:17.326909 containerd[1566]: time="2025-08-13T00:34:17.326631821Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 13 00:34:17.327110 containerd[1566]: time="2025-08-13T00:34:17.326687305Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 13 00:34:17.329774 containerd[1566]: time="2025-08-13T00:34:17.329352334Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 13 00:34:17.329774 containerd[1566]: time="2025-08-13T00:34:17.329383102Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 13 00:34:17.329774 containerd[1566]: time="2025-08-13T00:34:17.329392790Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 13 00:34:17.331123 containerd[1566]: time="2025-08-13T00:34:17.330752491Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 13 00:34:17.332529 containerd[1566]: time="2025-08-13T00:34:17.332216357Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 13 00:34:17.332529 containerd[1566]: time="2025-08-13T00:34:17.332273504Z" level=info msg="metadata content store policy set" policy=shared Aug 13 00:34:17.338281 containerd[1566]: time="2025-08-13T00:34:17.338147954Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler 
type=io.containerd.gc.v1 Aug 13 00:34:17.338281 containerd[1566]: time="2025-08-13T00:34:17.338195644Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 13 00:34:17.338281 containerd[1566]: time="2025-08-13T00:34:17.338211624Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 13 00:34:17.338281 containerd[1566]: time="2025-08-13T00:34:17.338221622Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 13 00:34:17.338281 containerd[1566]: time="2025-08-13T00:34:17.338232332Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338397392Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338419544Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338429742Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338438179Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338446144Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338453217Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338463756Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task 
type=io.containerd.runtime.v2 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338549027Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338566930Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338579423Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338593640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338604581Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338613848Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338623126Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 13 00:34:17.339025 containerd[1566]: time="2025-08-13T00:34:17.338631240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 13 00:34:17.339267 containerd[1566]: time="2025-08-13T00:34:17.338653442Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 13 00:34:17.339267 containerd[1566]: time="2025-08-13T00:34:17.338662229Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 13 00:34:17.339267 containerd[1566]: time="2025-08-13T00:34:17.338670414Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 13 00:34:17.339363 containerd[1566]: 
time="2025-08-13T00:34:17.339347675Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 13 00:34:17.340137 containerd[1566]: time="2025-08-13T00:34:17.339843054Z" level=info msg="Start snapshots syncer" Aug 13 00:34:17.340137 containerd[1566]: time="2025-08-13T00:34:17.340082744Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 13 00:34:17.343932 containerd[1566]: time="2025-08-13T00:34:17.343122856Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\
":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 13 00:34:17.343932 containerd[1566]: time="2025-08-13T00:34:17.343165667Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 13 00:34:17.341016 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 13 00:34:17.346763 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Aug 13 00:34:17.347859 update-ssh-keys[1611]: Updated "/home/core/.ssh/authorized_keys" Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348148013Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348255575Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348277066Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348287795Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348296191Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348305559Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348315748Z" level=info 
msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348325216Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348348279Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348357176Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 13 00:34:17.351435 containerd[1566]: time="2025-08-13T00:34:17.348365191Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 13 00:34:17.348580 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 13 00:34:17.350325 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354474782Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354499728Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354509537Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354753274Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354767251Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354790584Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354817996Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354841249Z" level=info msg="runtime interface created" Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354847231Z" level=info msg="created NRI interface" Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354889690Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354911601Z" level=info msg="Connect containerd service" Aug 13 00:34:17.355460 containerd[1566]: time="2025-08-13T00:34:17.354938211Z" level=info 
msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 13 00:34:17.355112 systemd[1]: Finished sshkeys.service. Aug 13 00:34:17.362386 containerd[1566]: time="2025-08-13T00:34:17.360092660Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 13 00:34:17.364318 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 13 00:34:17.368163 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Aug 13 00:34:17.395202 locksmithd[1571]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 13 00:34:17.400397 systemd[1]: issuegen.service: Deactivated successfully. Aug 13 00:34:17.400559 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 13 00:34:17.404831 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 13 00:34:17.445621 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 13 00:34:17.451055 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 13 00:34:17.454042 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Aug 13 00:34:17.454593 systemd[1]: Reached target getty.target - Login Prompts. 
Aug 13 00:34:17.489816 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Aug 13 00:34:17.490077 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Aug 13 00:34:17.490189 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Aug 13 00:34:17.492968 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Aug 13 00:34:17.495306 kernel: Console: switching to colour dummy device 80x25 Aug 13 00:34:17.495347 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Aug 13 00:34:17.495359 kernel: [drm] features: -context_init Aug 13 00:34:17.497014 kernel: [drm] number of scanouts: 1 Aug 13 00:34:17.497980 kernel: [drm] number of cap sets: 0 Aug 13 00:34:17.501009 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Aug 13 00:34:17.509136 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:34:17.514811 kernel: EDAC MC: Ver: 3.0.0 Aug 13 00:34:17.518736 containerd[1566]: time="2025-08-13T00:34:17.518686590Z" level=info msg="Start subscribing containerd event" Aug 13 00:34:17.518887 containerd[1566]: time="2025-08-13T00:34:17.518860686Z" level=info msg="Start recovering state" Aug 13 00:34:17.518990 containerd[1566]: time="2025-08-13T00:34:17.518978969Z" level=info msg="Start event monitor" Aug 13 00:34:17.519467 containerd[1566]: time="2025-08-13T00:34:17.519453679Z" level=info msg="Start cni network conf syncer for default" Aug 13 00:34:17.519514 containerd[1566]: time="2025-08-13T00:34:17.519505175Z" level=info msg="Start streaming server" Aug 13 00:34:17.519576 containerd[1566]: time="2025-08-13T00:34:17.519565528Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 13 00:34:17.519616 containerd[1566]: time="2025-08-13T00:34:17.519608178Z" level=info msg="runtime interface starting up..." Aug 13 00:34:17.519648 containerd[1566]: time="2025-08-13T00:34:17.519641230Z" level=info msg="starting plugins..." 
Aug 13 00:34:17.519689 containerd[1566]: time="2025-08-13T00:34:17.519680885Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 13 00:34:17.521623 containerd[1566]: time="2025-08-13T00:34:17.521429074Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 13 00:34:17.521623 containerd[1566]: time="2025-08-13T00:34:17.521498425Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 13 00:34:17.521758 systemd[1]: Started containerd.service - containerd container runtime. Aug 13 00:34:17.522250 containerd[1566]: time="2025-08-13T00:34:17.522155948Z" level=info msg="containerd successfully booted in 0.215398s" Aug 13 00:34:17.569239 systemd-logind[1533]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Aug 13 00:34:17.606778 systemd-logind[1533]: Watching system buttons on /dev/input/event3 (Power Button) Aug 13 00:34:17.623201 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:34:17.640290 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 13 00:34:17.640427 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:34:17.640758 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:34:17.641916 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 13 00:34:17.644217 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 13 00:34:17.689559 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 13 00:34:17.849938 tar[1545]: linux-amd64/README.md Aug 13 00:34:17.864649 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Aug 13 00:34:17.925049 coreos-metadata[1516]: Aug 13 00:34:17.924 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #2 Aug 13 00:34:17.925766 coreos-metadata[1516]: Aug 13 00:34:17.925 INFO Fetch successful Aug 13 00:34:17.926106 coreos-metadata[1516]: Aug 13 00:34:17.926 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Aug 13 00:34:17.926384 coreos-metadata[1516]: Aug 13 00:34:17.926 INFO Fetch successful Aug 13 00:34:17.975283 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Aug 13 00:34:17.976204 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 13 00:34:18.386943 systemd-networkd[1481]: eth1: Gained IPv6LL Aug 13 00:34:18.387534 systemd-timesyncd[1438]: Network configuration changed, trying to establish connection. Aug 13 00:34:18.389674 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 13 00:34:18.391024 systemd[1]: Reached target network-online.target - Network is Online. Aug 13 00:34:18.393509 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:18.396920 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 13 00:34:18.424644 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Aug 13 00:34:18.707013 systemd-networkd[1481]: eth0: Gained IPv6LL Aug 13 00:34:18.707678 systemd-timesyncd[1438]: Network configuration changed, trying to establish connection. Aug 13 00:34:19.457549 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:19.458032 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 13 00:34:19.458899 systemd[1]: Startup finished in 3.039s (kernel) + 5.319s (initrd) + 4.546s (userspace) = 12.906s. 
Aug 13 00:34:19.464099 (kubelet)[1706]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:34:20.069767 kubelet[1706]: E0813 00:34:20.069719 1706 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:34:20.072294 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:34:20.072430 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:34:20.072964 systemd[1]: kubelet.service: Consumed 1.084s CPU time, 269.5M memory peak. Aug 13 00:34:26.477637 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 13 00:34:26.478722 systemd[1]: Started sshd@0-46.62.157.78:22-139.178.89.65:43834.service - OpenSSH per-connection server daemon (139.178.89.65:43834). Aug 13 00:34:27.460549 sshd[1718]: Accepted publickey for core from 139.178.89.65 port 43834 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:34:27.461955 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:27.468190 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 13 00:34:27.469324 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 13 00:34:27.477358 systemd-logind[1533]: New session 1 of user core. Aug 13 00:34:27.488270 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 13 00:34:27.490525 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Aug 13 00:34:27.501770 (systemd)[1722]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 13 00:34:27.504073 systemd-logind[1533]: New session c1 of user core. Aug 13 00:34:27.633215 systemd[1722]: Queued start job for default target default.target. Aug 13 00:34:27.643404 systemd[1722]: Created slice app.slice - User Application Slice. Aug 13 00:34:27.643426 systemd[1722]: Reached target paths.target - Paths. Aug 13 00:34:27.643459 systemd[1722]: Reached target timers.target - Timers. Aug 13 00:34:27.644416 systemd[1722]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 13 00:34:27.653084 systemd[1722]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 13 00:34:27.653121 systemd[1722]: Reached target sockets.target - Sockets. Aug 13 00:34:27.653149 systemd[1722]: Reached target basic.target - Basic System. Aug 13 00:34:27.653174 systemd[1722]: Reached target default.target - Main User Target. Aug 13 00:34:27.653192 systemd[1722]: Startup finished in 143ms. Aug 13 00:34:27.653287 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 13 00:34:27.660888 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 13 00:34:28.337803 systemd[1]: Started sshd@1-46.62.157.78:22-139.178.89.65:43850.service - OpenSSH per-connection server daemon (139.178.89.65:43850). Aug 13 00:34:29.331641 sshd[1733]: Accepted publickey for core from 139.178.89.65 port 43850 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:34:29.333487 sshd-session[1733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:29.342573 systemd-logind[1533]: New session 2 of user core. Aug 13 00:34:29.345910 systemd[1]: Started session-2.scope - Session 2 of User core. 
Aug 13 00:34:30.005257 sshd[1735]: Connection closed by 139.178.89.65 port 43850 Aug 13 00:34:30.005910 sshd-session[1733]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:30.009610 systemd-logind[1533]: Session 2 logged out. Waiting for processes to exit. Aug 13 00:34:30.009692 systemd[1]: sshd@1-46.62.157.78:22-139.178.89.65:43850.service: Deactivated successfully. Aug 13 00:34:30.011277 systemd[1]: session-2.scope: Deactivated successfully. Aug 13 00:34:30.012492 systemd-logind[1533]: Removed session 2. Aug 13 00:34:30.176434 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 13 00:34:30.177765 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:30.179893 systemd[1]: Started sshd@2-46.62.157.78:22-139.178.89.65:57822.service - OpenSSH per-connection server daemon (139.178.89.65:57822). Aug 13 00:34:30.286633 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:30.290953 (kubelet)[1751]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:34:30.320667 kubelet[1751]: E0813 00:34:30.320607 1751 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:34:30.324268 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:34:30.324398 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:34:30.324643 systemd[1]: kubelet.service: Consumed 121ms CPU time, 110.3M memory peak. 
Aug 13 00:34:31.153429 sshd[1742]: Accepted publickey for core from 139.178.89.65 port 57822 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:34:31.154563 sshd-session[1742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:31.159668 systemd-logind[1533]: New session 3 of user core. Aug 13 00:34:31.168814 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 13 00:34:31.821808 sshd[1759]: Connection closed by 139.178.89.65 port 57822 Aug 13 00:34:31.822341 sshd-session[1742]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:31.824965 systemd[1]: sshd@2-46.62.157.78:22-139.178.89.65:57822.service: Deactivated successfully. Aug 13 00:34:31.826536 systemd[1]: session-3.scope: Deactivated successfully. Aug 13 00:34:31.827655 systemd-logind[1533]: Session 3 logged out. Waiting for processes to exit. Aug 13 00:34:31.829053 systemd-logind[1533]: Removed session 3. Aug 13 00:34:31.989682 systemd[1]: Started sshd@3-46.62.157.78:22-139.178.89.65:57824.service - OpenSSH per-connection server daemon (139.178.89.65:57824). Aug 13 00:34:32.975067 sshd[1765]: Accepted publickey for core from 139.178.89.65 port 57824 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:34:32.976346 sshd-session[1765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:32.981464 systemd-logind[1533]: New session 4 of user core. Aug 13 00:34:32.987885 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 13 00:34:33.651818 sshd[1767]: Connection closed by 139.178.89.65 port 57824 Aug 13 00:34:33.652395 sshd-session[1765]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:33.655465 systemd[1]: sshd@3-46.62.157.78:22-139.178.89.65:57824.service: Deactivated successfully. Aug 13 00:34:33.657340 systemd[1]: session-4.scope: Deactivated successfully. Aug 13 00:34:33.659076 systemd-logind[1533]: Session 4 logged out. 
Waiting for processes to exit. Aug 13 00:34:33.660217 systemd-logind[1533]: Removed session 4. Aug 13 00:34:33.819490 systemd[1]: Started sshd@4-46.62.157.78:22-139.178.89.65:57828.service - OpenSSH per-connection server daemon (139.178.89.65:57828). Aug 13 00:34:34.798724 sshd[1773]: Accepted publickey for core from 139.178.89.65 port 57828 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:34:34.800369 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:34.805056 systemd-logind[1533]: New session 5 of user core. Aug 13 00:34:34.811893 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 13 00:34:35.322949 sudo[1776]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 13 00:34:35.323194 sudo[1776]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:35.346494 sudo[1776]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:35.504058 sshd[1775]: Connection closed by 139.178.89.65 port 57828 Aug 13 00:34:35.504882 sshd-session[1773]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:35.508036 systemd[1]: sshd@4-46.62.157.78:22-139.178.89.65:57828.service: Deactivated successfully. Aug 13 00:34:35.509857 systemd[1]: session-5.scope: Deactivated successfully. Aug 13 00:34:35.510943 systemd-logind[1533]: Session 5 logged out. Waiting for processes to exit. Aug 13 00:34:35.512686 systemd-logind[1533]: Removed session 5. Aug 13 00:34:35.670305 systemd[1]: Started sshd@5-46.62.157.78:22-139.178.89.65:57844.service - OpenSSH per-connection server daemon (139.178.89.65:57844). 
Aug 13 00:34:36.644595 sshd[1782]: Accepted publickey for core from 139.178.89.65 port 57844 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:34:36.646101 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:36.651589 systemd-logind[1533]: New session 6 of user core. Aug 13 00:34:36.657870 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 13 00:34:37.157995 sudo[1786]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 13 00:34:37.158238 sudo[1786]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:37.162829 sudo[1786]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:37.168070 sudo[1785]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 13 00:34:37.168303 sudo[1785]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:37.178590 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 13 00:34:37.216131 augenrules[1808]: No rules Aug 13 00:34:37.218031 systemd[1]: audit-rules.service: Deactivated successfully. Aug 13 00:34:37.218410 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 13 00:34:37.219968 sudo[1785]: pam_unix(sudo:session): session closed for user root Aug 13 00:34:37.377165 sshd[1784]: Connection closed by 139.178.89.65 port 57844 Aug 13 00:34:37.378012 sshd-session[1782]: pam_unix(sshd:session): session closed for user core Aug 13 00:34:37.383361 systemd-logind[1533]: Session 6 logged out. Waiting for processes to exit. Aug 13 00:34:37.383647 systemd[1]: sshd@5-46.62.157.78:22-139.178.89.65:57844.service: Deactivated successfully. Aug 13 00:34:37.386073 systemd[1]: session-6.scope: Deactivated successfully. Aug 13 00:34:37.388410 systemd-logind[1533]: Removed session 6. 
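The SSH sessions above follow a regular systemd-logind pattern: "New session N of user core", then "Session N logged out", then "Removed session N". As an editorial aside (not part of the original log), a minimal sketch of pairing those entries to find sessions that were opened but never removed — the LOG string is an abbreviated copy of lines from this journal, and `open_sessions` is a hypothetical helper name:

```python
import re

# Abbreviated systemd-logind lines copied from the journal above.
LOG = """\
systemd-logind[1533]: New session 5 of user core.
systemd-logind[1533]: Session 5 logged out. Waiting for processes to exit.
systemd-logind[1533]: Removed session 5.
systemd-logind[1533]: New session 6 of user core.
"""

def open_sessions(text):
    # Sessions announced by "New session ..." but with no matching
    # "Removed session ..." entry yet. \w+ also covers IDs like "c1",
    # which systemd-logind uses for the user-manager session.
    opened = set(re.findall(r"New session (\w+) of user", text))
    removed = set(re.findall(r"Removed session (\w+)\.", text))
    return sorted(opened - removed)
```

Running this over the abbreviated sample reports session 6 as still open, matching the state of the journal at this point.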
Aug 13 00:34:37.543988 systemd[1]: Started sshd@6-46.62.157.78:22-139.178.89.65:57854.service - OpenSSH per-connection server daemon (139.178.89.65:57854). Aug 13 00:34:38.527084 sshd[1817]: Accepted publickey for core from 139.178.89.65 port 57854 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E Aug 13 00:34:38.528930 sshd-session[1817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 13 00:34:38.536777 systemd-logind[1533]: New session 7 of user core. Aug 13 00:34:38.545020 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 13 00:34:39.047318 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 13 00:34:39.047833 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 13 00:34:39.414888 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 13 00:34:39.429125 (dockerd)[1838]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 13 00:34:39.643009 dockerd[1838]: time="2025-08-13T00:34:39.642943285Z" level=info msg="Starting up" Aug 13 00:34:39.644173 dockerd[1838]: time="2025-08-13T00:34:39.644142304Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 13 00:34:39.668487 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2167891818-merged.mount: Deactivated successfully. Aug 13 00:34:39.695724 dockerd[1838]: time="2025-08-13T00:34:39.695655369Z" level=info msg="Loading containers: start." Aug 13 00:34:39.706753 kernel: Initializing XFRM netlink socket Aug 13 00:34:39.875650 systemd-timesyncd[1438]: Network configuration changed, trying to establish connection. Aug 13 00:34:39.906513 systemd-networkd[1481]: docker0: Link UP Aug 13 00:34:39.910423 dockerd[1838]: time="2025-08-13T00:34:39.910383561Z" level=info msg="Loading containers: done." 
Aug 13 00:34:39.922661 dockerd[1838]: time="2025-08-13T00:34:39.922572376Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 13 00:34:39.922911 dockerd[1838]: time="2025-08-13T00:34:39.922839518Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Aug 13 00:34:39.922984 dockerd[1838]: time="2025-08-13T00:34:39.922953041Z" level=info msg="Initializing buildkit" Aug 13 00:34:39.942554 dockerd[1838]: time="2025-08-13T00:34:39.942492325Z" level=info msg="Completed buildkit initialization" Aug 13 00:34:39.949324 dockerd[1838]: time="2025-08-13T00:34:39.949287421Z" level=info msg="Daemon has completed initialization" Aug 13 00:34:39.949467 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 13 00:34:39.950097 dockerd[1838]: time="2025-08-13T00:34:39.949839998Z" level=info msg="API listen on /run/docker.sock" Aug 13 00:34:40.081338 systemd-timesyncd[1438]: Contacted time server 144.76.138.23:123 (2.flatcar.pool.ntp.org). Aug 13 00:34:40.081906 systemd-timesyncd[1438]: Initial clock synchronization to Wed 2025-08-13 00:34:40.337185 UTC. Aug 13 00:34:40.575047 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 13 00:34:40.576890 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:40.698021 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 13 00:34:40.703948 (kubelet)[2048]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:34:40.735571 kubelet[2048]: E0813 00:34:40.735493 2048 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:34:40.738316 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:34:40.738432 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:34:40.738715 systemd[1]: kubelet.service: Consumed 119ms CPU time, 108M memory peak. Aug 13 00:34:40.776294 containerd[1566]: time="2025-08-13T00:34:40.776242993Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\"" Aug 13 00:34:41.340354 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2142426927.mount: Deactivated successfully. 
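The kubelet crash loop above repeats the same failure each time: /var/lib/kubelet/config.yaml does not exist yet (that file is normally written by `kubeadm init` or `kubeadm join`), so the unit exits with status 1 and systemd schedules another restart with an incremented counter. As an illustrative sketch (not from the source), extracting that pattern from journal text like the entries above — the LOG string abbreviates lines from this log, and both function names are hypothetical:

```python
import re

# Abbreviated copies of the systemd/kubelet entries seen in this journal.
LOG = """\
systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
kubelet[1751]: E0813 "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: ..."
systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
"""

def restart_counters(text):
    # journald records "restart counter is at N" each time a Restart= unit retries.
    return [int(m.group(1)) for m in re.finditer(r"restart counter is at (\d+)", text)]

def missing_config_paths(text):
    # The kubelet error message embeds the config path it failed to open.
    return sorted(set(re.findall(r"path: (\S+?),", text)))
```

Against the sample this yields counters [1, 2] and the single missing path, which is exactly the loop visible in the journal until the node is joined to a cluster.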
Aug 13 00:34:42.446618 containerd[1566]: time="2025-08-13T00:34:42.446556924Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:42.447454 containerd[1566]: time="2025-08-13T00:34:42.447379454Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.3: active requests=0, bytes read=30078331" Aug 13 00:34:42.448243 containerd[1566]: time="2025-08-13T00:34:42.448207916Z" level=info msg="ImageCreate event name:\"sha256:a92b4b92a991677d355596cc4aa9b0b12cbc38e8cbdc1e476548518ae045bc4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:42.450641 containerd[1566]: time="2025-08-13T00:34:42.450607808Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:42.451522 containerd[1566]: time="2025-08-13T00:34:42.451485615Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.3\" with image id \"sha256:a92b4b92a991677d355596cc4aa9b0b12cbc38e8cbdc1e476548518ae045bc4a\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\", size \"30075037\" in 1.675207514s" Aug 13 00:34:42.451593 containerd[1566]: time="2025-08-13T00:34:42.451525896Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\" returns image reference \"sha256:a92b4b92a991677d355596cc4aa9b0b12cbc38e8cbdc1e476548518ae045bc4a\"" Aug 13 00:34:42.452323 containerd[1566]: time="2025-08-13T00:34:42.452293633Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\"" Aug 13 00:34:43.615278 containerd[1566]: time="2025-08-13T00:34:43.615223083Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:43.616274 containerd[1566]: time="2025-08-13T00:34:43.616245183Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.3: active requests=0, bytes read=26019383" Aug 13 00:34:43.617344 containerd[1566]: time="2025-08-13T00:34:43.617310968Z" level=info msg="ImageCreate event name:\"sha256:bf97fadcef43049604abcf0caf4f35229fbee25bd0cdb6fdc1d2bbb4f03d9660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:43.619402 containerd[1566]: time="2025-08-13T00:34:43.619367619Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:43.620079 containerd[1566]: time="2025-08-13T00:34:43.619969179Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.3\" with image id \"sha256:bf97fadcef43049604abcf0caf4f35229fbee25bd0cdb6fdc1d2bbb4f03d9660\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\", size \"27646922\" in 1.167647945s" Aug 13 00:34:43.620079 containerd[1566]: time="2025-08-13T00:34:43.619995723Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\" returns image reference \"sha256:bf97fadcef43049604abcf0caf4f35229fbee25bd0cdb6fdc1d2bbb4f03d9660\"" Aug 13 00:34:43.620551 containerd[1566]: time="2025-08-13T00:34:43.620525998Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\"" Aug 13 00:34:44.773408 containerd[1566]: time="2025-08-13T00:34:44.773335969Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:44.774520 containerd[1566]: time="2025-08-13T00:34:44.774492870Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.3: active requests=0, bytes read=20155035" Aug 13 00:34:44.775666 containerd[1566]: time="2025-08-13T00:34:44.775631473Z" level=info msg="ImageCreate event name:\"sha256:41376797d5122e388663ab6d0ad583e58cff63e1a0f1eebfb31d615d8f1c1c87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:44.777916 containerd[1566]: time="2025-08-13T00:34:44.777877222Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:44.778597 containerd[1566]: time="2025-08-13T00:34:44.778496551Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.3\" with image id \"sha256:41376797d5122e388663ab6d0ad583e58cff63e1a0f1eebfb31d615d8f1c1c87\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\", size \"21782592\" in 1.157943845s" Aug 13 00:34:44.778597 containerd[1566]: time="2025-08-13T00:34:44.778522545Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\" returns image reference \"sha256:41376797d5122e388663ab6d0ad583e58cff63e1a0f1eebfb31d615d8f1c1c87\"" Aug 13 00:34:44.778983 containerd[1566]: time="2025-08-13T00:34:44.778968987Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\"" Aug 13 00:34:45.750957 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2541726708.mount: Deactivated successfully. 
Aug 13 00:34:46.038279 containerd[1566]: time="2025-08-13T00:34:46.038153394Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:46.039223 containerd[1566]: time="2025-08-13T00:34:46.039053076Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.3: active requests=0, bytes read=31892694" Aug 13 00:34:46.039946 containerd[1566]: time="2025-08-13T00:34:46.039912513Z" level=info msg="ImageCreate event name:\"sha256:af855adae796077ff822e22c0102f686b2ca7b7c51948889b1825388eaac9234\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:46.041809 containerd[1566]: time="2025-08-13T00:34:46.041785004Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:46.042354 containerd[1566]: time="2025-08-13T00:34:46.042334137Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.3\" with image id \"sha256:af855adae796077ff822e22c0102f686b2ca7b7c51948889b1825388eaac9234\", repo tag \"registry.k8s.io/kube-proxy:v1.33.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\", size \"31891685\" in 1.26325707s" Aug 13 00:34:46.042533 containerd[1566]: time="2025-08-13T00:34:46.042414615Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\" returns image reference \"sha256:af855adae796077ff822e22c0102f686b2ca7b7c51948889b1825388eaac9234\"" Aug 13 00:34:46.042870 containerd[1566]: time="2025-08-13T00:34:46.042854920Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Aug 13 00:34:46.561589 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2674393578.mount: Deactivated successfully. 
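Each containerd "Pulled image" entry above reports the image reference, its size, and the pull duration (in seconds or milliseconds). A small sketch, offered as an aside rather than anything from the source, of collecting those durations from entries shaped like the ones in this log — SAMPLE abbreviates two real entries, and `pull_durations` is a hypothetical helper:

```python
import re

# Abbreviated containerd entries from this journal; quotes inside msg="..."
# appear backslash-escaped in the raw log, which the pattern accounts for.
SAMPLE = '''\
containerd[1566]: level=info msg="Pulled image \\"registry.k8s.io/kube-apiserver:v1.33.3\\" with image id ... in 1.675207514s"
containerd[1566]: level=info msg="Pulled image \\"registry.k8s.io/pause:3.10\\" with image id ... in 461.626455ms"
'''

def pull_durations(text):
    """Map image reference -> pull time in seconds from 'Pulled image' entries."""
    out = {}
    pat = r'Pulled image \\"([^"\\]+)\\".*? in ([\d.]+)(ms|s)"'
    for m in re.finditer(pat, text):
        ref, value, unit = m.group(1), float(m.group(2)), m.group(3)
        out[ref] = value / 1000 if unit == "ms" else value
    return out
```

On the sample this recovers ~1.68 s for the apiserver image and ~0.46 s for the pause image, consistent with the timings the journal reports for these pulls.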
Aug 13 00:34:47.388753 containerd[1566]: time="2025-08-13T00:34:47.388687960Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.389618 containerd[1566]: time="2025-08-13T00:34:47.389521282Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942332" Aug 13 00:34:47.390348 containerd[1566]: time="2025-08-13T00:34:47.390303527Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.392525 containerd[1566]: time="2025-08-13T00:34:47.392484475Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:47.393664 containerd[1566]: time="2025-08-13T00:34:47.393264301Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.350339201s" Aug 13 00:34:47.393664 containerd[1566]: time="2025-08-13T00:34:47.393290449Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Aug 13 00:34:47.393802 containerd[1566]: time="2025-08-13T00:34:47.393774896Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 13 00:34:47.845535 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount62704369.mount: Deactivated successfully. 
Aug 13 00:34:47.850951 containerd[1566]: time="2025-08-13T00:34:47.850883739Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:34:47.851650 containerd[1566]: time="2025-08-13T00:34:47.851604040Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Aug 13 00:34:47.853028 containerd[1566]: time="2025-08-13T00:34:47.852992736Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:34:47.854948 containerd[1566]: time="2025-08-13T00:34:47.854907381Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 13 00:34:47.855474 containerd[1566]: time="2025-08-13T00:34:47.855426899Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 461.626455ms" Aug 13 00:34:47.855474 containerd[1566]: time="2025-08-13T00:34:47.855458932Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Aug 13 00:34:47.856260 containerd[1566]: time="2025-08-13T00:34:47.856236044Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Aug 13 00:34:48.336670 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3396639901.mount: 
Deactivated successfully. Aug 13 00:34:49.705290 containerd[1566]: time="2025-08-13T00:34:49.705238510Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:49.706244 containerd[1566]: time="2025-08-13T00:34:49.706217141Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58247215" Aug 13 00:34:49.707178 containerd[1566]: time="2025-08-13T00:34:49.707138727Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:49.709243 containerd[1566]: time="2025-08-13T00:34:49.709201747Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:34:49.710117 containerd[1566]: time="2025-08-13T00:34:49.709959183Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 1.853698537s" Aug 13 00:34:49.710117 containerd[1566]: time="2025-08-13T00:34:49.709989438Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Aug 13 00:34:50.870460 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Aug 13 00:34:50.873445 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:51.019879 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 13 00:34:51.028057 (kubelet)[2266]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 13 00:34:51.065330 kubelet[2266]: E0813 00:34:51.065296 2266 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 13 00:34:51.067577 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 13 00:34:51.067688 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 13 00:34:51.068219 systemd[1]: kubelet.service: Consumed 130ms CPU time, 109.2M memory peak. Aug 13 00:34:53.697886 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:53.698608 systemd[1]: kubelet.service: Consumed 130ms CPU time, 109.2M memory peak. Aug 13 00:34:53.712510 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:53.742963 systemd[1]: Reload requested from client PID 2280 ('systemctl') (unit session-7.scope)... Aug 13 00:34:53.742996 systemd[1]: Reloading... Aug 13 00:34:53.864769 zram_generator::config[2324]: No configuration found. Aug 13 00:34:53.930970 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 13 00:34:54.023585 systemd[1]: Reloading finished in 280 ms. Aug 13 00:34:54.069424 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:54.072365 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:54.073528 systemd[1]: kubelet.service: Deactivated successfully. 
Aug 13 00:34:54.073756 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:54.073792 systemd[1]: kubelet.service: Consumed 101ms CPU time, 98.3M memory peak. Aug 13 00:34:54.075090 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 13 00:34:54.213869 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 13 00:34:54.220913 (kubelet)[2380]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 13 00:34:54.250742 kubelet[2380]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 13 00:34:54.250742 kubelet[2380]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 13 00:34:54.250742 kubelet[2380]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Aug 13 00:34:54.252804 kubelet[2380]: I0813 00:34:54.252766 2380 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 13 00:34:54.497249 kubelet[2380]: I0813 00:34:54.497205 2380 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Aug 13 00:34:54.497249 kubelet[2380]: I0813 00:34:54.497240 2380 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 13 00:34:54.497567 kubelet[2380]: I0813 00:34:54.497543 2380 server.go:956] "Client rotation is on, will bootstrap in background"
Aug 13 00:34:54.528280 kubelet[2380]: I0813 00:34:54.528213 2380 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 13 00:34:54.529068 kubelet[2380]: E0813 00:34:54.528877 2380 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://46.62.157.78:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 46.62.157.78:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
Aug 13 00:34:54.544032 kubelet[2380]: I0813 00:34:54.544007 2380 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Aug 13 00:34:54.551042 kubelet[2380]: I0813 00:34:54.551023 2380 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 13 00:34:54.553432 kubelet[2380]: I0813 00:34:54.553395 2380 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 13 00:34:54.555949 kubelet[2380]: I0813 00:34:54.553423 2380 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-b-5ba4a9a74b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 13 00:34:54.556655 kubelet[2380]: I0813 00:34:54.556609 2380 topology_manager.go:138] "Creating topology manager with none policy"
Aug 13 00:34:54.556655 kubelet[2380]: I0813 00:34:54.556629 2380 container_manager_linux.go:303] "Creating device plugin manager"
Aug 13 00:34:54.557521 kubelet[2380]: I0813 00:34:54.557488 2380 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:54.559364 kubelet[2380]: I0813 00:34:54.559346 2380 kubelet.go:480] "Attempting to sync node with API server"
Aug 13 00:34:54.559364 kubelet[2380]: I0813 00:34:54.559361 2380 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 13 00:34:54.560416 kubelet[2380]: I0813 00:34:54.560385 2380 kubelet.go:386] "Adding apiserver pod source"
Aug 13 00:34:54.560416 kubelet[2380]: I0813 00:34:54.560403 2380 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 13 00:34:54.565451 kubelet[2380]: E0813 00:34:54.565064 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://46.62.157.78:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-1-0-b-5ba4a9a74b&limit=500&resourceVersion=0\": dial tcp 46.62.157.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Aug 13 00:34:54.566149 kubelet[2380]: E0813 00:34:54.565781 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://46.62.157.78:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.62.157.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Aug 13 00:34:54.566377 kubelet[2380]: I0813 00:34:54.566360 2380 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Aug 13 00:34:54.567248 kubelet[2380]: I0813 00:34:54.567216 2380 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Aug 13 00:34:54.568910 kubelet[2380]: W0813 00:34:54.568489 2380 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Aug 13 00:34:54.574367 kubelet[2380]: I0813 00:34:54.574343 2380 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Aug 13 00:34:54.574409 kubelet[2380]: I0813 00:34:54.574390 2380 server.go:1289] "Started kubelet"
Aug 13 00:34:54.575660 kubelet[2380]: I0813 00:34:54.575622 2380 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Aug 13 00:34:54.576453 kubelet[2380]: I0813 00:34:54.576371 2380 server.go:317] "Adding debug handlers to kubelet server"
Aug 13 00:34:54.578538 kubelet[2380]: I0813 00:34:54.578488 2380 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 13 00:34:54.578796 kubelet[2380]: I0813 00:34:54.578738 2380 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 13 00:34:54.580622 kubelet[2380]: I0813 00:34:54.580607 2380 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 13 00:34:54.581342 kubelet[2380]: E0813 00:34:54.578821 2380 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://46.62.157.78:6443/api/v1/namespaces/default/events\": dial tcp 46.62.157.78:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372-1-0-b-5ba4a9a74b.185b2c6593d362e6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-1-0-b-5ba4a9a74b,UID:ci-4372-1-0-b-5ba4a9a74b,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-b-5ba4a9a74b,},FirstTimestamp:2025-08-13 00:34:54.574363366 +0000 UTC m=+0.350329624,LastTimestamp:2025-08-13 00:34:54.574363366 +0000 UTC m=+0.350329624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-b-5ba4a9a74b,}"
Aug 13 00:34:54.581464 kubelet[2380]: I0813 00:34:54.581433 2380 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Aug 13 00:34:54.593475 kubelet[2380]: I0813 00:34:54.593254 2380 volume_manager.go:297] "Starting Kubelet Volume Manager"
Aug 13 00:34:54.593475 kubelet[2380]: E0813 00:34:54.593338 2380 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found"
Aug 13 00:34:54.593475 kubelet[2380]: I0813 00:34:54.593385 2380 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Aug 13 00:34:54.593475 kubelet[2380]: I0813 00:34:54.593422 2380 reconciler.go:26] "Reconciler: start to sync state"
Aug 13 00:34:54.593779 kubelet[2380]: E0813 00:34:54.593688 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://46.62.157.78:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.62.157.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Aug 13 00:34:54.593961 kubelet[2380]: E0813 00:34:54.593927 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.157.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-b-5ba4a9a74b?timeout=10s\": dial tcp 46.62.157.78:6443: connect: connection refused" interval="200ms"
Aug 13 00:34:54.595205 kubelet[2380]: I0813 00:34:54.594960 2380 factory.go:223] Registration of the systemd container factory successfully
Aug 13 00:34:54.595205 kubelet[2380]: I0813 00:34:54.595169 2380 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 13 00:34:54.597164 kubelet[2380]: I0813 00:34:54.597150 2380 factory.go:223] Registration of the containerd container factory successfully
Aug 13 00:34:54.607734 kubelet[2380]: E0813 00:34:54.606923 2380 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Aug 13 00:34:54.609274 kubelet[2380]: I0813 00:34:54.609241 2380 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Aug 13 00:34:54.610135 kubelet[2380]: I0813 00:34:54.610115 2380 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Aug 13 00:34:54.610135 kubelet[2380]: I0813 00:34:54.610132 2380 status_manager.go:230] "Starting to sync pod status with apiserver"
Aug 13 00:34:54.610226 kubelet[2380]: I0813 00:34:54.610146 2380 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Aug 13 00:34:54.610226 kubelet[2380]: I0813 00:34:54.610152 2380 kubelet.go:2436] "Starting kubelet main sync loop"
Aug 13 00:34:54.610226 kubelet[2380]: E0813 00:34:54.610178 2380 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Aug 13 00:34:54.614166 kubelet[2380]: I0813 00:34:54.614156 2380 cpu_manager.go:221] "Starting CPU manager" policy="none"
Aug 13 00:34:54.614225 kubelet[2380]: I0813 00:34:54.614216 2380 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Aug 13 00:34:54.614269 kubelet[2380]: I0813 00:34:54.614263 2380 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:54.616055 kubelet[2380]: I0813 00:34:54.616036 2380 policy_none.go:49] "None policy: Start"
Aug 13 00:34:54.616055 kubelet[2380]: I0813 00:34:54.616055 2380 memory_manager.go:186] "Starting memorymanager" policy="None"
Aug 13 00:34:54.616114 kubelet[2380]: I0813 00:34:54.616064 2380 state_mem.go:35] "Initializing new in-memory state store"
Aug 13 00:34:54.616258 kubelet[2380]: E0813 00:34:54.616242 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.62.157.78:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.62.157.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Aug 13 00:34:54.620047 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Aug 13 00:34:54.628641 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Aug 13 00:34:54.631373 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Aug 13 00:34:54.640366 kubelet[2380]: E0813 00:34:54.640342 2380 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Aug 13 00:34:54.640607 kubelet[2380]: I0813 00:34:54.640536 2380 eviction_manager.go:189] "Eviction manager: starting control loop"
Aug 13 00:34:54.640607 kubelet[2380]: I0813 00:34:54.640553 2380 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Aug 13 00:34:54.642461 kubelet[2380]: I0813 00:34:54.641607 2380 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Aug 13 00:34:54.642461 kubelet[2380]: E0813 00:34:54.642297 2380 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Aug 13 00:34:54.642461 kubelet[2380]: E0813 00:34:54.642334 2380 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372-1-0-b-5ba4a9a74b\" not found"
Aug 13 00:34:54.731104 systemd[1]: Created slice kubepods-burstable-podfef85c3d0fdc44ae8154ece24a43b583.slice - libcontainer container kubepods-burstable-podfef85c3d0fdc44ae8154ece24a43b583.slice.
Aug 13 00:34:54.746441 kubelet[2380]: E0813 00:34:54.745320 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.754331 kubelet[2380]: I0813 00:34:54.753416 2380 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.757735 kubelet[2380]: E0813 00:34:54.756846 2380 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.62.157.78:6443/api/v1/nodes\": dial tcp 46.62.157.78:6443: connect: connection refused" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.756909 systemd[1]: Created slice kubepods-burstable-pod2c31ec63200bf2219322105654562aae.slice - libcontainer container kubepods-burstable-pod2c31ec63200bf2219322105654562aae.slice.
Aug 13 00:34:54.781912 kubelet[2380]: E0813 00:34:54.781841 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.786015 systemd[1]: Created slice kubepods-burstable-pod7983cdd5fb9e915219abb83d8b803493.slice - libcontainer container kubepods-burstable-pod7983cdd5fb9e915219abb83d8b803493.slice.
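The kubelet entries above all share the klog header layout (`E0813 00:34:54.593927 2380 controller.go:145]`: severity letter, MMDD date, time, PID, then source file:line). When sifting a boot log like this one, it helps to split that header from the message; a minimal sketch (the regex and helper name are ours, not part of kubelet):

```python
import re

# Sketch of a parser for the klog header format used by the kubelet lines
# above: severity [IWEF], MMDD date, HH:MM:SS.micros time, PID, file:line.
KLOG = re.compile(
    r'(?P<sev>[IWEF])(?P<date>\d{4}) '
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d{6})\s+'
    r'(?P<pid>\d+) (?P<src>[\w./-]+:\d+)\] (?P<msg>.*)'
)

def parse_klog(line: str):
    """Return the klog header fields of a log line, or None if absent."""
    m = KLOG.search(line)
    return m.groupdict() if m else None

rec = parse_klog('E0813 00:34:54.593927 2380 controller.go:145] '
                 '"Failed to ensure lease exists, will retry" interval="200ms"')
# rec["sev"] is "E", rec["src"] is "controller.go:145"
```

Filtering on `sev == "E"` then surfaces exactly the connection-refused and "node not found" entries that dominate this bootstrap phase.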
Aug 13 00:34:54.789997 kubelet[2380]: E0813 00:34:54.789961 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.794724 kubelet[2380]: E0813 00:34:54.794628 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.157.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-b-5ba4a9a74b?timeout=10s\": dial tcp 46.62.157.78:6443: connect: connection refused" interval="400ms"
Aug 13 00:34:54.794933 kubelet[2380]: I0813 00:34:54.794890 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fef85c3d0fdc44ae8154ece24a43b583-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"fef85c3d0fdc44ae8154ece24a43b583\") " pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.795137 kubelet[2380]: I0813 00:34:54.795032 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fef85c3d0fdc44ae8154ece24a43b583-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"fef85c3d0fdc44ae8154ece24a43b583\") " pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.795451 kubelet[2380]: I0813 00:34:54.795195 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-ca-certs\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.795451 kubelet[2380]: I0813 00:34:54.795297 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.795451 kubelet[2380]: I0813 00:34:54.795416 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.795815 kubelet[2380]: I0813 00:34:54.795512 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fef85c3d0fdc44ae8154ece24a43b583-ca-certs\") pod \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"fef85c3d0fdc44ae8154ece24a43b583\") " pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.897105 kubelet[2380]: I0813 00:34:54.896662 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.897105 kubelet[2380]: I0813 00:34:54.896843 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.897105 kubelet[2380]: I0813 00:34:54.896891 2380 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7983cdd5fb9e915219abb83d8b803493-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"7983cdd5fb9e915219abb83d8b803493\") " pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.959424 kubelet[2380]: I0813 00:34:54.959364 2380 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:54.959715 kubelet[2380]: E0813 00:34:54.959671 2380 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.62.157.78:6443/api/v1/nodes\": dial tcp 46.62.157.78:6443: connect: connection refused" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:55.046677 containerd[1566]: time="2025-08-13T00:34:55.046565270Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-b-5ba4a9a74b,Uid:fef85c3d0fdc44ae8154ece24a43b583,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:55.088044 containerd[1566]: time="2025-08-13T00:34:55.086769047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b,Uid:2c31ec63200bf2219322105654562aae,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:55.091821 containerd[1566]: time="2025-08-13T00:34:55.091742842Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-b-5ba4a9a74b,Uid:7983cdd5fb9e915219abb83d8b803493,Namespace:kube-system,Attempt:0,}"
Aug 13 00:34:55.129720 containerd[1566]: time="2025-08-13T00:34:55.129656234Z" level=info msg="connecting to shim d27332a835f795dcdf6a121f76c413a2c89b24de5938c4a4a1618adc9337618f" address="unix:///run/containerd/s/c80dd67d152be5a31bcc76335c3f82c63fdba837b4c7ea95c7d845c501af0bb9" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:55.153486 containerd[1566]: time="2025-08-13T00:34:55.153333961Z" level=info msg="connecting to shim 612c547cc59a5d040fb2527c038dabbc71ecb26663a2cddcfbfe3fbdd055d645" address="unix:///run/containerd/s/f6cb4dd5415de37ac5c6994160045166cddbd51ed8dee5bf76af502812eafdaf" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:55.163722 containerd[1566]: time="2025-08-13T00:34:55.163019746Z" level=info msg="connecting to shim 1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c" address="unix:///run/containerd/s/dbb53f5b91826f42a329c612249a4e4aa43461556d910cf9e2de4b869beed3ea" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:34:55.195161 kubelet[2380]: E0813 00:34:55.195113 2380 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.62.157.78:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-b-5ba4a9a74b?timeout=10s\": dial tcp 46.62.157.78:6443: connect: connection refused" interval="800ms"
Aug 13 00:34:55.207813 systemd[1]: Started cri-containerd-612c547cc59a5d040fb2527c038dabbc71ecb26663a2cddcfbfe3fbdd055d645.scope - libcontainer container 612c547cc59a5d040fb2527c038dabbc71ecb26663a2cddcfbfe3fbdd055d645.
Aug 13 00:34:55.212327 systemd[1]: Started cri-containerd-1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c.scope - libcontainer container 1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c.
Aug 13 00:34:55.214114 systemd[1]: Started cri-containerd-d27332a835f795dcdf6a121f76c413a2c89b24de5938c4a4a1618adc9337618f.scope - libcontainer container d27332a835f795dcdf6a121f76c413a2c89b24de5938c4a4a1618adc9337618f.
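The three "Failed to ensure lease exists, will retry" errors report intervals of 200ms, 400ms, then 800ms: the kubelet's node-lease controller is doubling its retry interval on each consecutive failure while the apiserver is still unreachable. A sketch of that progression (the cap value below is an illustrative assumption, not taken from this log):

```python
# Doubling retry backoff matching the 200ms -> 400ms -> 800ms progression
# logged by controller.go:145 above; the cap is an assumed illustration.
def lease_retry_intervals(base_ms: int = 200, cap_ms: int = 7000, tries: int = 6):
    intervals, current = [], base_ms
    for _ in range(tries):
        intervals.append(min(current, cap_ms))  # never exceed the cap
        current *= 2                            # double on each failure
    return intervals

# lease_retry_intervals()[:3] -> [200, 400, 800]
```

The first three values reproduce the intervals logged at 00:34:54.593961, 00:34:54.794724, and 00:34:55.195161.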
Aug 13 00:34:55.263217 containerd[1566]: time="2025-08-13T00:34:55.263173678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-b-5ba4a9a74b,Uid:fef85c3d0fdc44ae8154ece24a43b583,Namespace:kube-system,Attempt:0,} returns sandbox id \"d27332a835f795dcdf6a121f76c413a2c89b24de5938c4a4a1618adc9337618f\""
Aug 13 00:34:55.273058 containerd[1566]: time="2025-08-13T00:34:55.272982618Z" level=info msg="CreateContainer within sandbox \"d27332a835f795dcdf6a121f76c413a2c89b24de5938c4a4a1618adc9337618f\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Aug 13 00:34:55.285642 containerd[1566]: time="2025-08-13T00:34:55.285600406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b,Uid:2c31ec63200bf2219322105654562aae,Namespace:kube-system,Attempt:0,} returns sandbox id \"1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c\""
Aug 13 00:34:55.287568 containerd[1566]: time="2025-08-13T00:34:55.287525204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-b-5ba4a9a74b,Uid:7983cdd5fb9e915219abb83d8b803493,Namespace:kube-system,Attempt:0,} returns sandbox id \"612c547cc59a5d040fb2527c038dabbc71ecb26663a2cddcfbfe3fbdd055d645\""
Aug 13 00:34:55.289586 containerd[1566]: time="2025-08-13T00:34:55.289569662Z" level=info msg="Container 3950ea5c40834e60bbc1eb93c5bbd88ad0968290f296cab51e3ca15b6be6c59e: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:34:55.291438 containerd[1566]: time="2025-08-13T00:34:55.291238584Z" level=info msg="CreateContainer within sandbox \"1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Aug 13 00:34:55.302213 containerd[1566]: time="2025-08-13T00:34:55.302137551Z" level=info msg="Container 4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:34:55.304547 containerd[1566]: time="2025-08-13T00:34:55.304515792Z" level=info msg="CreateContainer within sandbox \"d27332a835f795dcdf6a121f76c413a2c89b24de5938c4a4a1618adc9337618f\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"3950ea5c40834e60bbc1eb93c5bbd88ad0968290f296cab51e3ca15b6be6c59e\""
Aug 13 00:34:55.304765 containerd[1566]: time="2025-08-13T00:34:55.304735090Z" level=info msg="CreateContainer within sandbox \"612c547cc59a5d040fb2527c038dabbc71ecb26663a2cddcfbfe3fbdd055d645\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Aug 13 00:34:55.315266 containerd[1566]: time="2025-08-13T00:34:55.315201988Z" level=info msg="CreateContainer within sandbox \"1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992\""
Aug 13 00:34:55.315677 containerd[1566]: time="2025-08-13T00:34:55.315652170Z" level=info msg="StartContainer for \"3950ea5c40834e60bbc1eb93c5bbd88ad0968290f296cab51e3ca15b6be6c59e\""
Aug 13 00:34:55.316256 containerd[1566]: time="2025-08-13T00:34:55.316098756Z" level=info msg="StartContainer for \"4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992\""
Aug 13 00:34:55.317608 containerd[1566]: time="2025-08-13T00:34:55.317584497Z" level=info msg="connecting to shim 3950ea5c40834e60bbc1eb93c5bbd88ad0968290f296cab51e3ca15b6be6c59e" address="unix:///run/containerd/s/c80dd67d152be5a31bcc76335c3f82c63fdba837b4c7ea95c7d845c501af0bb9" protocol=ttrpc version=3
Aug 13 00:34:55.318482 containerd[1566]: time="2025-08-13T00:34:55.318464119Z" level=info msg="connecting to shim 4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992" address="unix:///run/containerd/s/dbb53f5b91826f42a329c612249a4e4aa43461556d910cf9e2de4b869beed3ea" protocol=ttrpc version=3
Aug 13 00:34:55.319065 containerd[1566]: time="2025-08-13T00:34:55.319028636Z" level=info msg="Container c1592816364af5bd8b1acb4043985316f426a3a7a537c090de7aad20c87c4f0e: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:34:55.326670 containerd[1566]: time="2025-08-13T00:34:55.326646715Z" level=info msg="CreateContainer within sandbox \"612c547cc59a5d040fb2527c038dabbc71ecb26663a2cddcfbfe3fbdd055d645\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c1592816364af5bd8b1acb4043985316f426a3a7a537c090de7aad20c87c4f0e\""
Aug 13 00:34:55.327366 containerd[1566]: time="2025-08-13T00:34:55.327350656Z" level=info msg="StartContainer for \"c1592816364af5bd8b1acb4043985316f426a3a7a537c090de7aad20c87c4f0e\""
Aug 13 00:34:55.328845 containerd[1566]: time="2025-08-13T00:34:55.328827416Z" level=info msg="connecting to shim c1592816364af5bd8b1acb4043985316f426a3a7a537c090de7aad20c87c4f0e" address="unix:///run/containerd/s/f6cb4dd5415de37ac5c6994160045166cddbd51ed8dee5bf76af502812eafdaf" protocol=ttrpc version=3
Aug 13 00:34:55.334878 systemd[1]: Started cri-containerd-4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992.scope - libcontainer container 4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992.
Aug 13 00:34:55.338720 systemd[1]: Started cri-containerd-3950ea5c40834e60bbc1eb93c5bbd88ad0968290f296cab51e3ca15b6be6c59e.scope - libcontainer container 3950ea5c40834e60bbc1eb93c5bbd88ad0968290f296cab51e3ca15b6be6c59e.
Aug 13 00:34:55.358813 systemd[1]: Started cri-containerd-c1592816364af5bd8b1acb4043985316f426a3a7a537c090de7aad20c87c4f0e.scope - libcontainer container c1592816364af5bd8b1acb4043985316f426a3a7a537c090de7aad20c87c4f0e.
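Each static-pod container appears twice in this window: containerd logs "connecting to shim &lt;id&gt;" and systemd then starts the matching `cri-containerd-<id>.scope` unit. Cross-referencing the two sets of IDs is a quick way to spot containers whose shim connected but whose scope never started; a sketch (the helper and the abbreviated sample lines are ours, real IDs are 64 hex chars):

```python
import re

# Cross-check containerd shim IDs against the systemd scope units they become.
def started_shims(lines):
    text = "\n".join(lines)
    shims = set(re.findall(r'connecting to shim ([0-9a-f]+)', text))
    scopes = set(re.findall(r'cri-containerd-([0-9a-f]+)\.scope', text))
    return shims & scopes  # shim IDs that reached a running systemd scope

sample = [
    'msg="connecting to shim 4ab6297dd59f" address="unix:///run/containerd/s/dbb5"',
    'Started cri-containerd-4ab6297dd59f.scope - libcontainer container 4ab6297dd59f.',
    'msg="connecting to shim c1592816364a" address="unix:///run/containerd/s/f6cb"',
]
# started_shims(sample) -> {"4ab6297dd59f"}; c1592816364a has no scope in the sample
```

In the log above, all three IDs (3950ea…, 4ab629…, c15928…) show up in both forms, so every shim reached a running scope.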
Aug 13 00:34:55.362934 kubelet[2380]: I0813 00:34:55.362903 2380 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:55.363609 kubelet[2380]: E0813 00:34:55.363588 2380 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.62.157.78:6443/api/v1/nodes\": dial tcp 46.62.157.78:6443: connect: connection refused" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:55.401526 containerd[1566]: time="2025-08-13T00:34:55.401427055Z" level=info msg="StartContainer for \"4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992\" returns successfully"
Aug 13 00:34:55.406363 containerd[1566]: time="2025-08-13T00:34:55.406303630Z" level=info msg="StartContainer for \"3950ea5c40834e60bbc1eb93c5bbd88ad0968290f296cab51e3ca15b6be6c59e\" returns successfully"
Aug 13 00:34:55.435716 containerd[1566]: time="2025-08-13T00:34:55.435659497Z" level=info msg="StartContainer for \"c1592816364af5bd8b1acb4043985316f426a3a7a537c090de7aad20c87c4f0e\" returns successfully"
Aug 13 00:34:55.538312 kubelet[2380]: E0813 00:34:55.537950 2380 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://46.62.157.78:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.62.157.78:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Aug 13 00:34:55.622392 kubelet[2380]: E0813 00:34:55.622296 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:55.624512 kubelet[2380]: E0813 00:34:55.624495 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:55.626098 kubelet[2380]: E0813 00:34:55.626080 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:56.168136 kubelet[2380]: I0813 00:34:56.168093 2380 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:56.630290 kubelet[2380]: E0813 00:34:56.630265 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:56.630752 kubelet[2380]: E0813 00:34:56.630736 2380 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.204403 kubelet[2380]: E0813 00:34:57.204363 2380 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372-1-0-b-5ba4a9a74b\" not found" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.374649 kubelet[2380]: I0813 00:34:57.374614 2380 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.374649 kubelet[2380]: E0813 00:34:57.374647 2380 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4372-1-0-b-5ba4a9a74b\": node \"ci-4372-1-0-b-5ba4a9a74b\" not found"
Aug 13 00:34:57.393260 kubelet[2380]: I0813 00:34:57.393228 2380 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.469945 kubelet[2380]: E0813 00:34:57.469839 2380 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.469945 kubelet[2380]: I0813 00:34:57.469876 2380 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.471760 kubelet[2380]: E0813 00:34:57.471739 2380 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.471760 kubelet[2380]: I0813 00:34:57.471757 2380 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.473039 kubelet[2380]: E0813 00:34:57.473017 2380 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372-1-0-b-5ba4a9a74b\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:57.567903 kubelet[2380]: I0813 00:34:57.567859 2380 apiserver.go:52] "Watching apiserver"
Aug 13 00:34:57.593625 kubelet[2380]: I0813 00:34:57.593589 2380 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Aug 13 00:34:59.179608 systemd[1]: Reload requested from client PID 2660 ('systemctl') (unit session-7.scope)...
Aug 13 00:34:59.179642 systemd[1]: Reloading...
Aug 13 00:34:59.264729 zram_generator::config[2704]: No configuration found.
Aug 13 00:34:59.338853 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Aug 13 00:34:59.444550 systemd[1]: Reloading finished in 264 ms.
Aug 13 00:34:59.473617 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:34:59.484555 systemd[1]: kubelet.service: Deactivated successfully.
Aug 13 00:34:59.484780 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:34:59.484822 systemd[1]: kubelet.service: Consumed 679ms CPU time, 126.3M memory peak.
Aug 13 00:34:59.486234 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 13 00:34:59.611478 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 13 00:34:59.616964 (kubelet)[2755]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Aug 13 00:34:59.664746 kubelet[2755]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Aug 13 00:34:59.664746 kubelet[2755]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Aug 13 00:34:59.664746 kubelet[2755]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
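The "Referenced but unset environment variable" line at the kubelet restart means the unit file expands `$KUBELET_EXTRA_ARGS` but nothing defines it, which is harmless here (it expands to an empty string). If extra flags were wanted, one common way to supply them is a drop-in or environment file; a sketch (the path and the `--node-ip` value are illustrative assumptions, not taken from this log):

```ini
# /etc/systemd/system/kubelet.service.d/10-extra-args.conf (hypothetical path)
[Service]
Environment="KUBELET_EXTRA_ARGS=--node-ip=10.0.0.2"
```

After adding a drop-in like this, `systemctl daemon-reload` followed by `systemctl restart kubelet` would pick it up, mirroring the reload/restart sequence visible in the systemd entries above.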
Aug 13 00:34:59.664746 kubelet[2755]: I0813 00:34:59.664681 2755 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Aug 13 00:34:59.672747 kubelet[2755]: I0813 00:34:59.672129 2755 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
Aug 13 00:34:59.672747 kubelet[2755]: I0813 00:34:59.672147 2755 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Aug 13 00:34:59.672747 kubelet[2755]: I0813 00:34:59.672315 2755 server.go:956] "Client rotation is on, will bootstrap in background"
Aug 13 00:34:59.673437 kubelet[2755]: I0813 00:34:59.673410 2755 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
Aug 13 00:34:59.676100 kubelet[2755]: I0813 00:34:59.675982 2755 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Aug 13 00:34:59.679092 kubelet[2755]: I0813 00:34:59.679071 2755 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Aug 13 00:34:59.684387 kubelet[2755]: I0813 00:34:59.684356 2755 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Aug 13 00:34:59.684651 kubelet[2755]: I0813 00:34:59.684627 2755 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Aug 13 00:34:59.684849 kubelet[2755]: I0813 00:34:59.684651 2755 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-b-5ba4a9a74b","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Aug 13 00:34:59.684849 kubelet[2755]: I0813 00:34:59.684848 2755 topology_manager.go:138] "Creating topology manager with none policy"
Aug 13 00:34:59.685028 kubelet[2755]: I0813 00:34:59.684855 2755 container_manager_linux.go:303] "Creating device plugin manager"
Aug 13 00:34:59.685028 kubelet[2755]: I0813 00:34:59.684886 2755 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:59.685028 kubelet[2755]: I0813 00:34:59.685007 2755 kubelet.go:480] "Attempting to sync node with API server"
Aug 13 00:34:59.685028 kubelet[2755]: I0813 00:34:59.685017 2755 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
Aug 13 00:34:59.685028 kubelet[2755]: I0813 00:34:59.685032 2755 kubelet.go:386] "Adding apiserver pod source"
Aug 13 00:34:59.685170 kubelet[2755]: I0813 00:34:59.685039 2755 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Aug 13 00:34:59.687626 kubelet[2755]: I0813 00:34:59.687586 2755 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Aug 13 00:34:59.692407 kubelet[2755]: I0813 00:34:59.692374 2755 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Aug 13 00:34:59.697928 kubelet[2755]: I0813 00:34:59.697821 2755 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Aug 13 00:34:59.697928 kubelet[2755]: I0813 00:34:59.697854 2755 server.go:1289] "Started kubelet"
Aug 13 00:34:59.699882 kubelet[2755]: I0813 00:34:59.699865 2755 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Aug 13 00:34:59.700186 kubelet[2755]: I0813 00:34:59.700146 2755 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Aug 13 00:34:59.700670 kubelet[2755]: I0813 00:34:59.700658 2755 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Aug 13 00:34:59.702683 kubelet[2755]: I0813 00:34:59.702673 2755 volume_manager.go:297] "Starting Kubelet Volume Manager"
Aug 13 00:34:59.703475 kubelet[2755]: I0813 00:34:59.701078 2755 server.go:317] "Adding debug handlers to kubelet server"
Aug 13 00:34:59.704491 kubelet[2755]: I0813 00:34:59.704477 2755 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Aug 13 00:34:59.704547 kubelet[2755]: I0813 00:34:59.701108 2755 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Aug 13 00:34:59.704776 kubelet[2755]: I0813 00:34:59.704765 2755 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Aug 13 00:34:59.704930 kubelet[2755]: I0813 00:34:59.704920 2755 reconciler.go:26] "Reconciler: start to sync state"
Aug 13 00:34:59.705890 kubelet[2755]: I0813 00:34:59.705792 2755 factory.go:223] Registration of the systemd container factory successfully
Aug 13 00:34:59.705933 kubelet[2755]: I0813 00:34:59.705892 2755 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Aug 13 00:34:59.708951 kubelet[2755]: I0813 00:34:59.708939 2755 factory.go:223] Registration of the containerd container factory successfully
Aug 13 00:34:59.727511 kubelet[2755]: I0813 00:34:59.727466 2755 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Aug 13 00:34:59.728546 kubelet[2755]: I0813 00:34:59.728526 2755 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Aug 13 00:34:59.728546 kubelet[2755]: I0813 00:34:59.728547 2755 status_manager.go:230] "Starting to sync pod status with apiserver"
Aug 13 00:34:59.728624 kubelet[2755]: I0813 00:34:59.728561 2755 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Aug 13 00:34:59.728624 kubelet[2755]: I0813 00:34:59.728567 2755 kubelet.go:2436] "Starting kubelet main sync loop"
Aug 13 00:34:59.728831 kubelet[2755]: E0813 00:34:59.728804 2755 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Aug 13 00:34:59.749067 kubelet[2755]: I0813 00:34:59.749035 2755 cpu_manager.go:221] "Starting CPU manager" policy="none"
Aug 13 00:34:59.749067 kubelet[2755]: I0813 00:34:59.749053 2755 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Aug 13 00:34:59.749067 kubelet[2755]: I0813 00:34:59.749073 2755 state_mem.go:36] "Initialized new in-memory state store"
Aug 13 00:34:59.749206 kubelet[2755]: I0813 00:34:59.749194 2755 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Aug 13 00:34:59.749225 kubelet[2755]: I0813 00:34:59.749206 2755 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Aug 13 00:34:59.749241 kubelet[2755]: I0813 00:34:59.749225 2755 policy_none.go:49] "None policy: Start"
Aug 13 00:34:59.749241 kubelet[2755]: I0813 00:34:59.749236 2755 memory_manager.go:186] "Starting memorymanager" policy="None"
Aug 13 00:34:59.749274 kubelet[2755]: I0813 00:34:59.749248 2755 state_mem.go:35] "Initializing new in-memory state store"
Aug 13 00:34:59.749363 kubelet[2755]: I0813 00:34:59.749342 2755 state_mem.go:75] "Updated machine memory state"
Aug 13 00:34:59.753160 kubelet[2755]: E0813 00:34:59.753136 2755 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Aug 13 00:34:59.753496 kubelet[2755]: I0813 00:34:59.753276 2755 eviction_manager.go:189] "Eviction manager: starting control loop"
Aug 13 00:34:59.753496 kubelet[2755]: I0813 00:34:59.753298 2755 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Aug 13 00:34:59.753496 kubelet[2755]: I0813 00:34:59.753448 2755 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Aug 13 00:34:59.754872 kubelet[2755]: E0813 00:34:59.754848 2755 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Aug 13 00:34:59.830314 kubelet[2755]: I0813 00:34:59.830286 2755 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.831200 kubelet[2755]: I0813 00:34:59.830750 2755 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.831200 kubelet[2755]: I0813 00:34:59.830396 2755 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.860609 kubelet[2755]: I0813 00:34:59.860353 2755 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.870434 kubelet[2755]: I0813 00:34:59.870402 2755 kubelet_node_status.go:124] "Node was previously registered" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.870643 kubelet[2755]: I0813 00:34:59.870595 2755 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.906722 kubelet[2755]: I0813 00:34:59.906649 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fef85c3d0fdc44ae8154ece24a43b583-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"fef85c3d0fdc44ae8154ece24a43b583\") " pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.906722 kubelet[2755]: I0813 00:34:59.906722 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-ca-certs\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.906930 kubelet[2755]: I0813 00:34:59.906753 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.906930 kubelet[2755]: I0813 00:34:59.906800 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fef85c3d0fdc44ae8154ece24a43b583-ca-certs\") pod \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"fef85c3d0fdc44ae8154ece24a43b583\") " pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.906930 kubelet[2755]: I0813 00:34:59.906827 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fef85c3d0fdc44ae8154ece24a43b583-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"fef85c3d0fdc44ae8154ece24a43b583\") " pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.906930 kubelet[2755]: I0813 00:34:59.906849 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.906930 kubelet[2755]: I0813 00:34:59.906868 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.907108 kubelet[2755]: I0813 00:34:59.906893 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2c31ec63200bf2219322105654562aae-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"2c31ec63200bf2219322105654562aae\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:34:59.907108 kubelet[2755]: I0813 00:34:59.906919 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7983cdd5fb9e915219abb83d8b803493-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-b-5ba4a9a74b\" (UID: \"7983cdd5fb9e915219abb83d8b803493\") " pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:00.686365 kubelet[2755]: I0813 00:35:00.686256 2755 apiserver.go:52] "Watching apiserver"
Aug 13 00:35:00.704942 kubelet[2755]: I0813 00:35:00.704881 2755 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Aug 13 00:35:00.743222 kubelet[2755]: I0813 00:35:00.743173 2755 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:00.744215 kubelet[2755]: I0813 00:35:00.744088 2755 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:00.754056 kubelet[2755]: E0813 00:35:00.754006 2755 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372-1-0-b-5ba4a9a74b\" already exists" pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:00.759029 kubelet[2755]: E0813 00:35:00.758982 2755 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372-1-0-b-5ba4a9a74b\" already exists" pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:00.780627 kubelet[2755]: I0813 00:35:00.780541 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372-1-0-b-5ba4a9a74b" podStartSLOduration=1.7805264969999999 podStartE2EDuration="1.780526497s" podCreationTimestamp="2025-08-13 00:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:35:00.779554634 +0000 UTC m=+1.157141522" watchObservedRunningTime="2025-08-13 00:35:00.780526497 +0000 UTC m=+1.158113374"
Aug 13 00:35:00.803691 kubelet[2755]: I0813 00:35:00.803631 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372-1-0-b-5ba4a9a74b" podStartSLOduration=1.803412555 podStartE2EDuration="1.803412555s" podCreationTimestamp="2025-08-13 00:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:35:00.79020989 +0000 UTC m=+1.167796767" watchObservedRunningTime="2025-08-13 00:35:00.803412555 +0000 UTC m=+1.180999432"
Aug 13 00:35:00.816280 kubelet[2755]: I0813 00:35:00.816183 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372-1-0-b-5ba4a9a74b" podStartSLOduration=1.816160582 podStartE2EDuration="1.816160582s" podCreationTimestamp="2025-08-13 00:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:35:00.804155844 +0000 UTC m=+1.181742720" watchObservedRunningTime="2025-08-13 00:35:00.816160582 +0000 UTC m=+1.193747459"
Aug 13 00:35:02.553849 update_engine[1535]: I20250813 00:35:02.553771 1535 update_attempter.cc:509] Updating boot flags...
Aug 13 00:35:04.978089 kubelet[2755]: I0813 00:35:04.978027 2755 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Aug 13 00:35:04.979017 containerd[1566]: time="2025-08-13T00:35:04.978961576Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Aug 13 00:35:04.979448 kubelet[2755]: I0813 00:35:04.979212 2755 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Aug 13 00:35:05.889070 systemd[1]: Created slice kubepods-besteffort-pod0b092ac2_d00c_4ca1_adcb_25113c9417d2.slice - libcontainer container kubepods-besteffort-pod0b092ac2_d00c_4ca1_adcb_25113c9417d2.slice.
Aug 13 00:35:05.943155 kubelet[2755]: I0813 00:35:05.943103 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0b092ac2-d00c-4ca1-adcb-25113c9417d2-kube-proxy\") pod \"kube-proxy-f8n45\" (UID: \"0b092ac2-d00c-4ca1-adcb-25113c9417d2\") " pod="kube-system/kube-proxy-f8n45"
Aug 13 00:35:05.943298 kubelet[2755]: I0813 00:35:05.943193 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98xw\" (UniqueName: \"kubernetes.io/projected/0b092ac2-d00c-4ca1-adcb-25113c9417d2-kube-api-access-m98xw\") pod \"kube-proxy-f8n45\" (UID: \"0b092ac2-d00c-4ca1-adcb-25113c9417d2\") " pod="kube-system/kube-proxy-f8n45"
Aug 13 00:35:05.943298 kubelet[2755]: I0813 00:35:05.943217 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0b092ac2-d00c-4ca1-adcb-25113c9417d2-xtables-lock\") pod \"kube-proxy-f8n45\" (UID: \"0b092ac2-d00c-4ca1-adcb-25113c9417d2\") " pod="kube-system/kube-proxy-f8n45"
Aug 13 00:35:05.943298 kubelet[2755]: I0813 00:35:05.943285 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b092ac2-d00c-4ca1-adcb-25113c9417d2-lib-modules\") pod \"kube-proxy-f8n45\" (UID: \"0b092ac2-d00c-4ca1-adcb-25113c9417d2\") " pod="kube-system/kube-proxy-f8n45"
Aug 13 00:35:06.200956 containerd[1566]: time="2025-08-13T00:35:06.199642119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-f8n45,Uid:0b092ac2-d00c-4ca1-adcb-25113c9417d2,Namespace:kube-system,Attempt:0,}"
Aug 13 00:35:06.229922 containerd[1566]: time="2025-08-13T00:35:06.229606363Z" level=info msg="connecting to shim 63a0c3ba3d9a9cbf6057d5c2debec58b24539cf7925102d8819eaae2326d2f6c" address="unix:///run/containerd/s/5dab49bd9ae1ecf92853e95b82333e9afcf0f7b373d7ded013f325bf44121617" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:35:06.258627 systemd[1]: Created slice kubepods-besteffort-pod5f6d8142_bee6_457f_9caf_2182d5dfb3f6.slice - libcontainer container kubepods-besteffort-pod5f6d8142_bee6_457f_9caf_2182d5dfb3f6.slice.
Aug 13 00:35:06.282896 systemd[1]: Started cri-containerd-63a0c3ba3d9a9cbf6057d5c2debec58b24539cf7925102d8819eaae2326d2f6c.scope - libcontainer container 63a0c3ba3d9a9cbf6057d5c2debec58b24539cf7925102d8819eaae2326d2f6c.
Aug 13 00:35:06.312238 containerd[1566]: time="2025-08-13T00:35:06.312207987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-f8n45,Uid:0b092ac2-d00c-4ca1-adcb-25113c9417d2,Namespace:kube-system,Attempt:0,} returns sandbox id \"63a0c3ba3d9a9cbf6057d5c2debec58b24539cf7925102d8819eaae2326d2f6c\""
Aug 13 00:35:06.316490 containerd[1566]: time="2025-08-13T00:35:06.316445667Z" level=info msg="CreateContainer within sandbox \"63a0c3ba3d9a9cbf6057d5c2debec58b24539cf7925102d8819eaae2326d2f6c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Aug 13 00:35:06.327771 containerd[1566]: time="2025-08-13T00:35:06.326730872Z" level=info msg="Container f9071ae56b8e8890ba181c2904368bcc354c8b30bd4d4c55093dc4e17dd913ae: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:06.329387 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3427936595.mount: Deactivated successfully.
Aug 13 00:35:06.338434 containerd[1566]: time="2025-08-13T00:35:06.338387424Z" level=info msg="CreateContainer within sandbox \"63a0c3ba3d9a9cbf6057d5c2debec58b24539cf7925102d8819eaae2326d2f6c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f9071ae56b8e8890ba181c2904368bcc354c8b30bd4d4c55093dc4e17dd913ae\""
Aug 13 00:35:06.339096 containerd[1566]: time="2025-08-13T00:35:06.339058994Z" level=info msg="StartContainer for \"f9071ae56b8e8890ba181c2904368bcc354c8b30bd4d4c55093dc4e17dd913ae\""
Aug 13 00:35:06.340714 containerd[1566]: time="2025-08-13T00:35:06.340578657Z" level=info msg="connecting to shim f9071ae56b8e8890ba181c2904368bcc354c8b30bd4d4c55093dc4e17dd913ae" address="unix:///run/containerd/s/5dab49bd9ae1ecf92853e95b82333e9afcf0f7b373d7ded013f325bf44121617" protocol=ttrpc version=3
Aug 13 00:35:06.346380 kubelet[2755]: I0813 00:35:06.346347 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-928hc\" (UniqueName: \"kubernetes.io/projected/5f6d8142-bee6-457f-9caf-2182d5dfb3f6-kube-api-access-928hc\") pod \"tigera-operator-747864d56d-kj2hs\" (UID: \"5f6d8142-bee6-457f-9caf-2182d5dfb3f6\") " pod="tigera-operator/tigera-operator-747864d56d-kj2hs"
Aug 13 00:35:06.346828 kubelet[2755]: I0813 00:35:06.346761 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5f6d8142-bee6-457f-9caf-2182d5dfb3f6-var-lib-calico\") pod \"tigera-operator-747864d56d-kj2hs\" (UID: \"5f6d8142-bee6-457f-9caf-2182d5dfb3f6\") " pod="tigera-operator/tigera-operator-747864d56d-kj2hs"
Aug 13 00:35:06.358544 systemd[1]: Started cri-containerd-f9071ae56b8e8890ba181c2904368bcc354c8b30bd4d4c55093dc4e17dd913ae.scope - libcontainer container f9071ae56b8e8890ba181c2904368bcc354c8b30bd4d4c55093dc4e17dd913ae.
Aug 13 00:35:06.399990 containerd[1566]: time="2025-08-13T00:35:06.399941562Z" level=info msg="StartContainer for \"f9071ae56b8e8890ba181c2904368bcc354c8b30bd4d4c55093dc4e17dd913ae\" returns successfully"
Aug 13 00:35:06.565717 containerd[1566]: time="2025-08-13T00:35:06.565598115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-kj2hs,Uid:5f6d8142-bee6-457f-9caf-2182d5dfb3f6,Namespace:tigera-operator,Attempt:0,}"
Aug 13 00:35:06.582289 containerd[1566]: time="2025-08-13T00:35:06.582232482Z" level=info msg="connecting to shim 579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8" address="unix:///run/containerd/s/8ecf781fceb1f87cf0735f04c73eba9ec24eb0ca56743392b8c9070afe3ca95a" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:35:06.606381 systemd[1]: Started cri-containerd-579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8.scope - libcontainer container 579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8.
Aug 13 00:35:06.659902 containerd[1566]: time="2025-08-13T00:35:06.659849018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-kj2hs,Uid:5f6d8142-bee6-457f-9caf-2182d5dfb3f6,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8\""
Aug 13 00:35:06.661545 containerd[1566]: time="2025-08-13T00:35:06.661399814Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\""
Aug 13 00:35:08.656074 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount298286511.mount: Deactivated successfully.
Aug 13 00:35:09.079339 containerd[1566]: time="2025-08-13T00:35:09.079113344Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:09.080313 containerd[1566]: time="2025-08-13T00:35:09.080281974Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543"
Aug 13 00:35:09.081407 containerd[1566]: time="2025-08-13T00:35:09.081366342Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:09.083237 containerd[1566]: time="2025-08-13T00:35:09.083187320Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:09.083745 containerd[1566]: time="2025-08-13T00:35:09.083602093Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.422178732s"
Aug 13 00:35:09.083745 containerd[1566]: time="2025-08-13T00:35:09.083628151Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\""
Aug 13 00:35:09.086599 containerd[1566]: time="2025-08-13T00:35:09.086561210Z" level=info msg="CreateContainer within sandbox \"579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Aug 13 00:35:09.094516 containerd[1566]: time="2025-08-13T00:35:09.094078029Z" level=info msg="Container a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:09.111934 containerd[1566]: time="2025-08-13T00:35:09.111870262Z" level=info msg="CreateContainer within sandbox \"579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7\""
Aug 13 00:35:09.112574 containerd[1566]: time="2025-08-13T00:35:09.112551067Z" level=info msg="StartContainer for \"a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7\""
Aug 13 00:35:09.114513 containerd[1566]: time="2025-08-13T00:35:09.114466093Z" level=info msg="connecting to shim a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7" address="unix:///run/containerd/s/8ecf781fceb1f87cf0735f04c73eba9ec24eb0ca56743392b8c9070afe3ca95a" protocol=ttrpc version=3
Aug 13 00:35:09.133859 systemd[1]: Started cri-containerd-a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7.scope - libcontainer container a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7.
Aug 13 00:35:09.162390 containerd[1566]: time="2025-08-13T00:35:09.161780911Z" level=info msg="StartContainer for \"a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7\" returns successfully"
Aug 13 00:35:09.776323 kubelet[2755]: I0813 00:35:09.776182 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-f8n45" podStartSLOduration=4.776165095 podStartE2EDuration="4.776165095s" podCreationTimestamp="2025-08-13 00:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:35:06.770627649 +0000 UTC m=+7.148214506" watchObservedRunningTime="2025-08-13 00:35:09.776165095 +0000 UTC m=+10.153751952"
Aug 13 00:35:09.779752 kubelet[2755]: I0813 00:35:09.776557 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-kj2hs" podStartSLOduration=1.353369357 podStartE2EDuration="3.776547914s" podCreationTimestamp="2025-08-13 00:35:06 +0000 UTC" firstStartedPulling="2025-08-13 00:35:06.661011335 +0000 UTC m=+7.038598192" lastFinishedPulling="2025-08-13 00:35:09.084189893 +0000 UTC m=+9.461776749" observedRunningTime="2025-08-13 00:35:09.776029457 +0000 UTC m=+10.153616324" watchObservedRunningTime="2025-08-13 00:35:09.776547914 +0000 UTC m=+10.154134781"
Aug 13 00:35:14.924475 sudo[1820]: pam_unix(sudo:session): session closed for user root
Aug 13 00:35:15.081634 sshd[1819]: Connection closed by 139.178.89.65 port 57854
Aug 13 00:35:15.082623 sshd-session[1817]: pam_unix(sshd:session): session closed for user core
Aug 13 00:35:15.086075 systemd-logind[1533]: Session 7 logged out. Waiting for processes to exit.
Aug 13 00:35:15.087354 systemd[1]: sshd@6-46.62.157.78:22-139.178.89.65:57854.service: Deactivated successfully.
Aug 13 00:35:15.090625 systemd[1]: session-7.scope: Deactivated successfully.
Aug 13 00:35:15.092103 systemd[1]: session-7.scope: Consumed 5.898s CPU time, 159.2M memory peak.
Aug 13 00:35:15.095079 systemd-logind[1533]: Removed session 7.
Aug 13 00:35:17.561752 systemd[1]: Created slice kubepods-besteffort-pod9b749361_cbce_41ce_9101_0eebab26e440.slice - libcontainer container kubepods-besteffort-pod9b749361_cbce_41ce_9101_0eebab26e440.slice.
Aug 13 00:35:17.617213 kubelet[2755]: I0813 00:35:17.617112 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnt8b\" (UniqueName: \"kubernetes.io/projected/9b749361-cbce-41ce-9101-0eebab26e440-kube-api-access-tnt8b\") pod \"calico-typha-869cf6fd69-btdcs\" (UID: \"9b749361-cbce-41ce-9101-0eebab26e440\") " pod="calico-system/calico-typha-869cf6fd69-btdcs"
Aug 13 00:35:17.617213 kubelet[2755]: I0813 00:35:17.617145 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b749361-cbce-41ce-9101-0eebab26e440-tigera-ca-bundle\") pod \"calico-typha-869cf6fd69-btdcs\" (UID: \"9b749361-cbce-41ce-9101-0eebab26e440\") " pod="calico-system/calico-typha-869cf6fd69-btdcs"
Aug 13 00:35:17.617213 kubelet[2755]: I0813 00:35:17.617158 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9b749361-cbce-41ce-9101-0eebab26e440-typha-certs\") pod \"calico-typha-869cf6fd69-btdcs\" (UID: \"9b749361-cbce-41ce-9101-0eebab26e440\") " pod="calico-system/calico-typha-869cf6fd69-btdcs"
Aug 13 00:35:17.876335 containerd[1566]: time="2025-08-13T00:35:17.876276041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-869cf6fd69-btdcs,Uid:9b749361-cbce-41ce-9101-0eebab26e440,Namespace:calico-system,Attempt:0,}"
Aug 13 00:35:17.915080 containerd[1566]: time="2025-08-13T00:35:17.914921957Z" level=info msg="connecting to shim 25f7b6352ce6c4502d6d00ce2ddcd8839dbb4ffdcdf333644ff6c1a65bd2e3b3" address="unix:///run/containerd/s/7d4bca30a2d846e6a4a3d676b807dbfe62f810cee6d6fd0768a184cffff968ad" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:35:17.951990 systemd[1]: Started cri-containerd-25f7b6352ce6c4502d6d00ce2ddcd8839dbb4ffdcdf333644ff6c1a65bd2e3b3.scope - libcontainer container 25f7b6352ce6c4502d6d00ce2ddcd8839dbb4ffdcdf333644ff6c1a65bd2e3b3.
Aug 13 00:35:17.968386 systemd[1]: Created slice kubepods-besteffort-pod81a4343c_21d2_416f_bed6_05f918f0b85f.slice - libcontainer container kubepods-besteffort-pod81a4343c_21d2_416f_bed6_05f918f0b85f.slice.
Aug 13 00:35:18.020112 kubelet[2755]: I0813 00:35:18.020078 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-cni-net-dir\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020230 kubelet[2755]: I0813 00:35:18.020151 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k94kq\" (UniqueName: \"kubernetes.io/projected/81a4343c-21d2-416f-bed6-05f918f0b85f-kube-api-access-k94kq\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020230 kubelet[2755]: I0813 00:35:18.020168 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/81a4343c-21d2-416f-bed6-05f918f0b85f-node-certs\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020230 kubelet[2755]: I0813 00:35:18.020182 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-xtables-lock\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020295 kubelet[2755]: I0813 00:35:18.020246 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a4343c-21d2-416f-bed6-05f918f0b85f-tigera-ca-bundle\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020295 kubelet[2755]: I0813 00:35:18.020261 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-var-lib-calico\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020295 kubelet[2755]: I0813 00:35:18.020272 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-var-run-calico\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020349 kubelet[2755]: I0813 00:35:18.020288 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-cni-bin-dir\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020367 kubelet[2755]: I0813 00:35:18.020348 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-cni-log-dir\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020367 kubelet[2755]: I0813 00:35:18.020360 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-lib-modules\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020886 kubelet[2755]: I0813 00:35:18.020375 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-flexvol-driver-host\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.020886 kubelet[2755]: I0813 00:35:18.020440 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/81a4343c-21d2-416f-bed6-05f918f0b85f-policysync\") pod \"calico-node-5jpt2\" (UID: \"81a4343c-21d2-416f-bed6-05f918f0b85f\") " pod="calico-system/calico-node-5jpt2"
Aug 13 00:35:18.025154 containerd[1566]: time="2025-08-13T00:35:18.025120610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-869cf6fd69-btdcs,Uid:9b749361-cbce-41ce-9101-0eebab26e440,Namespace:calico-system,Attempt:0,} returns sandbox id \"25f7b6352ce6c4502d6d00ce2ddcd8839dbb4ffdcdf333644ff6c1a65bd2e3b3\""
Aug 13 00:35:18.027585 containerd[1566]: time="2025-08-13T00:35:18.027032024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\""
Aug 13 00:35:18.123547 kubelet[2755]: E0813 00:35:18.123474 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of
JSON input Aug 13 00:35:18.123933 kubelet[2755]: W0813 00:35:18.123631 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.123933 kubelet[2755]: E0813 00:35:18.123654 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.124015 kubelet[2755]: E0813 00:35:18.123969 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.124015 kubelet[2755]: W0813 00:35:18.123980 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.124015 kubelet[2755]: E0813 00:35:18.123991 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.124704 kubelet[2755]: E0813 00:35:18.124626 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.124704 kubelet[2755]: W0813 00:35:18.124638 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.124704 kubelet[2755]: E0813 00:35:18.124649 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.131883 kubelet[2755]: E0813 00:35:18.131784 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.131883 kubelet[2755]: W0813 00:35:18.131804 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.131883 kubelet[2755]: E0813 00:35:18.131822 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.140362 kubelet[2755]: E0813 00:35:18.140310 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.140362 kubelet[2755]: W0813 00:35:18.140331 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.140362 kubelet[2755]: E0813 00:35:18.140348 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.183360 kubelet[2755]: E0813 00:35:18.183296 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-m42j7" podUID="7f11d4ba-b27a-4a80-a109-96b122ee11da" Aug 13 00:35:18.214320 kubelet[2755]: E0813 00:35:18.214252 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.214320 kubelet[2755]: W0813 00:35:18.214272 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.214320 kubelet[2755]: E0813 00:35:18.214290 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.215148 kubelet[2755]: E0813 00:35:18.215105 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.215148 kubelet[2755]: W0813 00:35:18.215115 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.215148 kubelet[2755]: E0813 00:35:18.215124 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.215433 kubelet[2755]: E0813 00:35:18.215372 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.215433 kubelet[2755]: W0813 00:35:18.215381 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.215433 kubelet[2755]: E0813 00:35:18.215389 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.215714 kubelet[2755]: E0813 00:35:18.215686 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.215822 kubelet[2755]: W0813 00:35:18.215772 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.215822 kubelet[2755]: E0813 00:35:18.215785 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.216058 kubelet[2755]: E0813 00:35:18.216006 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.216058 kubelet[2755]: W0813 00:35:18.216015 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.216058 kubelet[2755]: E0813 00:35:18.216023 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.216307 kubelet[2755]: E0813 00:35:18.216257 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.216307 kubelet[2755]: W0813 00:35:18.216265 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.216307 kubelet[2755]: E0813 00:35:18.216272 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.216715 kubelet[2755]: E0813 00:35:18.216541 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.216865 kubelet[2755]: W0813 00:35:18.216770 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.216865 kubelet[2755]: E0813 00:35:18.216787 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.217034 kubelet[2755]: E0813 00:35:18.217025 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.217184 kubelet[2755]: W0813 00:35:18.217108 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.217184 kubelet[2755]: E0813 00:35:18.217119 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.217471 kubelet[2755]: E0813 00:35:18.217459 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.217617 kubelet[2755]: W0813 00:35:18.217519 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.217617 kubelet[2755]: E0813 00:35:18.217529 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.218240 kubelet[2755]: E0813 00:35:18.218205 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.218240 kubelet[2755]: W0813 00:35:18.218214 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.218240 kubelet[2755]: E0813 00:35:18.218222 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.218487 kubelet[2755]: E0813 00:35:18.218441 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.218487 kubelet[2755]: W0813 00:35:18.218449 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.218487 kubelet[2755]: E0813 00:35:18.218456 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.218767 kubelet[2755]: E0813 00:35:18.218717 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.218767 kubelet[2755]: W0813 00:35:18.218726 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.218767 kubelet[2755]: E0813 00:35:18.218733 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.219037 kubelet[2755]: E0813 00:35:18.218976 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.219037 kubelet[2755]: W0813 00:35:18.218985 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.219037 kubelet[2755]: E0813 00:35:18.219005 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.219289 kubelet[2755]: E0813 00:35:18.219236 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.219289 kubelet[2755]: W0813 00:35:18.219245 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.219289 kubelet[2755]: E0813 00:35:18.219252 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.219894 kubelet[2755]: E0813 00:35:18.219843 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.219894 kubelet[2755]: W0813 00:35:18.219852 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.219894 kubelet[2755]: E0813 00:35:18.219862 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.220189 kubelet[2755]: E0813 00:35:18.220139 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.220189 kubelet[2755]: W0813 00:35:18.220147 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.220189 kubelet[2755]: E0813 00:35:18.220155 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.220384 kubelet[2755]: E0813 00:35:18.220375 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.220485 kubelet[2755]: W0813 00:35:18.220439 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.220485 kubelet[2755]: E0813 00:35:18.220450 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.220742 kubelet[2755]: E0813 00:35:18.220660 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.220742 kubelet[2755]: W0813 00:35:18.220668 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.220742 kubelet[2755]: E0813 00:35:18.220675 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.221131 kubelet[2755]: E0813 00:35:18.221047 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.221131 kubelet[2755]: W0813 00:35:18.221056 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.221131 kubelet[2755]: E0813 00:35:18.221063 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.221859 kubelet[2755]: E0813 00:35:18.221827 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.221859 kubelet[2755]: W0813 00:35:18.221838 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.221859 kubelet[2755]: E0813 00:35:18.221846 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.222442 kubelet[2755]: E0813 00:35:18.222412 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.222442 kubelet[2755]: W0813 00:35:18.222421 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.222442 kubelet[2755]: E0813 00:35:18.222429 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.222550 kubelet[2755]: I0813 00:35:18.222539 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f11d4ba-b27a-4a80-a109-96b122ee11da-kubelet-dir\") pod \"csi-node-driver-m42j7\" (UID: \"7f11d4ba-b27a-4a80-a109-96b122ee11da\") " pod="calico-system/csi-node-driver-m42j7" Aug 13 00:35:18.222869 kubelet[2755]: E0813 00:35:18.222786 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.223020 kubelet[2755]: W0813 00:35:18.222964 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.223105 kubelet[2755]: E0813 00:35:18.223061 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.223350 kubelet[2755]: E0813 00:35:18.223326 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.223350 kubelet[2755]: W0813 00:35:18.223334 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.223350 kubelet[2755]: E0813 00:35:18.223342 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.223600 kubelet[2755]: E0813 00:35:18.223574 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.223600 kubelet[2755]: W0813 00:35:18.223582 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.223600 kubelet[2755]: E0813 00:35:18.223590 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.223809 kubelet[2755]: I0813 00:35:18.223744 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7f11d4ba-b27a-4a80-a109-96b122ee11da-varrun\") pod \"csi-node-driver-m42j7\" (UID: \"7f11d4ba-b27a-4a80-a109-96b122ee11da\") " pod="calico-system/csi-node-driver-m42j7" Aug 13 00:35:18.224029 kubelet[2755]: E0813 00:35:18.223966 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.224029 kubelet[2755]: W0813 00:35:18.223975 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.224029 kubelet[2755]: E0813 00:35:18.224016 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.224837 kubelet[2755]: I0813 00:35:18.224740 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f11d4ba-b27a-4a80-a109-96b122ee11da-registration-dir\") pod \"csi-node-driver-m42j7\" (UID: \"7f11d4ba-b27a-4a80-a109-96b122ee11da\") " pod="calico-system/csi-node-driver-m42j7" Aug 13 00:35:18.225074 kubelet[2755]: E0813 00:35:18.225062 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.225138 kubelet[2755]: W0813 00:35:18.225117 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.225138 kubelet[2755]: E0813 00:35:18.225129 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.225297 kubelet[2755]: I0813 00:35:18.225274 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7f11d4ba-b27a-4a80-a109-96b122ee11da-socket-dir\") pod \"csi-node-driver-m42j7\" (UID: \"7f11d4ba-b27a-4a80-a109-96b122ee11da\") " pod="calico-system/csi-node-driver-m42j7" Aug 13 00:35:18.225386 kubelet[2755]: E0813 00:35:18.225363 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.225386 kubelet[2755]: W0813 00:35:18.225371 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.225386 kubelet[2755]: E0813 00:35:18.225378 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.225614 kubelet[2755]: E0813 00:35:18.225590 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.225614 kubelet[2755]: W0813 00:35:18.225598 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.225614 kubelet[2755]: E0813 00:35:18.225606 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.225878 kubelet[2755]: E0813 00:35:18.225855 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.225878 kubelet[2755]: W0813 00:35:18.225869 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.225932 kubelet[2755]: E0813 00:35:18.225879 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.225932 kubelet[2755]: I0813 00:35:18.225908 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wf8h\" (UniqueName: \"kubernetes.io/projected/7f11d4ba-b27a-4a80-a109-96b122ee11da-kube-api-access-6wf8h\") pod \"csi-node-driver-m42j7\" (UID: \"7f11d4ba-b27a-4a80-a109-96b122ee11da\") " pod="calico-system/csi-node-driver-m42j7" Aug 13 00:35:18.226193 kubelet[2755]: E0813 00:35:18.226174 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.226193 kubelet[2755]: W0813 00:35:18.226188 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.226259 kubelet[2755]: E0813 00:35:18.226196 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.226578 kubelet[2755]: E0813 00:35:18.226320 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.226578 kubelet[2755]: W0813 00:35:18.226352 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.226578 kubelet[2755]: E0813 00:35:18.226360 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.226747 kubelet[2755]: E0813 00:35:18.226682 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.226819 kubelet[2755]: W0813 00:35:18.226803 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.226861 kubelet[2755]: E0813 00:35:18.226853 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:18.227053 kubelet[2755]: E0813 00:35:18.227043 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.227187 kubelet[2755]: W0813 00:35:18.227159 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.227187 kubelet[2755]: E0813 00:35:18.227171 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:18.227469 kubelet[2755]: E0813 00:35:18.227441 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:18.227762 kubelet[2755]: W0813 00:35:18.227744 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:18.227762 kubelet[2755]: E0813 00:35:18.227761 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Aug 13 00:35:18.227909 kubelet[2755]: E0813 00:35:18.227894 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.227909 kubelet[2755]: W0813 00:35:18.227906 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.227968 kubelet[2755]: E0813 00:35:18.227914 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.271624 containerd[1566]: time="2025-08-13T00:35:18.271579352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5jpt2,Uid:81a4343c-21d2-416f-bed6-05f918f0b85f,Namespace:calico-system,Attempt:0,}"
Aug 13 00:35:18.294041 containerd[1566]: time="2025-08-13T00:35:18.293946663Z" level=info msg="connecting to shim 9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12" address="unix:///run/containerd/s/32abafaa6178dd3e24e3879c7bc79b30bea7fd451bedc77db56c0fc189f76ef2" namespace=k8s.io protocol=ttrpc version=3
Aug 13 00:35:18.315900 systemd[1]: Started cri-containerd-9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12.scope - libcontainer container 9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12.
Aug 13 00:35:18.328506 kubelet[2755]: E0813 00:35:18.328381 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.328506 kubelet[2755]: W0813 00:35:18.328401 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.328506 kubelet[2755]: E0813 00:35:18.328420 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.328813 kubelet[2755]: E0813 00:35:18.328789 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.328959 kubelet[2755]: W0813 00:35:18.328877 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.328959 kubelet[2755]: E0813 00:35:18.328891 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.329250 kubelet[2755]: E0813 00:35:18.329143 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.329250 kubelet[2755]: W0813 00:35:18.329156 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.329250 kubelet[2755]: E0813 00:35:18.329167 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.329415 kubelet[2755]: E0813 00:35:18.329395 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.329415 kubelet[2755]: W0813 00:35:18.329410 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.329647 kubelet[2755]: E0813 00:35:18.329421 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.330829 kubelet[2755]: E0813 00:35:18.330814 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.330829 kubelet[2755]: W0813 00:35:18.330828 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.330942 kubelet[2755]: E0813 00:35:18.330837 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.331373 kubelet[2755]: E0813 00:35:18.331315 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.331901 kubelet[2755]: W0813 00:35:18.331812 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.331901 kubelet[2755]: E0813 00:35:18.331829 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.332314 kubelet[2755]: E0813 00:35:18.332293 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.332314 kubelet[2755]: W0813 00:35:18.332310 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.332510 kubelet[2755]: E0813 00:35:18.332320 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.332510 kubelet[2755]: E0813 00:35:18.332458 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.332510 kubelet[2755]: W0813 00:35:18.332466 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.332510 kubelet[2755]: E0813 00:35:18.332474 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.333018 kubelet[2755]: E0813 00:35:18.332629 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.333018 kubelet[2755]: W0813 00:35:18.332636 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.333018 kubelet[2755]: E0813 00:35:18.332644 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.333018 kubelet[2755]: E0813 00:35:18.332989 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.333115 kubelet[2755]: W0813 00:35:18.333079 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.333115 kubelet[2755]: E0813 00:35:18.333093 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.333615 kubelet[2755]: E0813 00:35:18.333597 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.333615 kubelet[2755]: W0813 00:35:18.333614 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.333679 kubelet[2755]: E0813 00:35:18.333625 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.334724 kubelet[2755]: E0813 00:35:18.333990 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.334724 kubelet[2755]: W0813 00:35:18.334002 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.334724 kubelet[2755]: E0813 00:35:18.334015 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.334724 kubelet[2755]: E0813 00:35:18.334581 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.334724 kubelet[2755]: W0813 00:35:18.334591 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.334724 kubelet[2755]: E0813 00:35:18.334600 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.335269 kubelet[2755]: E0813 00:35:18.335253 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.335309 kubelet[2755]: W0813 00:35:18.335270 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.335309 kubelet[2755]: E0813 00:35:18.335281 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.335752 kubelet[2755]: E0813 00:35:18.335735 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.335752 kubelet[2755]: W0813 00:35:18.335748 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.335831 kubelet[2755]: E0813 00:35:18.335756 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.336770 kubelet[2755]: E0813 00:35:18.336753 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.336770 kubelet[2755]: W0813 00:35:18.336767 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.336849 kubelet[2755]: E0813 00:35:18.336775 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.336965 kubelet[2755]: E0813 00:35:18.336943 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.336965 kubelet[2755]: W0813 00:35:18.336952 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.336965 kubelet[2755]: E0813 00:35:18.336958 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.337238 kubelet[2755]: E0813 00:35:18.337157 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.337238 kubelet[2755]: W0813 00:35:18.337165 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.337238 kubelet[2755]: E0813 00:35:18.337172 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.337537 kubelet[2755]: E0813 00:35:18.337520 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.337537 kubelet[2755]: W0813 00:35:18.337532 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.337588 kubelet[2755]: E0813 00:35:18.337540 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.337991 kubelet[2755]: E0813 00:35:18.337975 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.337991 kubelet[2755]: W0813 00:35:18.337986 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.338092 kubelet[2755]: E0813 00:35:18.337994 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.338265 kubelet[2755]: E0813 00:35:18.338233 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.338265 kubelet[2755]: W0813 00:35:18.338261 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.338325 kubelet[2755]: E0813 00:35:18.338269 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.339053 kubelet[2755]: E0813 00:35:18.339023 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.339380 kubelet[2755]: W0813 00:35:18.339036 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.339380 kubelet[2755]: E0813 00:35:18.339129 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.339800 kubelet[2755]: E0813 00:35:18.339783 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.339800 kubelet[2755]: W0813 00:35:18.339797 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.339870 kubelet[2755]: E0813 00:35:18.339804 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.340158 kubelet[2755]: E0813 00:35:18.340140 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.340158 kubelet[2755]: W0813 00:35:18.340149 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.340545 kubelet[2755]: E0813 00:35:18.340527 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.341071 kubelet[2755]: E0813 00:35:18.340853 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.341071 kubelet[2755]: W0813 00:35:18.340861 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.341071 kubelet[2755]: E0813 00:35:18.340869 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:18.355208 kubelet[2755]: E0813 00:35:18.355010 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:18.355208 kubelet[2755]: W0813 00:35:18.355027 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:18.355208 kubelet[2755]: E0813 00:35:18.355044 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:18.361945 containerd[1566]: time="2025-08-13T00:35:18.361881480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5jpt2,Uid:81a4343c-21d2-416f-bed6-05f918f0b85f,Namespace:calico-system,Attempt:0,} returns sandbox id \"9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12\""
Aug 13 00:35:19.731726 kubelet[2755]: E0813 00:35:19.731533 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-m42j7" podUID="7f11d4ba-b27a-4a80-a109-96b122ee11da"
Aug 13 00:35:19.871165 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount179152598.mount: Deactivated successfully.
Aug 13 00:35:21.127506 containerd[1566]: time="2025-08-13T00:35:21.127441303Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:21.128635 containerd[1566]: time="2025-08-13T00:35:21.128606864Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364"
Aug 13 00:35:21.129719 containerd[1566]: time="2025-08-13T00:35:21.129359352Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:21.131095 containerd[1566]: time="2025-08-13T00:35:21.131076192Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:21.131495 containerd[1566]: time="2025-08-13T00:35:21.131470111Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 3.104417671s"
Aug 13 00:35:21.131534 containerd[1566]: time="2025-08-13T00:35:21.131497352Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\""
Aug 13 00:35:21.133058 containerd[1566]: time="2025-08-13T00:35:21.133040374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\""
Aug 13 00:35:21.143546 containerd[1566]: time="2025-08-13T00:35:21.143520495Z" level=info msg="CreateContainer within sandbox \"25f7b6352ce6c4502d6d00ce2ddcd8839dbb4ffdcdf333644ff6c1a65bd2e3b3\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Aug 13 00:35:21.149838 containerd[1566]: time="2025-08-13T00:35:21.148796456Z" level=info msg="Container 6a044ba9bc8c7e2b3d30368c2b1f93b589e5d979e73b348bfaf6441ac4ae36f4: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:21.155797 containerd[1566]: time="2025-08-13T00:35:21.155776047Z" level=info msg="CreateContainer within sandbox \"25f7b6352ce6c4502d6d00ce2ddcd8839dbb4ffdcdf333644ff6c1a65bd2e3b3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"6a044ba9bc8c7e2b3d30368c2b1f93b589e5d979e73b348bfaf6441ac4ae36f4\""
Aug 13 00:35:21.158256 containerd[1566]: time="2025-08-13T00:35:21.157328310Z" level=info msg="StartContainer for \"6a044ba9bc8c7e2b3d30368c2b1f93b589e5d979e73b348bfaf6441ac4ae36f4\""
Aug 13 00:35:21.159084 containerd[1566]: time="2025-08-13T00:35:21.158683293Z" level=info msg="connecting to shim 6a044ba9bc8c7e2b3d30368c2b1f93b589e5d979e73b348bfaf6441ac4ae36f4" address="unix:///run/containerd/s/7d4bca30a2d846e6a4a3d676b807dbfe62f810cee6d6fd0768a184cffff968ad" protocol=ttrpc version=3
Aug 13 00:35:21.178861 systemd[1]: Started cri-containerd-6a044ba9bc8c7e2b3d30368c2b1f93b589e5d979e73b348bfaf6441ac4ae36f4.scope - libcontainer container 6a044ba9bc8c7e2b3d30368c2b1f93b589e5d979e73b348bfaf6441ac4ae36f4.
Aug 13 00:35:21.229476 containerd[1566]: time="2025-08-13T00:35:21.229358855Z" level=info msg="StartContainer for \"6a044ba9bc8c7e2b3d30368c2b1f93b589e5d979e73b348bfaf6441ac4ae36f4\" returns successfully"
Aug 13 00:35:21.730711 kubelet[2755]: E0813 00:35:21.729869 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-m42j7" podUID="7f11d4ba-b27a-4a80-a109-96b122ee11da"
Aug 13 00:35:21.826381 kubelet[2755]: I0813 00:35:21.826205 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-869cf6fd69-btdcs" podStartSLOduration=1.71910371 podStartE2EDuration="4.825327783s" podCreationTimestamp="2025-08-13 00:35:17 +0000 UTC" firstStartedPulling="2025-08-13 00:35:18.026540166 +0000 UTC m=+18.404127023" lastFinishedPulling="2025-08-13 00:35:21.132764239 +0000 UTC m=+21.510351096" observedRunningTime="2025-08-13 00:35:21.813486716 +0000 UTC m=+22.191073583" watchObservedRunningTime="2025-08-13 00:35:21.825327783 +0000 UTC m=+22.202914650"
Aug 13 00:35:21.846655 kubelet[2755]: E0813 00:35:21.846617 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.846655 kubelet[2755]: W0813 00:35:21.846645 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.846655 kubelet[2755]: E0813 00:35:21.846667 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:21.846954 kubelet[2755]: E0813 00:35:21.846879 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.846954 kubelet[2755]: W0813 00:35:21.846901 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.846954 kubelet[2755]: E0813 00:35:21.846909 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:21.847609 kubelet[2755]: E0813 00:35:21.847102 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.847609 kubelet[2755]: W0813 00:35:21.847112 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.847609 kubelet[2755]: E0813 00:35:21.847122 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:21.847609 kubelet[2755]: E0813 00:35:21.847305 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.847609 kubelet[2755]: W0813 00:35:21.847313 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.847609 kubelet[2755]: E0813 00:35:21.847323 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:21.847609 kubelet[2755]: E0813 00:35:21.847506 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.847609 kubelet[2755]: W0813 00:35:21.847515 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.847609 kubelet[2755]: E0813 00:35:21.847525 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:21.847889 kubelet[2755]: E0813 00:35:21.847811 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.847889 kubelet[2755]: W0813 00:35:21.847820 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.847889 kubelet[2755]: E0813 00:35:21.847828 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:21.848005 kubelet[2755]: E0813 00:35:21.847979 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.848005 kubelet[2755]: W0813 00:35:21.847992 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.848005 kubelet[2755]: E0813 00:35:21.848000 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:21.848180 kubelet[2755]: E0813 00:35:21.848157 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.848180 kubelet[2755]: W0813 00:35:21.848170 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.848180 kubelet[2755]: E0813 00:35:21.848178 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:21.848585 kubelet[2755]: E0813 00:35:21.848337 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.848585 kubelet[2755]: W0813 00:35:21.848345 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.848585 kubelet[2755]: E0813 00:35:21.848353 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:21.848585 kubelet[2755]: E0813 00:35:21.848525 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.848585 kubelet[2755]: W0813 00:35:21.848532 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.848585 kubelet[2755]: E0813 00:35:21.848540 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:21.849051 kubelet[2755]: E0813 00:35:21.848724 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.849051 kubelet[2755]: W0813 00:35:21.848733 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.849051 kubelet[2755]: E0813 00:35:21.848741 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 13 00:35:21.849051 kubelet[2755]: E0813 00:35:21.848896 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.849051 kubelet[2755]: W0813 00:35:21.848905 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.849051 kubelet[2755]: E0813 00:35:21.848912 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Aug 13 00:35:21.849546 kubelet[2755]: E0813 00:35:21.849197 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 13 00:35:21.849546 kubelet[2755]: W0813 00:35:21.849205 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 13 00:35:21.849546 kubelet[2755]: E0813 00:35:21.849213 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Aug 13 00:35:21.849546 kubelet[2755]: E0813 00:35:21.849392 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.849546 kubelet[2755]: W0813 00:35:21.849415 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.849546 kubelet[2755]: E0813 00:35:21.849423 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.849806 kubelet[2755]: E0813 00:35:21.849781 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.849806 kubelet[2755]: W0813 00:35:21.849807 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.849873 kubelet[2755]: E0813 00:35:21.849815 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.857087 kubelet[2755]: E0813 00:35:21.857071 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.857087 kubelet[2755]: W0813 00:35:21.857085 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.857322 kubelet[2755]: E0813 00:35:21.857095 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.857322 kubelet[2755]: E0813 00:35:21.857303 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.857677 kubelet[2755]: W0813 00:35:21.857331 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.857677 kubelet[2755]: E0813 00:35:21.857340 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.857849 kubelet[2755]: E0813 00:35:21.857835 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.857969 kubelet[2755]: W0813 00:35:21.857944 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.857969 kubelet[2755]: E0813 00:35:21.857963 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.858199 kubelet[2755]: E0813 00:35:21.858172 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.858199 kubelet[2755]: W0813 00:35:21.858186 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.858199 kubelet[2755]: E0813 00:35:21.858196 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.858391 kubelet[2755]: E0813 00:35:21.858369 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.858391 kubelet[2755]: W0813 00:35:21.858383 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.858391 kubelet[2755]: E0813 00:35:21.858393 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.858627 kubelet[2755]: E0813 00:35:21.858608 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.858627 kubelet[2755]: W0813 00:35:21.858621 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.858945 kubelet[2755]: E0813 00:35:21.858629 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.858945 kubelet[2755]: E0813 00:35:21.858789 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.858945 kubelet[2755]: W0813 00:35:21.858797 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.858945 kubelet[2755]: E0813 00:35:21.858804 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.858945 kubelet[2755]: E0813 00:35:21.858939 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.858945 kubelet[2755]: W0813 00:35:21.858946 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.859087 kubelet[2755]: E0813 00:35:21.858954 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.859087 kubelet[2755]: E0813 00:35:21.859062 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.859087 kubelet[2755]: W0813 00:35:21.859069 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.859087 kubelet[2755]: E0813 00:35:21.859076 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.859185 kubelet[2755]: E0813 00:35:21.859167 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.859185 kubelet[2755]: W0813 00:35:21.859180 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.859236 kubelet[2755]: E0813 00:35:21.859188 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.859549 kubelet[2755]: E0813 00:35:21.859345 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.859549 kubelet[2755]: W0813 00:35:21.859358 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.859549 kubelet[2755]: E0813 00:35:21.859368 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.859678 kubelet[2755]: E0813 00:35:21.859666 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.859810 kubelet[2755]: W0813 00:35:21.859767 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.860159 kubelet[2755]: E0813 00:35:21.859829 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.860159 kubelet[2755]: E0813 00:35:21.859989 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.860159 kubelet[2755]: W0813 00:35:21.859997 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.860159 kubelet[2755]: E0813 00:35:21.860005 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.860311 kubelet[2755]: E0813 00:35:21.860290 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.860311 kubelet[2755]: W0813 00:35:21.860303 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.860311 kubelet[2755]: E0813 00:35:21.860312 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.860513 kubelet[2755]: E0813 00:35:21.860491 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.860513 kubelet[2755]: W0813 00:35:21.860506 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.860513 kubelet[2755]: E0813 00:35:21.860514 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.860711 kubelet[2755]: E0813 00:35:21.860676 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.860759 kubelet[2755]: W0813 00:35:21.860692 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.860759 kubelet[2755]: E0813 00:35:21.860735 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:21.860939 kubelet[2755]: E0813 00:35:21.860913 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.860939 kubelet[2755]: W0813 00:35:21.860930 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.860939 kubelet[2755]: E0813 00:35:21.860938 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:21.861348 kubelet[2755]: E0813 00:35:21.861315 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:21.861348 kubelet[2755]: W0813 00:35:21.861329 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:21.861433 kubelet[2755]: E0813 00:35:21.861350 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.860578 kubelet[2755]: E0813 00:35:22.860097 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.860578 kubelet[2755]: W0813 00:35:22.860397 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.860578 kubelet[2755]: E0813 00:35:22.860420 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.862123 kubelet[2755]: E0813 00:35:22.861925 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.862123 kubelet[2755]: W0813 00:35:22.861938 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.862123 kubelet[2755]: E0813 00:35:22.861955 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.862963 kubelet[2755]: E0813 00:35:22.862282 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.862963 kubelet[2755]: W0813 00:35:22.862292 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.862963 kubelet[2755]: E0813 00:35:22.862304 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.862963 kubelet[2755]: E0813 00:35:22.862438 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.862963 kubelet[2755]: W0813 00:35:22.862445 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.862963 kubelet[2755]: E0813 00:35:22.862455 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.862963 kubelet[2755]: E0813 00:35:22.862598 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.862963 kubelet[2755]: W0813 00:35:22.862613 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.862963 kubelet[2755]: E0813 00:35:22.862622 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.862963 kubelet[2755]: E0813 00:35:22.862813 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.863171 kubelet[2755]: W0813 00:35:22.862821 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.863171 kubelet[2755]: E0813 00:35:22.862829 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.863171 kubelet[2755]: E0813 00:35:22.862969 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.863171 kubelet[2755]: W0813 00:35:22.862977 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.863171 kubelet[2755]: E0813 00:35:22.862986 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.863171 kubelet[2755]: E0813 00:35:22.863088 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.863171 kubelet[2755]: W0813 00:35:22.863096 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.863171 kubelet[2755]: E0813 00:35:22.863103 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.863324 kubelet[2755]: E0813 00:35:22.863217 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.863324 kubelet[2755]: W0813 00:35:22.863225 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.863324 kubelet[2755]: E0813 00:35:22.863232 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.863386 kubelet[2755]: E0813 00:35:22.863327 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.863386 kubelet[2755]: W0813 00:35:22.863334 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.863386 kubelet[2755]: E0813 00:35:22.863341 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.863469 kubelet[2755]: E0813 00:35:22.863432 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.863469 kubelet[2755]: W0813 00:35:22.863438 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.863469 kubelet[2755]: E0813 00:35:22.863446 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.863553 kubelet[2755]: E0813 00:35:22.863546 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.863581 kubelet[2755]: W0813 00:35:22.863553 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.863581 kubelet[2755]: E0813 00:35:22.863561 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.864736 kubelet[2755]: E0813 00:35:22.863735 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.864736 kubelet[2755]: W0813 00:35:22.863763 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.864736 kubelet[2755]: E0813 00:35:22.863785 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.864736 kubelet[2755]: E0813 00:35:22.863960 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.864736 kubelet[2755]: W0813 00:35:22.863997 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.864736 kubelet[2755]: E0813 00:35:22.864008 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.864736 kubelet[2755]: E0813 00:35:22.864130 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.864736 kubelet[2755]: W0813 00:35:22.864137 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.864736 kubelet[2755]: E0813 00:35:22.864172 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.867413 kubelet[2755]: E0813 00:35:22.867397 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.867654 kubelet[2755]: W0813 00:35:22.867511 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.867654 kubelet[2755]: E0813 00:35:22.867536 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.867898 kubelet[2755]: E0813 00:35:22.867857 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.867898 kubelet[2755]: W0813 00:35:22.867879 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.867898 kubelet[2755]: E0813 00:35:22.867891 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.868218 kubelet[2755]: E0813 00:35:22.868199 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.868218 kubelet[2755]: W0813 00:35:22.868213 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.868316 kubelet[2755]: E0813 00:35:22.868226 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.868676 kubelet[2755]: E0813 00:35:22.868648 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.868676 kubelet[2755]: W0813 00:35:22.868663 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.868676 kubelet[2755]: E0813 00:35:22.868674 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.869058 kubelet[2755]: E0813 00:35:22.869023 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.869208 kubelet[2755]: W0813 00:35:22.869184 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.869208 kubelet[2755]: E0813 00:35:22.869201 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.869580 kubelet[2755]: E0813 00:35:22.869555 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.869580 kubelet[2755]: W0813 00:35:22.869570 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.869580 kubelet[2755]: E0813 00:35:22.869579 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.870071 kubelet[2755]: E0813 00:35:22.870044 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.870071 kubelet[2755]: W0813 00:35:22.870061 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.870071 kubelet[2755]: E0813 00:35:22.870070 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.870366 kubelet[2755]: E0813 00:35:22.870340 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.870366 kubelet[2755]: W0813 00:35:22.870355 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.870366 kubelet[2755]: E0813 00:35:22.870364 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.871038 kubelet[2755]: E0813 00:35:22.871013 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.871038 kubelet[2755]: W0813 00:35:22.871030 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.871038 kubelet[2755]: E0813 00:35:22.871040 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.872013 kubelet[2755]: E0813 00:35:22.871988 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.872013 kubelet[2755]: W0813 00:35:22.872003 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.872013 kubelet[2755]: E0813 00:35:22.872013 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.872337 kubelet[2755]: E0813 00:35:22.872295 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.872337 kubelet[2755]: W0813 00:35:22.872312 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.872337 kubelet[2755]: E0813 00:35:22.872321 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.872731 kubelet[2755]: E0813 00:35:22.872711 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.872731 kubelet[2755]: W0813 00:35:22.872724 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.872731 kubelet[2755]: E0813 00:35:22.872733 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.872999 kubelet[2755]: E0813 00:35:22.872973 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.872999 kubelet[2755]: W0813 00:35:22.872988 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.872999 kubelet[2755]: E0813 00:35:22.872997 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.873331 kubelet[2755]: E0813 00:35:22.873306 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.873331 kubelet[2755]: W0813 00:35:22.873320 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.873472 kubelet[2755]: E0813 00:35:22.873437 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.874187 kubelet[2755]: E0813 00:35:22.874112 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.874187 kubelet[2755]: W0813 00:35:22.874129 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.874187 kubelet[2755]: E0813 00:35:22.874138 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.874799 kubelet[2755]: E0813 00:35:22.874775 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.874799 kubelet[2755]: W0813 00:35:22.874789 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.874799 kubelet[2755]: E0813 00:35:22.874798 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.875334 kubelet[2755]: E0813 00:35:22.875306 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.875420 kubelet[2755]: W0813 00:35:22.875320 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.875420 kubelet[2755]: E0813 00:35:22.875418 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 13 00:35:22.875776 kubelet[2755]: E0813 00:35:22.875737 2755 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 13 00:35:22.875776 kubelet[2755]: W0813 00:35:22.875766 2755 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 13 00:35:22.875776 kubelet[2755]: E0813 00:35:22.875775 2755 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 13 00:35:22.920299 containerd[1566]: time="2025-08-13T00:35:22.919745070Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:22.920675 containerd[1566]: time="2025-08-13T00:35:22.920654442Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Aug 13 00:35:22.922147 containerd[1566]: time="2025-08-13T00:35:22.922125026Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:22.924833 containerd[1566]: time="2025-08-13T00:35:22.924757060Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:22.925197 containerd[1566]: time="2025-08-13T00:35:22.925148215Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.792028855s" Aug 13 00:35:22.925261 containerd[1566]: time="2025-08-13T00:35:22.925198166Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Aug 13 00:35:22.929066 containerd[1566]: time="2025-08-13T00:35:22.929023820Z" level=info msg="CreateContainer within sandbox \"9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 13 00:35:22.940883 containerd[1566]: time="2025-08-13T00:35:22.939721841Z" level=info msg="Container 44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:22.953733 containerd[1566]: time="2025-08-13T00:35:22.953646652Z" level=info msg="CreateContainer within sandbox \"9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05\"" Aug 13 00:35:22.954941 containerd[1566]: time="2025-08-13T00:35:22.954715365Z" level=info msg="StartContainer for \"44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05\"" Aug 13 00:35:22.956649 containerd[1566]: time="2025-08-13T00:35:22.956626193Z" level=info msg="connecting to shim 44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05" address="unix:///run/containerd/s/32abafaa6178dd3e24e3879c7bc79b30bea7fd451bedc77db56c0fc189f76ef2" protocol=ttrpc version=3 Aug 13 00:35:22.981859 systemd[1]: Started cri-containerd-44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05.scope - libcontainer container 
44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05. Aug 13 00:35:23.050606 containerd[1566]: time="2025-08-13T00:35:23.050562507Z" level=info msg="StartContainer for \"44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05\" returns successfully" Aug 13 00:35:23.058147 systemd[1]: cri-containerd-44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05.scope: Deactivated successfully. Aug 13 00:35:23.082677 containerd[1566]: time="2025-08-13T00:35:23.082599650Z" level=info msg="TaskExit event in podsandbox handler container_id:\"44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05\" id:\"44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05\" pid:3490 exited_at:{seconds:1755045323 nanos:57850065}" Aug 13 00:35:23.083446 containerd[1566]: time="2025-08-13T00:35:23.083397422Z" level=info msg="received exit event container_id:\"44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05\" id:\"44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05\" pid:3490 exited_at:{seconds:1755045323 nanos:57850065}" Aug 13 00:35:23.105299 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-44eaeeeb7d297b23d85d4d9a9bf8a7464108d6b73fdfd09c3f844cc291a3cc05-rootfs.mount: Deactivated successfully. 
Aug 13 00:35:23.730639 kubelet[2755]: E0813 00:35:23.730059 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-m42j7" podUID="7f11d4ba-b27a-4a80-a109-96b122ee11da" Aug 13 00:35:23.814925 containerd[1566]: time="2025-08-13T00:35:23.814862973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 13 00:35:25.729946 kubelet[2755]: E0813 00:35:25.729890 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-m42j7" podUID="7f11d4ba-b27a-4a80-a109-96b122ee11da" Aug 13 00:35:27.729758 kubelet[2755]: E0813 00:35:27.729719 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-m42j7" podUID="7f11d4ba-b27a-4a80-a109-96b122ee11da" Aug 13 00:35:27.832341 containerd[1566]: time="2025-08-13T00:35:27.832280013Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:27.833215 containerd[1566]: time="2025-08-13T00:35:27.833176467Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Aug 13 00:35:27.834305 containerd[1566]: time="2025-08-13T00:35:27.834266487Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:27.836071 containerd[1566]: 
time="2025-08-13T00:35:27.836042021Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:27.836619 containerd[1566]: time="2025-08-13T00:35:27.836587897Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 4.021669283s" Aug 13 00:35:27.836619 containerd[1566]: time="2025-08-13T00:35:27.836612590Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Aug 13 00:35:27.841014 containerd[1566]: time="2025-08-13T00:35:27.840983449Z" level=info msg="CreateContainer within sandbox \"9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 13 00:35:27.854953 containerd[1566]: time="2025-08-13T00:35:27.854126872Z" level=info msg="Container 2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:27.859224 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3976433433.mount: Deactivated successfully. 
Aug 13 00:35:27.863480 containerd[1566]: time="2025-08-13T00:35:27.863440835Z" level=info msg="CreateContainer within sandbox \"9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7\"" Aug 13 00:35:27.864369 containerd[1566]: time="2025-08-13T00:35:27.864171505Z" level=info msg="StartContainer for \"2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7\"" Aug 13 00:35:27.865589 containerd[1566]: time="2025-08-13T00:35:27.865559751Z" level=info msg="connecting to shim 2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7" address="unix:///run/containerd/s/32abafaa6178dd3e24e3879c7bc79b30bea7fd451bedc77db56c0fc189f76ef2" protocol=ttrpc version=3 Aug 13 00:35:27.888833 systemd[1]: Started cri-containerd-2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7.scope - libcontainer container 2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7. Aug 13 00:35:27.920576 containerd[1566]: time="2025-08-13T00:35:27.920528965Z" level=info msg="StartContainer for \"2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7\" returns successfully" Aug 13 00:35:28.278032 systemd[1]: cri-containerd-2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7.scope: Deactivated successfully. Aug 13 00:35:28.278255 systemd[1]: cri-containerd-2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7.scope: Consumed 345ms CPU time, 157.5M memory peak, 7M read from disk, 171.2M written to disk. 
Aug 13 00:35:28.339157 containerd[1566]: time="2025-08-13T00:35:28.339124662Z" level=info msg="received exit event container_id:\"2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7\" id:\"2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7\" pid:3548 exited_at:{seconds:1755045328 nanos:338916981}" Aug 13 00:35:28.339522 containerd[1566]: time="2025-08-13T00:35:28.339203189Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7\" id:\"2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7\" pid:3548 exited_at:{seconds:1755045328 nanos:338916981}" Aug 13 00:35:28.357490 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2d7e1cd7194deb08723c2a21441a84d8761a59e6f3b395ded535b2425a8e23f7-rootfs.mount: Deactivated successfully. Aug 13 00:35:28.362900 kubelet[2755]: I0813 00:35:28.362859 2755 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 13 00:35:28.428419 systemd[1]: Created slice kubepods-burstable-pod334452d9_3977_436e_9dcf_2778778479f8.slice - libcontainer container kubepods-burstable-pod334452d9_3977_436e_9dcf_2778778479f8.slice. Aug 13 00:35:28.437277 systemd[1]: Created slice kubepods-burstable-podda2b92e7_ec49_43a2_b5c9_35d5000b7d8f.slice - libcontainer container kubepods-burstable-podda2b92e7_ec49_43a2_b5c9_35d5000b7d8f.slice. Aug 13 00:35:28.445544 systemd[1]: Created slice kubepods-besteffort-pod6c4854d0_1965_49ea_9f27_a943b10dac0f.slice - libcontainer container kubepods-besteffort-pod6c4854d0_1965_49ea_9f27_a943b10dac0f.slice. Aug 13 00:35:28.454588 systemd[1]: Created slice kubepods-besteffort-pod69725ba8_05ad_4005_a951_7dc187af0ddc.slice - libcontainer container kubepods-besteffort-pod69725ba8_05ad_4005_a951_7dc187af0ddc.slice. 
Aug 13 00:35:28.460284 systemd[1]: Created slice kubepods-besteffort-pod64302c8f_d4eb_438d_8d05_03180a2fc1dd.slice - libcontainer container kubepods-besteffort-pod64302c8f_d4eb_438d_8d05_03180a2fc1dd.slice. Aug 13 00:35:28.466526 systemd[1]: Created slice kubepods-besteffort-poddc66db4b_26a1_4eb6_8d19_d3d299e80b8a.slice - libcontainer container kubepods-besteffort-poddc66db4b_26a1_4eb6_8d19_d3d299e80b8a.slice. Aug 13 00:35:28.471592 systemd[1]: Created slice kubepods-besteffort-pod0c695393_6132_4469_8782_33ba9afa51c3.slice - libcontainer container kubepods-besteffort-pod0c695393_6132_4469_8782_33ba9afa51c3.slice. Aug 13 00:35:28.507053 kubelet[2755]: I0813 00:35:28.506335 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46cv\" (UniqueName: \"kubernetes.io/projected/64302c8f-d4eb-438d-8d05-03180a2fc1dd-kube-api-access-h46cv\") pod \"calico-kube-controllers-7c5f9dd4c5-hhmqc\" (UID: \"64302c8f-d4eb-438d-8d05-03180a2fc1dd\") " pod="calico-system/calico-kube-controllers-7c5f9dd4c5-hhmqc" Aug 13 00:35:28.507053 kubelet[2755]: I0813 00:35:28.506371 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-ca-bundle\") pod \"whisker-fcb848d58-jjshp\" (UID: \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\") " pod="calico-system/whisker-fcb848d58-jjshp" Aug 13 00:35:28.507053 kubelet[2755]: I0813 00:35:28.506407 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c695393-6132-4469-8782-33ba9afa51c3-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-5hrzz\" (UID: \"0c695393-6132-4469-8782-33ba9afa51c3\") " pod="calico-system/goldmane-768f4c5c69-5hrzz" Aug 13 00:35:28.507053 kubelet[2755]: I0813 00:35:28.506420 2755 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/334452d9-3977-436e-9dcf-2778778479f8-config-volume\") pod \"coredns-674b8bbfcf-22k9t\" (UID: \"334452d9-3977-436e-9dcf-2778778479f8\") " pod="kube-system/coredns-674b8bbfcf-22k9t" Aug 13 00:35:28.507053 kubelet[2755]: I0813 00:35:28.506436 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-backend-key-pair\") pod \"whisker-fcb848d58-jjshp\" (UID: \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\") " pod="calico-system/whisker-fcb848d58-jjshp" Aug 13 00:35:28.507282 kubelet[2755]: I0813 00:35:28.506448 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c695393-6132-4469-8782-33ba9afa51c3-config\") pod \"goldmane-768f4c5c69-5hrzz\" (UID: \"0c695393-6132-4469-8782-33ba9afa51c3\") " pod="calico-system/goldmane-768f4c5c69-5hrzz" Aug 13 00:35:28.507282 kubelet[2755]: I0813 00:35:28.506460 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6c4854d0-1965-49ea-9f27-a943b10dac0f-calico-apiserver-certs\") pod \"calico-apiserver-8545d6cbb4-zrjtl\" (UID: \"6c4854d0-1965-49ea-9f27-a943b10dac0f\") " pod="calico-apiserver/calico-apiserver-8545d6cbb4-zrjtl" Aug 13 00:35:28.507282 kubelet[2755]: I0813 00:35:28.506505 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn8kk\" (UniqueName: \"kubernetes.io/projected/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-kube-api-access-jn8kk\") pod \"whisker-fcb848d58-jjshp\" (UID: \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\") " pod="calico-system/whisker-fcb848d58-jjshp" Aug 13 00:35:28.507282 kubelet[2755]: 
I0813 00:35:28.506519 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/0c695393-6132-4469-8782-33ba9afa51c3-goldmane-key-pair\") pod \"goldmane-768f4c5c69-5hrzz\" (UID: \"0c695393-6132-4469-8782-33ba9afa51c3\") " pod="calico-system/goldmane-768f4c5c69-5hrzz" Aug 13 00:35:28.507282 kubelet[2755]: I0813 00:35:28.506531 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsf7d\" (UniqueName: \"kubernetes.io/projected/0c695393-6132-4469-8782-33ba9afa51c3-kube-api-access-vsf7d\") pod \"goldmane-768f4c5c69-5hrzz\" (UID: \"0c695393-6132-4469-8782-33ba9afa51c3\") " pod="calico-system/goldmane-768f4c5c69-5hrzz" Aug 13 00:35:28.508055 kubelet[2755]: I0813 00:35:28.506545 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/69725ba8-05ad-4005-a951-7dc187af0ddc-calico-apiserver-certs\") pod \"calico-apiserver-8545d6cbb4-c8wdj\" (UID: \"69725ba8-05ad-4005-a951-7dc187af0ddc\") " pod="calico-apiserver/calico-apiserver-8545d6cbb4-c8wdj" Aug 13 00:35:28.508055 kubelet[2755]: I0813 00:35:28.506555 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjv25\" (UniqueName: \"kubernetes.io/projected/da2b92e7-ec49-43a2-b5c9-35d5000b7d8f-kube-api-access-bjv25\") pod \"coredns-674b8bbfcf-f7knb\" (UID: \"da2b92e7-ec49-43a2-b5c9-35d5000b7d8f\") " pod="kube-system/coredns-674b8bbfcf-f7knb" Aug 13 00:35:28.508055 kubelet[2755]: I0813 00:35:28.506567 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22pwb\" (UniqueName: \"kubernetes.io/projected/334452d9-3977-436e-9dcf-2778778479f8-kube-api-access-22pwb\") pod \"coredns-674b8bbfcf-22k9t\" (UID: 
\"334452d9-3977-436e-9dcf-2778778479f8\") " pod="kube-system/coredns-674b8bbfcf-22k9t" Aug 13 00:35:28.508055 kubelet[2755]: I0813 00:35:28.506578 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/64302c8f-d4eb-438d-8d05-03180a2fc1dd-tigera-ca-bundle\") pod \"calico-kube-controllers-7c5f9dd4c5-hhmqc\" (UID: \"64302c8f-d4eb-438d-8d05-03180a2fc1dd\") " pod="calico-system/calico-kube-controllers-7c5f9dd4c5-hhmqc" Aug 13 00:35:28.508055 kubelet[2755]: I0813 00:35:28.506589 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs42m\" (UniqueName: \"kubernetes.io/projected/69725ba8-05ad-4005-a951-7dc187af0ddc-kube-api-access-cs42m\") pod \"calico-apiserver-8545d6cbb4-c8wdj\" (UID: \"69725ba8-05ad-4005-a951-7dc187af0ddc\") " pod="calico-apiserver/calico-apiserver-8545d6cbb4-c8wdj" Aug 13 00:35:28.508451 kubelet[2755]: I0813 00:35:28.506601 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzkmn\" (UniqueName: \"kubernetes.io/projected/6c4854d0-1965-49ea-9f27-a943b10dac0f-kube-api-access-zzkmn\") pod \"calico-apiserver-8545d6cbb4-zrjtl\" (UID: \"6c4854d0-1965-49ea-9f27-a943b10dac0f\") " pod="calico-apiserver/calico-apiserver-8545d6cbb4-zrjtl" Aug 13 00:35:28.508451 kubelet[2755]: I0813 00:35:28.506614 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/da2b92e7-ec49-43a2-b5c9-35d5000b7d8f-config-volume\") pod \"coredns-674b8bbfcf-f7knb\" (UID: \"da2b92e7-ec49-43a2-b5c9-35d5000b7d8f\") " pod="kube-system/coredns-674b8bbfcf-f7knb" Aug 13 00:35:28.743516 containerd[1566]: time="2025-08-13T00:35:28.743145590Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-f7knb,Uid:da2b92e7-ec49-43a2-b5c9-35d5000b7d8f,Namespace:kube-system,Attempt:0,}" Aug 13 00:35:28.743928 containerd[1566]: time="2025-08-13T00:35:28.743885331Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22k9t,Uid:334452d9-3977-436e-9dcf-2778778479f8,Namespace:kube-system,Attempt:0,}" Aug 13 00:35:28.751812 containerd[1566]: time="2025-08-13T00:35:28.751640945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-zrjtl,Uid:6c4854d0-1965-49ea-9f27-a943b10dac0f,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:35:28.766649 containerd[1566]: time="2025-08-13T00:35:28.766582124Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c5f9dd4c5-hhmqc,Uid:64302c8f-d4eb-438d-8d05-03180a2fc1dd,Namespace:calico-system,Attempt:0,}" Aug 13 00:35:28.776183 containerd[1566]: time="2025-08-13T00:35:28.775899136Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-c8wdj,Uid:69725ba8-05ad-4005-a951-7dc187af0ddc,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:35:28.779486 containerd[1566]: time="2025-08-13T00:35:28.779249113Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5hrzz,Uid:0c695393-6132-4469-8782-33ba9afa51c3,Namespace:calico-system,Attempt:0,}" Aug 13 00:35:28.779840 containerd[1566]: time="2025-08-13T00:35:28.779677172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fcb848d58-jjshp,Uid:dc66db4b-26a1-4eb6-8d19-d3d299e80b8a,Namespace:calico-system,Attempt:0,}" Aug 13 00:35:28.844953 containerd[1566]: time="2025-08-13T00:35:28.844444345Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 13 00:35:28.990724 containerd[1566]: time="2025-08-13T00:35:28.990667180Z" level=error msg="Failed to destroy network for sandbox \"83b35df26257ac93c3ee0a74afaeff61d066c33ff7d0fe886218672ffe56081e\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:35:28.993569 systemd[1]: run-netns-cni\x2dcc892d7d\x2ddb3a\x2db1b6\x2d815f\x2dffd2423451de.mount: Deactivated successfully. Aug 13 00:35:29.007309 containerd[1566]: time="2025-08-13T00:35:28.996028229Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-fcb848d58-jjshp,Uid:dc66db4b-26a1-4eb6-8d19-d3d299e80b8a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"83b35df26257ac93c3ee0a74afaeff61d066c33ff7d0fe886218672ffe56081e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:35:29.007585 containerd[1566]: time="2025-08-13T00:35:29.003175934Z" level=error msg="Failed to destroy network for sandbox \"48c349a1ecb07953d77edaf0dd301bd373757bb17c89094b74685caaecab4ac3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:35:29.009708 kubelet[2755]: E0813 00:35:29.009654 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83b35df26257ac93c3ee0a74afaeff61d066c33ff7d0fe886218672ffe56081e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:35:29.009986 containerd[1566]: time="2025-08-13T00:35:29.006019321Z" level=error msg="Failed to destroy network for sandbox \"95234c02d042325a5f57a4cd1a2636be26700f5fab7d67cdd7e4f52dcb7ebcde\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:35:29.010102 kubelet[2755]: E0813 00:35:29.010084 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83b35df26257ac93c3ee0a74afaeff61d066c33ff7d0fe886218672ffe56081e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-fcb848d58-jjshp" Aug 13 00:35:29.010676 kubelet[2755]: E0813 00:35:29.010212 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83b35df26257ac93c3ee0a74afaeff61d066c33ff7d0fe886218672ffe56081e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-fcb848d58-jjshp" Aug 13 00:35:29.010469 systemd[1]: run-netns-cni\x2d236924ba\x2de39f\x2d7d83\x2de0e7\x2d04215c73b955.mount: Deactivated successfully. 
Aug 13 00:35:29.014796 kubelet[2755]: E0813 00:35:29.010893 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-fcb848d58-jjshp_calico-system(dc66db4b-26a1-4eb6-8d19-d3d299e80b8a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-fcb848d58-jjshp_calico-system(dc66db4b-26a1-4eb6-8d19-d3d299e80b8a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83b35df26257ac93c3ee0a74afaeff61d066c33ff7d0fe886218672ffe56081e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-fcb848d58-jjshp" podUID="dc66db4b-26a1-4eb6-8d19-d3d299e80b8a" Aug 13 00:35:29.015984 systemd[1]: run-netns-cni\x2d6cb19742\x2dc84f\x2db23f\x2dc76c\x2db7d4dd0eed69.mount: Deactivated successfully. Aug 13 00:35:29.022272 containerd[1566]: time="2025-08-13T00:35:29.022241171Z" level=error msg="Failed to destroy network for sandbox \"a57a9c0daa2825b2e0daea68e3a22a2fdb8a215865f398ca1eb813014354a7a1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 13 00:35:29.026498 systemd[1]: run-netns-cni\x2d00195569\x2dcbf3\x2d21d9\x2d4362\x2d11058a199602.mount: Deactivated successfully. 
Aug 13 00:35:29.027239 containerd[1566]: time="2025-08-13T00:35:29.026825896Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c5f9dd4c5-hhmqc,Uid:64302c8f-d4eb-438d-8d05-03180a2fc1dd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"95234c02d042325a5f57a4cd1a2636be26700f5fab7d67cdd7e4f52dcb7ebcde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.028939 kubelet[2755]: E0813 00:35:29.028259 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95234c02d042325a5f57a4cd1a2636be26700f5fab7d67cdd7e4f52dcb7ebcde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.028939 kubelet[2755]: E0813 00:35:29.028311 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95234c02d042325a5f57a4cd1a2636be26700f5fab7d67cdd7e4f52dcb7ebcde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c5f9dd4c5-hhmqc"
Aug 13 00:35:29.028939 kubelet[2755]: E0813 00:35:29.028349 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"95234c02d042325a5f57a4cd1a2636be26700f5fab7d67cdd7e4f52dcb7ebcde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c5f9dd4c5-hhmqc"
Aug 13 00:35:29.029034 kubelet[2755]: E0813 00:35:29.028393 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c5f9dd4c5-hhmqc_calico-system(64302c8f-d4eb-438d-8d05-03180a2fc1dd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c5f9dd4c5-hhmqc_calico-system(64302c8f-d4eb-438d-8d05-03180a2fc1dd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"95234c02d042325a5f57a4cd1a2636be26700f5fab7d67cdd7e4f52dcb7ebcde\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c5f9dd4c5-hhmqc" podUID="64302c8f-d4eb-438d-8d05-03180a2fc1dd"
Aug 13 00:35:29.029327 containerd[1566]: time="2025-08-13T00:35:29.029187926Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-zrjtl,Uid:6c4854d0-1965-49ea-9f27-a943b10dac0f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c349a1ecb07953d77edaf0dd301bd373757bb17c89094b74685caaecab4ac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.029454 kubelet[2755]: E0813 00:35:29.029432 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c349a1ecb07953d77edaf0dd301bd373757bb17c89094b74685caaecab4ac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.029483 kubelet[2755]: E0813 00:35:29.029466 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c349a1ecb07953d77edaf0dd301bd373757bb17c89094b74685caaecab4ac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8545d6cbb4-zrjtl"
Aug 13 00:35:29.029529 kubelet[2755]: E0813 00:35:29.029484 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"48c349a1ecb07953d77edaf0dd301bd373757bb17c89094b74685caaecab4ac3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8545d6cbb4-zrjtl"
Aug 13 00:35:29.030374 kubelet[2755]: E0813 00:35:29.029568 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8545d6cbb4-zrjtl_calico-apiserver(6c4854d0-1965-49ea-9f27-a943b10dac0f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8545d6cbb4-zrjtl_calico-apiserver(6c4854d0-1965-49ea-9f27-a943b10dac0f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"48c349a1ecb07953d77edaf0dd301bd373757bb17c89094b74685caaecab4ac3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8545d6cbb4-zrjtl" podUID="6c4854d0-1965-49ea-9f27-a943b10dac0f"
Aug 13 00:35:29.030548 containerd[1566]: time="2025-08-13T00:35:29.030519679Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-f7knb,Uid:da2b92e7-ec49-43a2-b5c9-35d5000b7d8f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a57a9c0daa2825b2e0daea68e3a22a2fdb8a215865f398ca1eb813014354a7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.030719 kubelet[2755]: E0813 00:35:29.030673 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a57a9c0daa2825b2e0daea68e3a22a2fdb8a215865f398ca1eb813014354a7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.030781 kubelet[2755]: E0813 00:35:29.030733 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a57a9c0daa2825b2e0daea68e3a22a2fdb8a215865f398ca1eb813014354a7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-f7knb"
Aug 13 00:35:29.030781 kubelet[2755]: E0813 00:35:29.030746 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a57a9c0daa2825b2e0daea68e3a22a2fdb8a215865f398ca1eb813014354a7a1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-f7knb"
Aug 13 00:35:29.030826 kubelet[2755]: E0813 00:35:29.030791 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-f7knb_kube-system(da2b92e7-ec49-43a2-b5c9-35d5000b7d8f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-f7knb_kube-system(da2b92e7-ec49-43a2-b5c9-35d5000b7d8f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a57a9c0daa2825b2e0daea68e3a22a2fdb8a215865f398ca1eb813014354a7a1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-f7knb" podUID="da2b92e7-ec49-43a2-b5c9-35d5000b7d8f"
Aug 13 00:35:29.046802 containerd[1566]: time="2025-08-13T00:35:29.046748494Z" level=error msg="Failed to destroy network for sandbox \"2286567ee700804cbaa70d5f2ed9b5d9d08fda88260ca5a544c4bb7d551e7ab0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.047574 containerd[1566]: time="2025-08-13T00:35:29.047535517Z" level=error msg="Failed to destroy network for sandbox \"5b7d98a1eb666d1221e6d3a6b805f0c721dc518d3542fb9ea3f077f65c2904f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.048674 containerd[1566]: time="2025-08-13T00:35:29.048651886Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-c8wdj,Uid:69725ba8-05ad-4005-a951-7dc187af0ddc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2286567ee700804cbaa70d5f2ed9b5d9d08fda88260ca5a544c4bb7d551e7ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.048989 containerd[1566]: time="2025-08-13T00:35:29.048937699Z" level=error msg="Failed to destroy network for sandbox \"2464937cf4e45302ee0fcbced312a7b01e431e61041787a827cdd39d7ed3f921\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.049255 kubelet[2755]: E0813 00:35:29.049203 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2286567ee700804cbaa70d5f2ed9b5d9d08fda88260ca5a544c4bb7d551e7ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.049316 kubelet[2755]: E0813 00:35:29.049256 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2286567ee700804cbaa70d5f2ed9b5d9d08fda88260ca5a544c4bb7d551e7ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8545d6cbb4-c8wdj"
Aug 13 00:35:29.049316 kubelet[2755]: E0813 00:35:29.049276 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2286567ee700804cbaa70d5f2ed9b5d9d08fda88260ca5a544c4bb7d551e7ab0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8545d6cbb4-c8wdj"
Aug 13 00:35:29.049446 kubelet[2755]: E0813 00:35:29.049321 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8545d6cbb4-c8wdj_calico-apiserver(69725ba8-05ad-4005-a951-7dc187af0ddc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8545d6cbb4-c8wdj_calico-apiserver(69725ba8-05ad-4005-a951-7dc187af0ddc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2286567ee700804cbaa70d5f2ed9b5d9d08fda88260ca5a544c4bb7d551e7ab0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8545d6cbb4-c8wdj" podUID="69725ba8-05ad-4005-a951-7dc187af0ddc"
Aug 13 00:35:29.049876 containerd[1566]: time="2025-08-13T00:35:29.049790090Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5hrzz,Uid:0c695393-6132-4469-8782-33ba9afa51c3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7d98a1eb666d1221e6d3a6b805f0c721dc518d3542fb9ea3f077f65c2904f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.050166 kubelet[2755]: E0813 00:35:29.050113 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7d98a1eb666d1221e6d3a6b805f0c721dc518d3542fb9ea3f077f65c2904f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.050207 kubelet[2755]: E0813 00:35:29.050176 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7d98a1eb666d1221e6d3a6b805f0c721dc518d3542fb9ea3f077f65c2904f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-5hrzz"
Aug 13 00:35:29.050227 kubelet[2755]: E0813 00:35:29.050203 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7d98a1eb666d1221e6d3a6b805f0c721dc518d3542fb9ea3f077f65c2904f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-5hrzz"
Aug 13 00:35:29.050298 kubelet[2755]: E0813 00:35:29.050269 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-5hrzz_calico-system(0c695393-6132-4469-8782-33ba9afa51c3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-5hrzz_calico-system(0c695393-6132-4469-8782-33ba9afa51c3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b7d98a1eb666d1221e6d3a6b805f0c721dc518d3542fb9ea3f077f65c2904f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-5hrzz" podUID="0c695393-6132-4469-8782-33ba9afa51c3"
Aug 13 00:35:29.050866 containerd[1566]: time="2025-08-13T00:35:29.050808631Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22k9t,Uid:334452d9-3977-436e-9dcf-2778778479f8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2464937cf4e45302ee0fcbced312a7b01e431e61041787a827cdd39d7ed3f921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.051231 kubelet[2755]: E0813 00:35:29.050974 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2464937cf4e45302ee0fcbced312a7b01e431e61041787a827cdd39d7ed3f921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.051231 kubelet[2755]: E0813 00:35:29.051006 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2464937cf4e45302ee0fcbced312a7b01e431e61041787a827cdd39d7ed3f921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-22k9t"
Aug 13 00:35:29.051231 kubelet[2755]: E0813 00:35:29.051020 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2464937cf4e45302ee0fcbced312a7b01e431e61041787a827cdd39d7ed3f921\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-22k9t"
Aug 13 00:35:29.051326 kubelet[2755]: E0813 00:35:29.051052 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-22k9t_kube-system(334452d9-3977-436e-9dcf-2778778479f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-22k9t_kube-system(334452d9-3977-436e-9dcf-2778778479f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2464937cf4e45302ee0fcbced312a7b01e431e61041787a827cdd39d7ed3f921\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-22k9t" podUID="334452d9-3977-436e-9dcf-2778778479f8"
Aug 13 00:35:29.735639 systemd[1]: Created slice kubepods-besteffort-pod7f11d4ba_b27a_4a80_a109_96b122ee11da.slice - libcontainer container kubepods-besteffort-pod7f11d4ba_b27a_4a80_a109_96b122ee11da.slice.
Aug 13 00:35:29.740305 containerd[1566]: time="2025-08-13T00:35:29.740258737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-m42j7,Uid:7f11d4ba-b27a-4a80-a109-96b122ee11da,Namespace:calico-system,Attempt:0,}"
Aug 13 00:35:29.786736 containerd[1566]: time="2025-08-13T00:35:29.786669263Z" level=error msg="Failed to destroy network for sandbox \"28a51989c109c839ca38380575f3de5803587941401c65ccf497201f35f2c84f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.788399 containerd[1566]: time="2025-08-13T00:35:29.788342318Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-m42j7,Uid:7f11d4ba-b27a-4a80-a109-96b122ee11da,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"28a51989c109c839ca38380575f3de5803587941401c65ccf497201f35f2c84f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.788633 kubelet[2755]: E0813 00:35:29.788574 2755 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28a51989c109c839ca38380575f3de5803587941401c65ccf497201f35f2c84f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 13 00:35:29.788718 kubelet[2755]: E0813 00:35:29.788632 2755 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28a51989c109c839ca38380575f3de5803587941401c65ccf497201f35f2c84f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-m42j7"
Aug 13 00:35:29.788718 kubelet[2755]: E0813 00:35:29.788653 2755 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"28a51989c109c839ca38380575f3de5803587941401c65ccf497201f35f2c84f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-m42j7"
Aug 13 00:35:29.788793 kubelet[2755]: E0813 00:35:29.788737 2755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-m42j7_calico-system(7f11d4ba-b27a-4a80-a109-96b122ee11da)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-m42j7_calico-system(7f11d4ba-b27a-4a80-a109-96b122ee11da)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"28a51989c109c839ca38380575f3de5803587941401c65ccf497201f35f2c84f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-m42j7" podUID="7f11d4ba-b27a-4a80-a109-96b122ee11da"
Aug 13 00:35:29.855666 systemd[1]: run-netns-cni\x2daafc6355\x2dc3c0\x2d8daa\x2d1cca\x2ddf5b4e7aa10b.mount: Deactivated successfully.
Aug 13 00:35:29.855892 systemd[1]: run-netns-cni\x2d5117f519\x2dafe3\x2dab9b\x2db583\x2d3802c0139635.mount: Deactivated successfully.
Aug 13 00:35:29.855985 systemd[1]: run-netns-cni\x2d44a7d185\x2d5f9a\x2d2a7c\x2d7bd6\x2d3b69f83ba711.mount: Deactivated successfully.
Aug 13 00:35:36.441614 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3584285411.mount: Deactivated successfully.
Aug 13 00:35:36.513198 containerd[1566]: time="2025-08-13T00:35:36.505753370Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:36.515205 containerd[1566]: time="2025-08-13T00:35:36.515070209Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163"
Aug 13 00:35:36.523867 containerd[1566]: time="2025-08-13T00:35:36.523808343Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:36.528142 containerd[1566]: time="2025-08-13T00:35:36.528087580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:36.528780 containerd[1566]: time="2025-08-13T00:35:36.528579789Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 7.684075259s"
Aug 13 00:35:36.528780 containerd[1566]: time="2025-08-13T00:35:36.528616795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\""
Aug 13 00:35:36.569437 containerd[1566]: time="2025-08-13T00:35:36.569403093Z" level=info msg="CreateContainer within sandbox \"9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Aug 13 00:35:36.597952 containerd[1566]: time="2025-08-13T00:35:36.597811751Z" level=info msg="Container 63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:36.598177 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount487462145.mount: Deactivated successfully.
Aug 13 00:35:36.615502 containerd[1566]: time="2025-08-13T00:35:36.615430691Z" level=info msg="CreateContainer within sandbox \"9e968a4c765af5e99c71b6f0a373398443560fe5e423bfca582a4b7ab03aec12\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\""
Aug 13 00:35:36.616364 containerd[1566]: time="2025-08-13T00:35:36.616090021Z" level=info msg="StartContainer for \"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\""
Aug 13 00:35:36.620461 containerd[1566]: time="2025-08-13T00:35:36.620429011Z" level=info msg="connecting to shim 63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565" address="unix:///run/containerd/s/32abafaa6178dd3e24e3879c7bc79b30bea7fd451bedc77db56c0fc189f76ef2" protocol=ttrpc version=3
Aug 13 00:35:36.710860 systemd[1]: Started cri-containerd-63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565.scope - libcontainer container 63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565.
Aug 13 00:35:36.753485 containerd[1566]: time="2025-08-13T00:35:36.753445462Z" level=info msg="StartContainer for \"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\" returns successfully"
Aug 13 00:35:36.874834 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Aug 13 00:35:36.878261 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Aug 13 00:35:36.908679 kubelet[2755]: I0813 00:35:36.907078 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5jpt2" podStartSLOduration=1.738743997 podStartE2EDuration="19.907064225s" podCreationTimestamp="2025-08-13 00:35:17 +0000 UTC" firstStartedPulling="2025-08-13 00:35:18.363689969 +0000 UTC m=+18.741276826" lastFinishedPulling="2025-08-13 00:35:36.532010197 +0000 UTC m=+36.909597054" observedRunningTime="2025-08-13 00:35:36.905591868 +0000 UTC m=+37.283178726" watchObservedRunningTime="2025-08-13 00:35:36.907064225 +0000 UTC m=+37.284651082"
Aug 13 00:35:37.180152 kubelet[2755]: I0813 00:35:37.180114 2755 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn8kk\" (UniqueName: \"kubernetes.io/projected/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-kube-api-access-jn8kk\") pod \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\" (UID: \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\") "
Aug 13 00:35:37.180354 kubelet[2755]: I0813 00:35:37.180166 2755 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-backend-key-pair\") pod \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\" (UID: \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\") "
Aug 13 00:35:37.180354 kubelet[2755]: I0813 00:35:37.180196 2755 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-ca-bundle\") pod \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\" (UID: \"dc66db4b-26a1-4eb6-8d19-d3d299e80b8a\") "
Aug 13 00:35:37.180535 kubelet[2755]: I0813 00:35:37.180503 2755 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "dc66db4b-26a1-4eb6-8d19-d3d299e80b8a" (UID: "dc66db4b-26a1-4eb6-8d19-d3d299e80b8a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Aug 13 00:35:37.199892 kubelet[2755]: I0813 00:35:37.199821 2755 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-kube-api-access-jn8kk" (OuterVolumeSpecName: "kube-api-access-jn8kk") pod "dc66db4b-26a1-4eb6-8d19-d3d299e80b8a" (UID: "dc66db4b-26a1-4eb6-8d19-d3d299e80b8a"). InnerVolumeSpecName "kube-api-access-jn8kk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Aug 13 00:35:37.200211 kubelet[2755]: I0813 00:35:37.199981 2755 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "dc66db4b-26a1-4eb6-8d19-d3d299e80b8a" (UID: "dc66db4b-26a1-4eb6-8d19-d3d299e80b8a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Aug 13 00:35:37.281064 kubelet[2755]: I0813 00:35:37.281026 2755 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-ca-bundle\") on node \"ci-4372-1-0-b-5ba4a9a74b\" DevicePath \"\""
Aug 13 00:35:37.281064 kubelet[2755]: I0813 00:35:37.281053 2755 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jn8kk\" (UniqueName: \"kubernetes.io/projected/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-kube-api-access-jn8kk\") on node \"ci-4372-1-0-b-5ba4a9a74b\" DevicePath \"\""
Aug 13 00:35:37.281064 kubelet[2755]: I0813 00:35:37.281062 2755 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a-whisker-backend-key-pair\") on node \"ci-4372-1-0-b-5ba4a9a74b\" DevicePath \"\""
Aug 13 00:35:37.442926 systemd[1]: var-lib-kubelet-pods-dc66db4b\x2d26a1\x2d4eb6\x2d8d19\x2dd3d299e80b8a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2djn8kk.mount: Deactivated successfully.
Aug 13 00:35:37.443043 systemd[1]: var-lib-kubelet-pods-dc66db4b\x2d26a1\x2d4eb6\x2d8d19\x2dd3d299e80b8a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully.
Aug 13 00:35:37.737579 systemd[1]: Removed slice kubepods-besteffort-poddc66db4b_26a1_4eb6_8d19_d3d299e80b8a.slice - libcontainer container kubepods-besteffort-poddc66db4b_26a1_4eb6_8d19_d3d299e80b8a.slice.
Aug 13 00:35:37.877596 kubelet[2755]: I0813 00:35:37.877562 2755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 13 00:35:37.956113 systemd[1]: Created slice kubepods-besteffort-podd1edd816_38eb_4378_8a79_6372d806a5b9.slice - libcontainer container kubepods-besteffort-podd1edd816_38eb_4378_8a79_6372d806a5b9.slice.
Aug 13 00:35:37.985266 kubelet[2755]: I0813 00:35:37.985214 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1edd816-38eb-4378-8a79-6372d806a5b9-whisker-ca-bundle\") pod \"whisker-7bc44c9f66-42gjz\" (UID: \"d1edd816-38eb-4378-8a79-6372d806a5b9\") " pod="calico-system/whisker-7bc44c9f66-42gjz"
Aug 13 00:35:37.985266 kubelet[2755]: I0813 00:35:37.985273 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d1edd816-38eb-4378-8a79-6372d806a5b9-whisker-backend-key-pair\") pod \"whisker-7bc44c9f66-42gjz\" (UID: \"d1edd816-38eb-4378-8a79-6372d806a5b9\") " pod="calico-system/whisker-7bc44c9f66-42gjz"
Aug 13 00:35:37.985628 kubelet[2755]: I0813 00:35:37.985304 2755 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqv4d\" (UniqueName: \"kubernetes.io/projected/d1edd816-38eb-4378-8a79-6372d806a5b9-kube-api-access-vqv4d\") pod \"whisker-7bc44c9f66-42gjz\" (UID: \"d1edd816-38eb-4378-8a79-6372d806a5b9\") " pod="calico-system/whisker-7bc44c9f66-42gjz"
Aug 13 00:35:38.269629 containerd[1566]: time="2025-08-13T00:35:38.269567664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bc44c9f66-42gjz,Uid:d1edd816-38eb-4378-8a79-6372d806a5b9,Namespace:calico-system,Attempt:0,}"
Aug 13 00:35:38.619233 systemd-networkd[1481]: cali90d38fca8a1: Link UP
Aug 13 00:35:38.620004 systemd-networkd[1481]: cali90d38fca8a1: Gained carrier
Aug 13 00:35:38.654381 containerd[1566]: 2025-08-13 00:35:38.355 [INFO][3896] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Aug 13 00:35:38.654381 containerd[1566]: 2025-08-13 00:35:38.385 [INFO][3896] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0 whisker-7bc44c9f66- calico-system d1edd816-38eb-4378-8a79-6372d806a5b9 914 0 2025-08-13 00:35:37 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7bc44c9f66 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b whisker-7bc44c9f66-42gjz eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali90d38fca8a1 [] [] }} ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-"
Aug 13 00:35:38.654381 containerd[1566]: 2025-08-13 00:35:38.385 [INFO][3896] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0"
Aug 13 00:35:38.654381 containerd[1566]: 2025-08-13 00:35:38.539 [INFO][3953] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" HandleID="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0"
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.542 [INFO][3953] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" HandleID="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00034a1a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-b-5ba4a9a74b", "pod":"whisker-7bc44c9f66-42gjz", "timestamp":"2025-08-13 00:35:38.539137302 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.542 [INFO][3953] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.543 [INFO][3953] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.544 [INFO][3953] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b'
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.563 [INFO][3953] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.572 [INFO][3953] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.580 [INFO][3953] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.583 [INFO][3953] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.655429 containerd[1566]: 2025-08-13 00:35:38.587 [INFO][3953] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.657533 containerd[1566]: 2025-08-13 00:35:38.587 [INFO][3953] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.657533 containerd[1566]: 2025-08-13 00:35:38.589 [INFO][3953] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83
Aug 13 00:35:38.657533 containerd[1566]: 2025-08-13 00:35:38.595 [INFO][3953] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.657533 containerd[1566]: 2025-08-13 00:35:38.603 [INFO][3953] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.65/26] block=192.168.46.64/26 handle="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.657533 containerd[1566]: 2025-08-13 00:35:38.603 [INFO][3953] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.65/26] handle="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" host="ci-4372-1-0-b-5ba4a9a74b"
Aug 13 00:35:38.657533 containerd[1566]: 2025-08-13 00:35:38.603 [INFO][3953] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Aug 13 00:35:38.657533 containerd[1566]: 2025-08-13 00:35:38.603 [INFO][3953] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.65/26] IPv6=[] ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" HandleID="k8s-pod-network.2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0"
Aug 13 00:35:38.657644 containerd[1566]: 2025-08-13 00:35:38.606 [INFO][3896] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0", GenerateName:"whisker-7bc44c9f66-", Namespace:"calico-system", SelfLink:"", UID:"d1edd816-38eb-4378-8a79-6372d806a5b9", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bc44c9f66", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"whisker-7bc44c9f66-42gjz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.46.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system",
"ksa.calico-system.whisker"}, InterfaceName:"cali90d38fca8a1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:38.657644 containerd[1566]: 2025-08-13 00:35:38.606 [INFO][3896] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.65/32] ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0" Aug 13 00:35:38.658049 containerd[1566]: 2025-08-13 00:35:38.606 [INFO][3896] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali90d38fca8a1 ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0" Aug 13 00:35:38.658049 containerd[1566]: 2025-08-13 00:35:38.625 [INFO][3896] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0" Aug 13 00:35:38.658736 containerd[1566]: 2025-08-13 00:35:38.629 [INFO][3896] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0", GenerateName:"whisker-7bc44c9f66-", Namespace:"calico-system", SelfLink:"", 
UID:"d1edd816-38eb-4378-8a79-6372d806a5b9", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7bc44c9f66", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83", Pod:"whisker-7bc44c9f66-42gjz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.46.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali90d38fca8a1", MAC:"86:a8:41:62:d1:1c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:38.658788 containerd[1566]: 2025-08-13 00:35:38.650 [INFO][3896] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" Namespace="calico-system" Pod="whisker-7bc44c9f66-42gjz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-whisker--7bc44c9f66--42gjz-eth0" Aug 13 00:35:38.811523 containerd[1566]: time="2025-08-13T00:35:38.811480486Z" level=info msg="connecting to shim 2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83" address="unix:///run/containerd/s/9bc8072a0d356a1e124ff4b62542b33b80667d9c3e5a92d7fe6e92859e259da1" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:38.853836 systemd[1]: Started 
cri-containerd-2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83.scope - libcontainer container 2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83. Aug 13 00:35:38.908282 containerd[1566]: time="2025-08-13T00:35:38.908196895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bc44c9f66-42gjz,Uid:d1edd816-38eb-4378-8a79-6372d806a5b9,Namespace:calico-system,Attempt:0,} returns sandbox id \"2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83\"" Aug 13 00:35:38.915162 containerd[1566]: time="2025-08-13T00:35:38.915133042Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 13 00:35:38.955626 systemd-networkd[1481]: vxlan.calico: Link UP Aug 13 00:35:38.955639 systemd-networkd[1481]: vxlan.calico: Gained carrier Aug 13 00:35:39.730724 containerd[1566]: time="2025-08-13T00:35:39.730424417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-c8wdj,Uid:69725ba8-05ad-4005-a951-7dc187af0ddc,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:35:39.745379 kubelet[2755]: I0813 00:35:39.745197 2755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc66db4b-26a1-4eb6-8d19-d3d299e80b8a" path="/var/lib/kubelet/pods/dc66db4b-26a1-4eb6-8d19-d3d299e80b8a/volumes" Aug 13 00:35:39.814795 kubelet[2755]: I0813 00:35:39.814767 2755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 13 00:35:39.848962 systemd-networkd[1481]: calid17b3b59303: Link UP Aug 13 00:35:39.850668 systemd-networkd[1481]: calid17b3b59303: Gained carrier Aug 13 00:35:39.871239 containerd[1566]: 2025-08-13 00:35:39.775 [INFO][4140] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0 calico-apiserver-8545d6cbb4- calico-apiserver 69725ba8-05ad-4005-a951-7dc187af0ddc 843 0 2025-08-13 00:35:15 +0000 UTC map[apiserver:true 
app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8545d6cbb4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b calico-apiserver-8545d6cbb4-c8wdj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid17b3b59303 [] [] }} ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-" Aug 13 00:35:39.871239 containerd[1566]: 2025-08-13 00:35:39.776 [INFO][4140] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" Aug 13 00:35:39.871239 containerd[1566]: 2025-08-13 00:35:39.800 [INFO][4152] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" HandleID="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.800 [INFO][4152] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" HandleID="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f790), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-b-5ba4a9a74b", 
"pod":"calico-apiserver-8545d6cbb4-c8wdj", "timestamp":"2025-08-13 00:35:39.800201184 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.800 [INFO][4152] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.800 [INFO][4152] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.800 [INFO][4152] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b' Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.809 [INFO][4152] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.814 [INFO][4152] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.819 [INFO][4152] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.822 [INFO][4152] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.872080 containerd[1566]: 2025-08-13 00:35:39.824 [INFO][4152] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.873916 containerd[1566]: 2025-08-13 00:35:39.824 [INFO][4152] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" 
host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.873916 containerd[1566]: 2025-08-13 00:35:39.826 [INFO][4152] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6 Aug 13 00:35:39.873916 containerd[1566]: 2025-08-13 00:35:39.831 [INFO][4152] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.873916 containerd[1566]: 2025-08-13 00:35:39.837 [INFO][4152] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.66/26] block=192.168.46.64/26 handle="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.873916 containerd[1566]: 2025-08-13 00:35:39.838 [INFO][4152] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.66/26] handle="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:39.873916 containerd[1566]: 2025-08-13 00:35:39.838 [INFO][4152] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 13 00:35:39.873916 containerd[1566]: 2025-08-13 00:35:39.838 [INFO][4152] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.66/26] IPv6=[] ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" HandleID="k8s-pod-network.38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" Aug 13 00:35:39.874418 containerd[1566]: 2025-08-13 00:35:39.843 [INFO][4140] cni-plugin/k8s.go 418: Populated endpoint ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0", GenerateName:"calico-apiserver-8545d6cbb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"69725ba8-05ad-4005-a951-7dc187af0ddc", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8545d6cbb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"calico-apiserver-8545d6cbb4-c8wdj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.46.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid17b3b59303", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:39.874470 containerd[1566]: 2025-08-13 00:35:39.843 [INFO][4140] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.66/32] ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" Aug 13 00:35:39.874470 containerd[1566]: 2025-08-13 00:35:39.843 [INFO][4140] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid17b3b59303 ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" Aug 13 00:35:39.874470 containerd[1566]: 2025-08-13 00:35:39.850 [INFO][4140] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" Aug 13 00:35:39.874522 containerd[1566]: 2025-08-13 00:35:39.851 [INFO][4140] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0", GenerateName:"calico-apiserver-8545d6cbb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"69725ba8-05ad-4005-a951-7dc187af0ddc", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8545d6cbb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6", Pod:"calico-apiserver-8545d6cbb4-c8wdj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.46.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid17b3b59303", MAC:"be:ba:cb:5e:c5:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:39.874565 containerd[1566]: 2025-08-13 00:35:39.862 [INFO][4140] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-c8wdj" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--c8wdj-eth0" Aug 13 00:35:39.901474 containerd[1566]: time="2025-08-13T00:35:39.901442183Z" 
level=info msg="connecting to shim 38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6" address="unix:///run/containerd/s/ecde4c1ac489505f8b76f3027ffd33e7b35ee9447f92aa2897bb4ad1f3366b4a" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:39.926845 systemd[1]: Started cri-containerd-38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6.scope - libcontainer container 38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6. Aug 13 00:35:39.985638 containerd[1566]: time="2025-08-13T00:35:39.985018375Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\" id:\"c5472a352580438f36aae9ac8db34d00b2558304ade4147e4e787f2840c06860\" pid:4180 exit_status:1 exited_at:{seconds:1755045339 nanos:978229708}" Aug 13 00:35:39.987183 systemd-networkd[1481]: cali90d38fca8a1: Gained IPv6LL Aug 13 00:35:39.990206 containerd[1566]: time="2025-08-13T00:35:39.990178626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-c8wdj,Uid:69725ba8-05ad-4005-a951-7dc187af0ddc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6\"" Aug 13 00:35:40.051970 containerd[1566]: time="2025-08-13T00:35:40.051915484Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\" id:\"6aa4d38dd11904a9293a219158c162eb00d8e4aa1df19decfde71dd616a7f5a6\" pid:4250 exit_status:1 exited_at:{seconds:1755045340 nanos:51617982}" Aug 13 00:35:40.434940 systemd-networkd[1481]: vxlan.calico: Gained IPv6LL Aug 13 00:35:40.730616 containerd[1566]: time="2025-08-13T00:35:40.730516042Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22k9t,Uid:334452d9-3977-436e-9dcf-2778778479f8,Namespace:kube-system,Attempt:0,}" Aug 13 00:35:40.756063 containerd[1566]: time="2025-08-13T00:35:40.755993657Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:40.775466 containerd[1566]: time="2025-08-13T00:35:40.775379816Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Aug 13 00:35:40.778175 containerd[1566]: time="2025-08-13T00:35:40.778147917Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:40.781714 containerd[1566]: time="2025-08-13T00:35:40.781552778Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:40.785442 containerd[1566]: time="2025-08-13T00:35:40.785408219Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.870246036s" Aug 13 00:35:40.785442 containerd[1566]: time="2025-08-13T00:35:40.785438289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Aug 13 00:35:40.786736 containerd[1566]: time="2025-08-13T00:35:40.786692648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 13 00:35:40.795520 containerd[1566]: time="2025-08-13T00:35:40.795460841Z" level=info msg="CreateContainer within sandbox \"2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 13 00:35:40.806888 
containerd[1566]: time="2025-08-13T00:35:40.804789828Z" level=info msg="Container b442e369fd16cad4bc88780e0be7fa6b44d4419bf7a169aebfbd64471c693002: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:40.808199 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1821113447.mount: Deactivated successfully. Aug 13 00:35:40.815858 containerd[1566]: time="2025-08-13T00:35:40.815809317Z" level=info msg="CreateContainer within sandbox \"2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"b442e369fd16cad4bc88780e0be7fa6b44d4419bf7a169aebfbd64471c693002\"" Aug 13 00:35:40.816832 containerd[1566]: time="2025-08-13T00:35:40.816813559Z" level=info msg="StartContainer for \"b442e369fd16cad4bc88780e0be7fa6b44d4419bf7a169aebfbd64471c693002\"" Aug 13 00:35:40.818243 containerd[1566]: time="2025-08-13T00:35:40.818116486Z" level=info msg="connecting to shim b442e369fd16cad4bc88780e0be7fa6b44d4419bf7a169aebfbd64471c693002" address="unix:///run/containerd/s/9bc8072a0d356a1e124ff4b62542b33b80667d9c3e5a92d7fe6e92859e259da1" protocol=ttrpc version=3 Aug 13 00:35:40.850049 systemd[1]: Started cri-containerd-b442e369fd16cad4bc88780e0be7fa6b44d4419bf7a169aebfbd64471c693002.scope - libcontainer container b442e369fd16cad4bc88780e0be7fa6b44d4419bf7a169aebfbd64471c693002. 
Aug 13 00:35:40.863552 systemd-networkd[1481]: calidb906919b1f: Link UP Aug 13 00:35:40.864992 systemd-networkd[1481]: calidb906919b1f: Gained carrier Aug 13 00:35:40.880412 containerd[1566]: 2025-08-13 00:35:40.779 [INFO][4266] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0 coredns-674b8bbfcf- kube-system 334452d9-3977-436e-9dcf-2778778479f8 839 0 2025-08-13 00:35:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b coredns-674b8bbfcf-22k9t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calidb906919b1f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-" Aug 13 00:35:40.880412 containerd[1566]: 2025-08-13 00:35:40.779 [INFO][4266] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" Aug 13 00:35:40.880412 containerd[1566]: 2025-08-13 00:35:40.813 [INFO][4278] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" HandleID="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.813 [INFO][4278] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" HandleID="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5860), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-b-5ba4a9a74b", "pod":"coredns-674b8bbfcf-22k9t", "timestamp":"2025-08-13 00:35:40.813505235 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.813 [INFO][4278] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.813 [INFO][4278] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.813 [INFO][4278] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b' Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.822 [INFO][4278] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.829 [INFO][4278] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.835 [INFO][4278] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.837 [INFO][4278] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881140 containerd[1566]: 2025-08-13 00:35:40.840 [INFO][4278] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881323 containerd[1566]: 2025-08-13 00:35:40.840 [INFO][4278] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881323 containerd[1566]: 2025-08-13 00:35:40.843 [INFO][4278] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3 Aug 13 00:35:40.881323 containerd[1566]: 2025-08-13 00:35:40.850 [INFO][4278] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881323 containerd[1566]: 2025-08-13 00:35:40.855 [INFO][4278] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.46.67/26] block=192.168.46.64/26 handle="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881323 containerd[1566]: 2025-08-13 00:35:40.856 [INFO][4278] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.67/26] handle="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:40.881323 containerd[1566]: 2025-08-13 00:35:40.856 [INFO][4278] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:35:40.881323 containerd[1566]: 2025-08-13 00:35:40.856 [INFO][4278] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.67/26] IPv6=[] ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" HandleID="k8s-pod-network.11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" Aug 13 00:35:40.881887 containerd[1566]: 2025-08-13 00:35:40.859 [INFO][4266] cni-plugin/k8s.go 418: Populated endpoint ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"334452d9-3977-436e-9dcf-2778778479f8", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"coredns-674b8bbfcf-22k9t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidb906919b1f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:40.881887 containerd[1566]: 2025-08-13 00:35:40.860 [INFO][4266] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.67/32] ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" Aug 13 00:35:40.881887 containerd[1566]: 2025-08-13 00:35:40.860 [INFO][4266] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb906919b1f ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" Aug 13 00:35:40.881887 containerd[1566]: 2025-08-13 00:35:40.867 [INFO][4266] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" Aug 13 00:35:40.881887 containerd[1566]: 2025-08-13 00:35:40.870 [INFO][4266] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"334452d9-3977-436e-9dcf-2778778479f8", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3", Pod:"coredns-674b8bbfcf-22k9t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calidb906919b1f", 
MAC:"4a:81:dd:6b:ea:df", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:40.881887 containerd[1566]: 2025-08-13 00:35:40.877 [INFO][4266] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" Namespace="kube-system" Pod="coredns-674b8bbfcf-22k9t" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--22k9t-eth0" Aug 13 00:35:40.908511 containerd[1566]: time="2025-08-13T00:35:40.906859753Z" level=info msg="connecting to shim 11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3" address="unix:///run/containerd/s/f0a1048751b879232aebe6622ddcac6b44b3159634f172f90296f698df196ec1" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:40.930727 containerd[1566]: time="2025-08-13T00:35:40.930669664Z" level=info msg="StartContainer for \"b442e369fd16cad4bc88780e0be7fa6b44d4419bf7a169aebfbd64471c693002\" returns successfully" Aug 13 00:35:40.940957 systemd[1]: Started cri-containerd-11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3.scope - libcontainer container 11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3. 
Aug 13 00:35:40.995157 containerd[1566]: time="2025-08-13T00:35:40.995067386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-22k9t,Uid:334452d9-3977-436e-9dcf-2778778479f8,Namespace:kube-system,Attempt:0,} returns sandbox id \"11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3\"" Aug 13 00:35:41.000194 containerd[1566]: time="2025-08-13T00:35:41.000178198Z" level=info msg="CreateContainer within sandbox \"11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:35:41.017549 containerd[1566]: time="2025-08-13T00:35:41.017131471Z" level=info msg="Container 18879de7fe395b78c4a47018e4c243d64c809053274245f8749cf5bc54b6a752: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:41.021280 containerd[1566]: time="2025-08-13T00:35:41.021252729Z" level=info msg="CreateContainer within sandbox \"11168de5ee0e28864a95d7af56f979a2e0767fb0d0d0b01071255fe342f7b8a3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"18879de7fe395b78c4a47018e4c243d64c809053274245f8749cf5bc54b6a752\"" Aug 13 00:35:41.021836 containerd[1566]: time="2025-08-13T00:35:41.021805655Z" level=info msg="StartContainer for \"18879de7fe395b78c4a47018e4c243d64c809053274245f8749cf5bc54b6a752\"" Aug 13 00:35:41.022372 containerd[1566]: time="2025-08-13T00:35:41.022346756Z" level=info msg="connecting to shim 18879de7fe395b78c4a47018e4c243d64c809053274245f8749cf5bc54b6a752" address="unix:///run/containerd/s/f0a1048751b879232aebe6622ddcac6b44b3159634f172f90296f698df196ec1" protocol=ttrpc version=3 Aug 13 00:35:41.044891 systemd[1]: Started cri-containerd-18879de7fe395b78c4a47018e4c243d64c809053274245f8749cf5bc54b6a752.scope - libcontainer container 18879de7fe395b78c4a47018e4c243d64c809053274245f8749cf5bc54b6a752. 
Aug 13 00:35:41.070980 containerd[1566]: time="2025-08-13T00:35:41.070932210Z" level=info msg="StartContainer for \"18879de7fe395b78c4a47018e4c243d64c809053274245f8749cf5bc54b6a752\" returns successfully" Aug 13 00:35:41.523065 systemd-networkd[1481]: calid17b3b59303: Gained IPv6LL Aug 13 00:35:41.730821 containerd[1566]: time="2025-08-13T00:35:41.730773375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5hrzz,Uid:0c695393-6132-4469-8782-33ba9afa51c3,Namespace:calico-system,Attempt:0,}" Aug 13 00:35:41.821562 systemd-networkd[1481]: cali76406f4e81c: Link UP Aug 13 00:35:41.822915 systemd-networkd[1481]: cali76406f4e81c: Gained carrier Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.767 [INFO][4410] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0 goldmane-768f4c5c69- calico-system 0c695393-6132-4469-8782-33ba9afa51c3 846 0 2025-08-13 00:35:17 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b goldmane-768f4c5c69-5hrzz eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali76406f4e81c [] [] }} ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.767 [INFO][4410] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" Aug 13 
00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.788 [INFO][4423] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" HandleID="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.788 [INFO][4423] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" HandleID="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a8090), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-b-5ba4a9a74b", "pod":"goldmane-768f4c5c69-5hrzz", "timestamp":"2025-08-13 00:35:41.788757918 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.788 [INFO][4423] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.788 [INFO][4423] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.789 [INFO][4423] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b' Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.795 [INFO][4423] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.799 [INFO][4423] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.803 [INFO][4423] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.804 [INFO][4423] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.806 [INFO][4423] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.806 [INFO][4423] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.808 [INFO][4423] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0 Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.811 [INFO][4423] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.816 [INFO][4423] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.46.68/26] block=192.168.46.64/26 handle="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.816 [INFO][4423] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.68/26] handle="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.816 [INFO][4423] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:35:41.837777 containerd[1566]: 2025-08-13 00:35:41.816 [INFO][4423] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.68/26] IPv6=[] ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" HandleID="k8s-pod-network.b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" Aug 13 00:35:41.839040 containerd[1566]: 2025-08-13 00:35:41.818 [INFO][4410] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"0c695393-6132-4469-8782-33ba9afa51c3", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"goldmane-768f4c5c69-5hrzz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.46.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali76406f4e81c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:41.839040 containerd[1566]: 2025-08-13 00:35:41.818 [INFO][4410] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.68/32] ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" Aug 13 00:35:41.839040 containerd[1566]: 2025-08-13 00:35:41.818 [INFO][4410] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76406f4e81c ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" Aug 13 00:35:41.839040 containerd[1566]: 2025-08-13 00:35:41.823 [INFO][4410] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" Aug 13 00:35:41.839040 containerd[1566]: 2025-08-13 00:35:41.824 [INFO][4410] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"0c695393-6132-4469-8782-33ba9afa51c3", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0", Pod:"goldmane-768f4c5c69-5hrzz", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.46.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali76406f4e81c", MAC:"ae:67:33:1a:c5:1f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:41.839040 containerd[1566]: 2025-08-13 00:35:41.835 [INFO][4410] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" Namespace="calico-system" Pod="goldmane-768f4c5c69-5hrzz" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-goldmane--768f4c5c69--5hrzz-eth0" Aug 13 00:35:41.858292 containerd[1566]: time="2025-08-13T00:35:41.858251288Z" level=info msg="connecting to shim b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0" address="unix:///run/containerd/s/319ba20546e03c3913e6990cbd448040cb96a74b94e34fd2bf959286b3a81b8c" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:41.907814 systemd[1]: Started cri-containerd-b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0.scope - libcontainer container b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0. Aug 13 00:35:41.930712 kubelet[2755]: I0813 00:35:41.930650 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-22k9t" podStartSLOduration=35.923301831 podStartE2EDuration="35.923301831s" podCreationTimestamp="2025-08-13 00:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:35:41.922772131 +0000 UTC m=+42.300358988" watchObservedRunningTime="2025-08-13 00:35:41.923301831 +0000 UTC m=+42.300888688" Aug 13 00:35:41.994460 containerd[1566]: time="2025-08-13T00:35:41.994419308Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-5hrzz,Uid:0c695393-6132-4469-8782-33ba9afa51c3,Namespace:calico-system,Attempt:0,} returns sandbox id \"b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0\"" Aug 13 00:35:42.730301 containerd[1566]: time="2025-08-13T00:35:42.730204695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c5f9dd4c5-hhmqc,Uid:64302c8f-d4eb-438d-8d05-03180a2fc1dd,Namespace:calico-system,Attempt:0,}" Aug 13 00:35:42.802946 systemd-networkd[1481]: calidb906919b1f: Gained IPv6LL Aug 13 
00:35:42.877894 systemd-networkd[1481]: calid302af40ac7: Link UP Aug 13 00:35:42.879844 systemd-networkd[1481]: calid302af40ac7: Gained carrier Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.785 [INFO][4491] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0 calico-kube-controllers-7c5f9dd4c5- calico-system 64302c8f-d4eb-438d-8d05-03180a2fc1dd 844 0 2025-08-13 00:35:18 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7c5f9dd4c5 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b calico-kube-controllers-7c5f9dd4c5-hhmqc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid302af40ac7 [] [] }} ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.785 [INFO][4491] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.815 [INFO][4504] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" HandleID="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" 
Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.815 [INFO][4504] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" HandleID="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024eff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-b-5ba4a9a74b", "pod":"calico-kube-controllers-7c5f9dd4c5-hhmqc", "timestamp":"2025-08-13 00:35:42.815335013 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.816 [INFO][4504] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.816 [INFO][4504] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.816 [INFO][4504] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b' Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.825 [INFO][4504] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.831 [INFO][4504] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.838 [INFO][4504] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.841 [INFO][4504] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.844 [INFO][4504] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.844 [INFO][4504] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.847 [INFO][4504] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5 Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.856 [INFO][4504] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.868 [INFO][4504] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.46.69/26] block=192.168.46.64/26 handle="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.869 [INFO][4504] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.69/26] handle="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.869 [INFO][4504] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:35:42.899888 containerd[1566]: 2025-08-13 00:35:42.870 [INFO][4504] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.69/26] IPv6=[] ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" HandleID="k8s-pod-network.230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" Aug 13 00:35:42.902248 containerd[1566]: 2025-08-13 00:35:42.873 [INFO][4491] cni-plugin/k8s.go 418: Populated endpoint ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0", GenerateName:"calico-kube-controllers-7c5f9dd4c5-", Namespace:"calico-system", SelfLink:"", UID:"64302c8f-d4eb-438d-8d05-03180a2fc1dd", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c5f9dd4c5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"calico-kube-controllers-7c5f9dd4c5-hhmqc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.46.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid302af40ac7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:42.902248 containerd[1566]: 2025-08-13 00:35:42.873 [INFO][4491] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.69/32] ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" Aug 13 00:35:42.902248 containerd[1566]: 2025-08-13 00:35:42.873 [INFO][4491] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid302af40ac7 ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" Aug 13 00:35:42.902248 containerd[1566]: 2025-08-13 00:35:42.879 [INFO][4491] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" Aug 13 00:35:42.902248 containerd[1566]: 2025-08-13 00:35:42.883 [INFO][4491] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0", GenerateName:"calico-kube-controllers-7c5f9dd4c5-", Namespace:"calico-system", SelfLink:"", UID:"64302c8f-d4eb-438d-8d05-03180a2fc1dd", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c5f9dd4c5", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5", Pod:"calico-kube-controllers-7c5f9dd4c5-hhmqc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.46.69/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid302af40ac7", MAC:"ca:3d:b6:1b:16:8c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:42.902248 containerd[1566]: 2025-08-13 00:35:42.895 [INFO][4491] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" Namespace="calico-system" Pod="calico-kube-controllers-7c5f9dd4c5-hhmqc" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--kube--controllers--7c5f9dd4c5--hhmqc-eth0" Aug 13 00:35:42.924987 containerd[1566]: time="2025-08-13T00:35:42.924906544Z" level=info msg="connecting to shim 230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5" address="unix:///run/containerd/s/a08de796faf2184a21b77366176b9bb9cb8e0ce550c45c49fbff38611eb162f0" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:42.954928 systemd[1]: Started cri-containerd-230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5.scope - libcontainer container 230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5. 
Aug 13 00:35:43.007797 containerd[1566]: time="2025-08-13T00:35:43.007562329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c5f9dd4c5-hhmqc,Uid:64302c8f-d4eb-438d-8d05-03180a2fc1dd,Namespace:calico-system,Attempt:0,} returns sandbox id \"230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5\"" Aug 13 00:35:43.059296 systemd-networkd[1481]: cali76406f4e81c: Gained IPv6LL Aug 13 00:35:44.056146 containerd[1566]: time="2025-08-13T00:35:44.056099031Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:44.057084 containerd[1566]: time="2025-08-13T00:35:44.057060377Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Aug 13 00:35:44.057926 containerd[1566]: time="2025-08-13T00:35:44.057888467Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:44.062293 containerd[1566]: time="2025-08-13T00:35:44.061879644Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:44.062711 containerd[1566]: time="2025-08-13T00:35:44.062672544Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 3.275930716s" Aug 13 00:35:44.062745 containerd[1566]: time="2025-08-13T00:35:44.062712403Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns 
image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Aug 13 00:35:44.070424 containerd[1566]: time="2025-08-13T00:35:44.070401622Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 13 00:35:44.092130 containerd[1566]: time="2025-08-13T00:35:44.092080637Z" level=info msg="CreateContainer within sandbox \"38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:35:44.099524 containerd[1566]: time="2025-08-13T00:35:44.099500916Z" level=info msg="Container ddb118a9675e24af3d1a9486b6a5ce87dd529de4bc10c44864c246132b81d41a: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:44.106138 containerd[1566]: time="2025-08-13T00:35:44.106105281Z" level=info msg="CreateContainer within sandbox \"38800da3c50ab9a525ce6977ca386ec3929fba837ec2212deb5546668ebdaec6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ddb118a9675e24af3d1a9486b6a5ce87dd529de4bc10c44864c246132b81d41a\"" Aug 13 00:35:44.107561 containerd[1566]: time="2025-08-13T00:35:44.106640954Z" level=info msg="StartContainer for \"ddb118a9675e24af3d1a9486b6a5ce87dd529de4bc10c44864c246132b81d41a\"" Aug 13 00:35:44.107809 containerd[1566]: time="2025-08-13T00:35:44.107781070Z" level=info msg="connecting to shim ddb118a9675e24af3d1a9486b6a5ce87dd529de4bc10c44864c246132b81d41a" address="unix:///run/containerd/s/ecde4c1ac489505f8b76f3027ffd33e7b35ee9447f92aa2897bb4ad1f3366b4a" protocol=ttrpc version=3 Aug 13 00:35:44.133819 systemd[1]: Started cri-containerd-ddb118a9675e24af3d1a9486b6a5ce87dd529de4bc10c44864c246132b81d41a.scope - libcontainer container ddb118a9675e24af3d1a9486b6a5ce87dd529de4bc10c44864c246132b81d41a. 
Aug 13 00:35:44.173682 containerd[1566]: time="2025-08-13T00:35:44.173516260Z" level=info msg="StartContainer for \"ddb118a9675e24af3d1a9486b6a5ce87dd529de4bc10c44864c246132b81d41a\" returns successfully" Aug 13 00:35:44.595388 systemd-networkd[1481]: calid302af40ac7: Gained IPv6LL Aug 13 00:35:44.731152 containerd[1566]: time="2025-08-13T00:35:44.731102081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-m42j7,Uid:7f11d4ba-b27a-4a80-a109-96b122ee11da,Namespace:calico-system,Attempt:0,}" Aug 13 00:35:44.732490 containerd[1566]: time="2025-08-13T00:35:44.731853837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-zrjtl,Uid:6c4854d0-1965-49ea-9f27-a943b10dac0f,Namespace:calico-apiserver,Attempt:0,}" Aug 13 00:35:44.732891 containerd[1566]: time="2025-08-13T00:35:44.731277412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-f7knb,Uid:da2b92e7-ec49-43a2-b5c9-35d5000b7d8f,Namespace:kube-system,Attempt:0,}" Aug 13 00:35:44.928739 systemd-networkd[1481]: cali3d1382d59bc: Link UP Aug 13 00:35:44.932727 systemd-networkd[1481]: cali3d1382d59bc: Gained carrier Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.839 [INFO][4617] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0 calico-apiserver-8545d6cbb4- calico-apiserver 6c4854d0-1965-49ea-9f27-a943b10dac0f 845 0 2025-08-13 00:35:15 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8545d6cbb4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b calico-apiserver-8545d6cbb4-zrjtl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3d1382d59bc [] [] }} 
ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.840 [INFO][4617] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.874 [INFO][4654] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" HandleID="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.876 [INFO][4654] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" HandleID="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-b-5ba4a9a74b", "pod":"calico-apiserver-8545d6cbb4-zrjtl", "timestamp":"2025-08-13 00:35:44.874511137 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.877 [INFO][4654] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.877 [INFO][4654] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.877 [INFO][4654] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b' Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.888 [INFO][4654] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.893 [INFO][4654] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.898 [INFO][4654] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.900 [INFO][4654] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.903 [INFO][4654] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.903 [INFO][4654] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.906 [INFO][4654] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2 Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.911 [INFO][4654] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 
handle="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.917 [INFO][4654] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.70/26] block=192.168.46.64/26 handle="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.917 [INFO][4654] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.70/26] handle="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.917 [INFO][4654] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:35:44.956793 containerd[1566]: 2025-08-13 00:35:44.917 [INFO][4654] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.70/26] IPv6=[] ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" HandleID="k8s-pod-network.d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" Aug 13 00:35:44.959780 containerd[1566]: 2025-08-13 00:35:44.922 [INFO][4617] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0", GenerateName:"calico-apiserver-8545d6cbb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"6c4854d0-1965-49ea-9f27-a943b10dac0f", ResourceVersion:"845", Generation:0, 
CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8545d6cbb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"calico-apiserver-8545d6cbb4-zrjtl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.46.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3d1382d59bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:44.959780 containerd[1566]: 2025-08-13 00:35:44.922 [INFO][4617] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.70/32] ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" Aug 13 00:35:44.959780 containerd[1566]: 2025-08-13 00:35:44.922 [INFO][4617] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3d1382d59bc ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" Aug 13 00:35:44.959780 containerd[1566]: 2025-08-13 00:35:44.933 
[INFO][4617] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" Aug 13 00:35:44.959780 containerd[1566]: 2025-08-13 00:35:44.934 [INFO][4617] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0", GenerateName:"calico-apiserver-8545d6cbb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"6c4854d0-1965-49ea-9f27-a943b10dac0f", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8545d6cbb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2", Pod:"calico-apiserver-8545d6cbb4-zrjtl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.46.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3d1382d59bc", MAC:"82:7f:c8:ac:71:9a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:44.959780 containerd[1566]: 2025-08-13 00:35:44.952 [INFO][4617] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" Namespace="calico-apiserver" Pod="calico-apiserver-8545d6cbb4-zrjtl" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-calico--apiserver--8545d6cbb4--zrjtl-eth0" Aug 13 00:35:45.026216 containerd[1566]: time="2025-08-13T00:35:45.025865200Z" level=info msg="connecting to shim d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2" address="unix:///run/containerd/s/312d894536c542c42684ed0f4197852997b7d46bd9f6f9c3cb2eb25359985af9" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:45.064669 systemd-networkd[1481]: calic16cf9134e6: Link UP Aug 13 00:35:45.066090 systemd-networkd[1481]: calic16cf9134e6: Gained carrier Aug 13 00:35:45.085899 kubelet[2755]: I0813 00:35:45.085580 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8545d6cbb4-c8wdj" podStartSLOduration=26.014466609 podStartE2EDuration="30.085515298s" podCreationTimestamp="2025-08-13 00:35:15 +0000 UTC" firstStartedPulling="2025-08-13 00:35:39.993404222 +0000 UTC m=+40.370991079" lastFinishedPulling="2025-08-13 00:35:44.064452911 +0000 UTC m=+44.442039768" observedRunningTime="2025-08-13 00:35:44.986497984 +0000 UTC m=+45.364084861" watchObservedRunningTime="2025-08-13 00:35:45.085515298 +0000 UTC m=+45.463102156" Aug 13 00:35:45.087930 systemd[1]: Started cri-containerd-d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2.scope - libcontainer container 
d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2. Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.843 [INFO][4624] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0 csi-node-driver- calico-system 7f11d4ba-b27a-4a80-a109-96b122ee11da 736 0 2025-08-13 00:35:18 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b csi-node-driver-m42j7 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic16cf9134e6 [] [] }} ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.843 [INFO][4624] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.888 [INFO][4658] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" HandleID="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.888 [INFO][4658] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" HandleID="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5640), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-b-5ba4a9a74b", "pod":"csi-node-driver-m42j7", "timestamp":"2025-08-13 00:35:44.888143275 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.888 [INFO][4658] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.918 [INFO][4658] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.918 [INFO][4658] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b' Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:44.991 [INFO][4658] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.000 [INFO][4658] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.005 [INFO][4658] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.013 [INFO][4658] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.018 [INFO][4658] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.020 [INFO][4658] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.022 [INFO][4658] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.033 [INFO][4658] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.046 [INFO][4658] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.46.71/26] block=192.168.46.64/26 handle="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.046 [INFO][4658] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.71/26] handle="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.046 [INFO][4658] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 13 00:35:45.091517 containerd[1566]: 2025-08-13 00:35:45.051 [INFO][4658] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.71/26] IPv6=[] ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" HandleID="k8s-pod-network.bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" Aug 13 00:35:45.093307 containerd[1566]: 2025-08-13 00:35:45.058 [INFO][4624] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7f11d4ba-b27a-4a80-a109-96b122ee11da", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"csi-node-driver-m42j7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.46.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic16cf9134e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:45.093307 containerd[1566]: 2025-08-13 00:35:45.058 [INFO][4624] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.71/32] ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" Aug 13 00:35:45.093307 containerd[1566]: 2025-08-13 00:35:45.058 [INFO][4624] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic16cf9134e6 ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" Aug 13 00:35:45.093307 containerd[1566]: 2025-08-13 00:35:45.067 [INFO][4624] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" Aug 13 00:35:45.093307 
containerd[1566]: 2025-08-13 00:35:45.071 [INFO][4624] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7f11d4ba-b27a-4a80-a109-96b122ee11da", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce", Pod:"csi-node-driver-m42j7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.46.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic16cf9134e6", MAC:"fe:4b:90:5f:2c:24", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:45.093307 containerd[1566]: 
2025-08-13 00:35:45.085 [INFO][4624] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" Namespace="calico-system" Pod="csi-node-driver-m42j7" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-csi--node--driver--m42j7-eth0" Aug 13 00:35:45.130971 containerd[1566]: time="2025-08-13T00:35:45.130893756Z" level=info msg="connecting to shim bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce" address="unix:///run/containerd/s/32a258ea25386327a8ee6e8350cc96fe2d898c359b4e404c865c64352b703da3" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:45.160232 systemd-networkd[1481]: calia24bd6ad7cf: Link UP Aug 13 00:35:45.160404 systemd-networkd[1481]: calia24bd6ad7cf: Gained carrier Aug 13 00:35:45.197554 systemd[1]: Started cri-containerd-bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce.scope - libcontainer container bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce. 
Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:44.839 [INFO][4628] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0 coredns-674b8bbfcf- kube-system da2b92e7-ec49-43a2-b5c9-35d5000b7d8f 842 0 2025-08-13 00:35:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-b-5ba4a9a74b coredns-674b8bbfcf-f7knb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia24bd6ad7cf [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:44.840 [INFO][4628] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:44.899 [INFO][4657] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" HandleID="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:44.899 [INFO][4657] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" HandleID="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" 
Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf1a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-b-5ba4a9a74b", "pod":"coredns-674b8bbfcf-f7knb", "timestamp":"2025-08-13 00:35:44.899846067 +0000 UTC"}, Hostname:"ci-4372-1-0-b-5ba4a9a74b", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:44.900 [INFO][4657] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.049 [INFO][4657] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.050 [INFO][4657] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-b-5ba4a9a74b' Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.089 [INFO][4657] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.101 [INFO][4657] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.108 [INFO][4657] ipam/ipam.go 511: Trying affinity for 192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.111 [INFO][4657] ipam/ipam.go 158: Attempting to load block cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.114 [INFO][4657] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.46.64/26 host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 
containerd[1566]: 2025-08-13 00:35:45.114 [INFO][4657] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.46.64/26 handle="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.117 [INFO][4657] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5 Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.128 [INFO][4657] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.46.64/26 handle="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.144 [INFO][4657] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.46.72/26] block=192.168.46.64/26 handle="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.144 [INFO][4657] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.46.72/26] handle="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" host="ci-4372-1-0-b-5ba4a9a74b" Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.144 [INFO][4657] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
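The IPAM records above show Calico trying the host's affinity for block `192.168.46.64/26`, loading it, and claiming `192.168.46.72/26` from it under the host-wide lock. A minimal sanity check of those two values with the Python stdlib (the addresses are copied verbatim from the log; the /26 block size matches Calico's default block carving, which is an assumption about configuration, not something the log states):

```python
import ipaddress

# Values taken verbatim from the ipam/ipam.go records above.
block = ipaddress.ip_network("192.168.46.64/26")       # host-affine block
claimed = ipaddress.ip_interface("192.168.46.72/26")   # address Calico claimed

# The claimed address must fall inside the affine block; a /26 holds
# 64 addresses, so this host can serve up to 64 pod IPs from one block.
offset = int(claimed.ip) - int(block.network_address)
assert claimed.ip in block
print(f"{claimed.ip} is address #{offset} of {block} "
      f"({block.num_addresses} addresses)")
```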
Aug 13 00:35:45.201575 containerd[1566]: 2025-08-13 00:35:45.144 [INFO][4657] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.46.72/26] IPv6=[] ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" HandleID="k8s-pod-network.325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Workload="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" Aug 13 00:35:45.202734 containerd[1566]: 2025-08-13 00:35:45.149 [INFO][4628] cni-plugin/k8s.go 418: Populated endpoint ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"da2b92e7-ec49-43a2-b5c9-35d5000b7d8f", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"", Pod:"coredns-674b8bbfcf-f7knb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calia24bd6ad7cf", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:45.202734 containerd[1566]: 2025-08-13 00:35:45.149 [INFO][4628] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.46.72/32] ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" Aug 13 00:35:45.202734 containerd[1566]: 2025-08-13 00:35:45.149 [INFO][4628] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia24bd6ad7cf ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" Aug 13 00:35:45.202734 containerd[1566]: 2025-08-13 00:35:45.168 [INFO][4628] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" Aug 13 00:35:45.202734 containerd[1566]: 2025-08-13 00:35:45.173 [INFO][4628] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" 
WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"da2b92e7-ec49-43a2-b5c9-35d5000b7d8f", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.August, 13, 0, 35, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-b-5ba4a9a74b", ContainerID:"325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5", Pod:"coredns-674b8bbfcf-f7knb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.46.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia24bd6ad7cf", MAC:"92:e3:e0:09:eb:94", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 13 00:35:45.202734 
containerd[1566]: 2025-08-13 00:35:45.196 [INFO][4628] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" Namespace="kube-system" Pod="coredns-674b8bbfcf-f7knb" WorkloadEndpoint="ci--4372--1--0--b--5ba4a9a74b-k8s-coredns--674b8bbfcf--f7knb-eth0" Aug 13 00:35:45.230520 containerd[1566]: time="2025-08-13T00:35:45.228396673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8545d6cbb4-zrjtl,Uid:6c4854d0-1965-49ea-9f27-a943b10dac0f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2\"" Aug 13 00:35:45.236503 containerd[1566]: time="2025-08-13T00:35:45.236236261Z" level=info msg="CreateContainer within sandbox \"d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 13 00:35:45.239301 containerd[1566]: time="2025-08-13T00:35:45.239255198Z" level=info msg="connecting to shim 325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5" address="unix:///run/containerd/s/84e4d2f857bc8cd87007dbbd2fe8b3d09f54218b8c0c9630a45b1722bfa33f13" namespace=k8s.io protocol=ttrpc version=3 Aug 13 00:35:45.251690 containerd[1566]: time="2025-08-13T00:35:45.251444183Z" level=info msg="Container 96c0aeed88d3a21b391169d09680d273465a90a3b3c97aecae837804db97347e: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:45.260099 containerd[1566]: time="2025-08-13T00:35:45.260067278Z" level=info msg="CreateContainer within sandbox \"d6812db05a0a6c3531ba02bf7a9b2daf7988f130d5b1c42b9afbe9dd7747a2a2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"96c0aeed88d3a21b391169d09680d273465a90a3b3c97aecae837804db97347e\"" Aug 13 00:35:45.262254 containerd[1566]: time="2025-08-13T00:35:45.261456656Z" level=info msg="StartContainer for 
\"96c0aeed88d3a21b391169d09680d273465a90a3b3c97aecae837804db97347e\"" Aug 13 00:35:45.263710 containerd[1566]: time="2025-08-13T00:35:45.263606485Z" level=info msg="connecting to shim 96c0aeed88d3a21b391169d09680d273465a90a3b3c97aecae837804db97347e" address="unix:///run/containerd/s/312d894536c542c42684ed0f4197852997b7d46bd9f6f9c3cb2eb25359985af9" protocol=ttrpc version=3 Aug 13 00:35:45.281817 systemd[1]: Started cri-containerd-325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5.scope - libcontainer container 325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5. Aug 13 00:35:45.289277 systemd[1]: Started cri-containerd-96c0aeed88d3a21b391169d09680d273465a90a3b3c97aecae837804db97347e.scope - libcontainer container 96c0aeed88d3a21b391169d09680d273465a90a3b3c97aecae837804db97347e. Aug 13 00:35:45.290712 containerd[1566]: time="2025-08-13T00:35:45.290509967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-m42j7,Uid:7f11d4ba-b27a-4a80-a109-96b122ee11da,Namespace:calico-system,Attempt:0,} returns sandbox id \"bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce\"" Aug 13 00:35:45.353444 containerd[1566]: time="2025-08-13T00:35:45.353412891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-f7knb,Uid:da2b92e7-ec49-43a2-b5c9-35d5000b7d8f,Namespace:kube-system,Attempt:0,} returns sandbox id \"325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5\"" Aug 13 00:35:45.367737 containerd[1566]: time="2025-08-13T00:35:45.367714231Z" level=info msg="CreateContainer within sandbox \"325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 13 00:35:45.375581 containerd[1566]: time="2025-08-13T00:35:45.375151554Z" level=info msg="Container 1cdfbe362a3365bf4e9c94a5097543bc902c9925c54e6f6c5e3f638db961656c: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:45.380939 containerd[1566]: 
time="2025-08-13T00:35:45.380917805Z" level=info msg="CreateContainer within sandbox \"325bb7def033c0721706a8f2a9b60a6e77d141f423b601c52c319acfd0dd99d5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1cdfbe362a3365bf4e9c94a5097543bc902c9925c54e6f6c5e3f638db961656c\"" Aug 13 00:35:45.381417 containerd[1566]: time="2025-08-13T00:35:45.381401152Z" level=info msg="StartContainer for \"1cdfbe362a3365bf4e9c94a5097543bc902c9925c54e6f6c5e3f638db961656c\"" Aug 13 00:35:45.382076 containerd[1566]: time="2025-08-13T00:35:45.382058536Z" level=info msg="connecting to shim 1cdfbe362a3365bf4e9c94a5097543bc902c9925c54e6f6c5e3f638db961656c" address="unix:///run/containerd/s/84e4d2f857bc8cd87007dbbd2fe8b3d09f54218b8c0c9630a45b1722bfa33f13" protocol=ttrpc version=3 Aug 13 00:35:45.421568 systemd[1]: Started cri-containerd-1cdfbe362a3365bf4e9c94a5097543bc902c9925c54e6f6c5e3f638db961656c.scope - libcontainer container 1cdfbe362a3365bf4e9c94a5097543bc902c9925c54e6f6c5e3f638db961656c. 
Aug 13 00:35:45.443647 containerd[1566]: time="2025-08-13T00:35:45.443521130Z" level=info msg="StartContainer for \"96c0aeed88d3a21b391169d09680d273465a90a3b3c97aecae837804db97347e\" returns successfully" Aug 13 00:35:45.510271 containerd[1566]: time="2025-08-13T00:35:45.510176307Z" level=info msg="StartContainer for \"1cdfbe362a3365bf4e9c94a5097543bc902c9925c54e6f6c5e3f638db961656c\" returns successfully" Aug 13 00:35:45.953597 kubelet[2755]: I0813 00:35:45.953284 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-f7knb" podStartSLOduration=39.953269666 podStartE2EDuration="39.953269666s" podCreationTimestamp="2025-08-13 00:35:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:35:45.952229907 +0000 UTC m=+46.329816764" watchObservedRunningTime="2025-08-13 00:35:45.953269666 +0000 UTC m=+46.330856523" Aug 13 00:35:45.955353 kubelet[2755]: I0813 00:35:45.955278 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8545d6cbb4-zrjtl" podStartSLOduration=30.955235597 podStartE2EDuration="30.955235597s" podCreationTimestamp="2025-08-13 00:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-13 00:35:45.940197234 +0000 UTC m=+46.317784091" watchObservedRunningTime="2025-08-13 00:35:45.955235597 +0000 UTC m=+46.332822455" Aug 13 00:35:46.258862 systemd-networkd[1481]: cali3d1382d59bc: Gained IPv6LL Aug 13 00:35:46.643004 systemd-networkd[1481]: calia24bd6ad7cf: Gained IPv6LL Aug 13 00:35:47.090940 systemd-networkd[1481]: calic16cf9134e6: Gained IPv6LL Aug 13 00:35:47.092685 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount847600027.mount: Deactivated successfully. 
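The kubelet `pod_startup_latency_tracker` records above report `podStartSLOduration=39.953269666` for the coredns pod; that figure appears to be simply `watchObservedRunningTime` (00:35:45.953269666) minus `podCreationTimestamp` (00:35:06), since no image pull occurred (`firstStartedPulling`/`lastFinishedPulling` are zero times). A sketch reproducing the arithmetic from the logged timestamps (truncated to microseconds, since `datetime` does not carry nanoseconds):

```python
from datetime import datetime, timezone

# Timestamps copied from the kubelet record above (nanoseconds truncated).
created = datetime(2025, 8, 13, 0, 35, 6, tzinfo=timezone.utc)
observed = datetime(2025, 8, 13, 0, 35, 45, 953269, tzinfo=timezone.utc)

slo_duration = (observed - created).total_seconds()
print(f"podStartSLOduration ~ {slo_duration:.6f}s")  # log reports 39.953269666s
```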
Aug 13 00:35:47.113081 containerd[1566]: time="2025-08-13T00:35:47.113042437Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:47.113972 containerd[1566]: time="2025-08-13T00:35:47.113944203Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Aug 13 00:35:47.115085 containerd[1566]: time="2025-08-13T00:35:47.115049225Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:47.145886 containerd[1566]: time="2025-08-13T00:35:47.145817042Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:47.146747 containerd[1566]: time="2025-08-13T00:35:47.146396846Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 3.075966827s" Aug 13 00:35:47.146747 containerd[1566]: time="2025-08-13T00:35:47.146421927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Aug 13 00:35:47.147322 containerd[1566]: time="2025-08-13T00:35:47.147300357Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 13 00:35:47.151470 containerd[1566]: time="2025-08-13T00:35:47.151031351Z" level=info msg="CreateContainer within sandbox 
\"2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 13 00:35:47.180716 containerd[1566]: time="2025-08-13T00:35:47.179502672Z" level=info msg="Container c259497d73c9cda13ecd605530ae4694d73aa258f43e10820ae3ae82dde2ade1: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:47.188040 containerd[1566]: time="2025-08-13T00:35:47.188008217Z" level=info msg="CreateContainer within sandbox \"2176d9ae722c8e5c7477c490419cfd4882a1efe5fdbfd3e1ea2aa57265a52d83\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"c259497d73c9cda13ecd605530ae4694d73aa258f43e10820ae3ae82dde2ade1\"" Aug 13 00:35:47.188555 containerd[1566]: time="2025-08-13T00:35:47.188522221Z" level=info msg="StartContainer for \"c259497d73c9cda13ecd605530ae4694d73aa258f43e10820ae3ae82dde2ade1\"" Aug 13 00:35:47.189363 containerd[1566]: time="2025-08-13T00:35:47.189337374Z" level=info msg="connecting to shim c259497d73c9cda13ecd605530ae4694d73aa258f43e10820ae3ae82dde2ade1" address="unix:///run/containerd/s/9bc8072a0d356a1e124ff4b62542b33b80667d9c3e5a92d7fe6e92859e259da1" protocol=ttrpc version=3 Aug 13 00:35:47.214832 systemd[1]: Started cri-containerd-c259497d73c9cda13ecd605530ae4694d73aa258f43e10820ae3ae82dde2ade1.scope - libcontainer container c259497d73c9cda13ecd605530ae4694d73aa258f43e10820ae3ae82dde2ade1. Aug 13 00:35:47.272392 containerd[1566]: time="2025-08-13T00:35:47.272288655Z" level=info msg="StartContainer for \"c259497d73c9cda13ecd605530ae4694d73aa258f43e10820ae3ae82dde2ade1\" returns successfully" Aug 13 00:35:50.374394 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2319241158.mount: Deactivated successfully. 
Aug 13 00:35:50.808946 containerd[1566]: time="2025-08-13T00:35:50.808590965Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Aug 13 00:35:50.812923 containerd[1566]: time="2025-08-13T00:35:50.811654448Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:50.833157 containerd[1566]: time="2025-08-13T00:35:50.833112341Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:50.835358 containerd[1566]: time="2025-08-13T00:35:50.835329106Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:50.835886 containerd[1566]: time="2025-08-13T00:35:50.835867463Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.688539911s" Aug 13 00:35:50.836388 containerd[1566]: time="2025-08-13T00:35:50.835966660Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Aug 13 00:35:50.837237 containerd[1566]: time="2025-08-13T00:35:50.837224092Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 13 00:35:50.840470 containerd[1566]: time="2025-08-13T00:35:50.840432714Z" level=info msg="CreateContainer within sandbox 
\"b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 13 00:35:50.852340 containerd[1566]: time="2025-08-13T00:35:50.852205926Z" level=info msg="Container 56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:50.867919 containerd[1566]: time="2025-08-13T00:35:50.867878889Z" level=info msg="CreateContainer within sandbox \"b3c48575797f9a611205ca4aa33e932354d9b5fe09182371a8776327d0bc16c0\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\"" Aug 13 00:35:50.868471 containerd[1566]: time="2025-08-13T00:35:50.868445722Z" level=info msg="StartContainer for \"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\"" Aug 13 00:35:50.876565 containerd[1566]: time="2025-08-13T00:35:50.876505593Z" level=info msg="connecting to shim 56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9" address="unix:///run/containerd/s/319ba20546e03c3913e6990cbd448040cb96a74b94e34fd2bf959286b3a81b8c" protocol=ttrpc version=3 Aug 13 00:35:50.925938 systemd[1]: Started cri-containerd-56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9.scope - libcontainer container 56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9. 
Aug 13 00:35:51.031388 containerd[1566]: time="2025-08-13T00:35:51.031318198Z" level=info msg="StartContainer for \"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" returns successfully" Aug 13 00:35:52.009896 kubelet[2755]: I0813 00:35:52.009296 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7bc44c9f66-42gjz" podStartSLOduration=6.762673883 podStartE2EDuration="14.995832339s" podCreationTimestamp="2025-08-13 00:35:37 +0000 UTC" firstStartedPulling="2025-08-13 00:35:38.914043836 +0000 UTC m=+39.291630692" lastFinishedPulling="2025-08-13 00:35:47.147202291 +0000 UTC m=+47.524789148" observedRunningTime="2025-08-13 00:35:47.9610453 +0000 UTC m=+48.338632197" watchObservedRunningTime="2025-08-13 00:35:51.995832339 +0000 UTC m=+52.373419197" Aug 13 00:35:52.009896 kubelet[2755]: I0813 00:35:52.009535 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-5hrzz" podStartSLOduration=26.16892405 podStartE2EDuration="35.009527188s" podCreationTimestamp="2025-08-13 00:35:17 +0000 UTC" firstStartedPulling="2025-08-13 00:35:41.996441008 +0000 UTC m=+42.374027865" lastFinishedPulling="2025-08-13 00:35:50.837044146 +0000 UTC m=+51.214631003" observedRunningTime="2025-08-13 00:35:51.995296859 +0000 UTC m=+52.372883716" watchObservedRunningTime="2025-08-13 00:35:52.009527188 +0000 UTC m=+52.387114046" Aug 13 00:35:52.140195 containerd[1566]: time="2025-08-13T00:35:52.140117212Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"10e5a0bd33190b0056b4035a56edc55a690bfcd72c3c167611196856ba5fa334\" pid:5030 exit_status:1 exited_at:{seconds:1755045352 nanos:129852528}" Aug 13 00:35:53.059232 containerd[1566]: time="2025-08-13T00:35:53.059192166Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"d1837a80c6e505410365219ee33bbe134f2bdedce44612bd18a2c07b33b3fb42\" pid:5052 exit_status:1 exited_at:{seconds:1755045353 nanos:58926561}" Aug 13 00:35:54.121796 containerd[1566]: time="2025-08-13T00:35:54.121760914Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"269e9df0537e0bd5e61d6e7f48d9364456d8b2f9789d5fbf063318cb7290bfd6\" pid:5078 exit_status:1 exited_at:{seconds:1755045354 nanos:121177804}" Aug 13 00:35:54.311743 containerd[1566]: time="2025-08-13T00:35:54.311174197Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:54.311949 containerd[1566]: time="2025-08-13T00:35:54.311931090Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Aug 13 00:35:54.313757 containerd[1566]: time="2025-08-13T00:35:54.313596435Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:54.316385 containerd[1566]: time="2025-08-13T00:35:54.316321179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 13 00:35:54.316951 containerd[1566]: time="2025-08-13T00:35:54.316772519Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 3.47930749s" Aug 13 00:35:54.317053 containerd[1566]: time="2025-08-13T00:35:54.317033344Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Aug 13 00:35:54.339893 containerd[1566]: time="2025-08-13T00:35:54.339863782Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 13 00:35:54.444202 containerd[1566]: time="2025-08-13T00:35:54.443995434Z" level=info msg="CreateContainer within sandbox \"230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 13 00:35:54.454758 containerd[1566]: time="2025-08-13T00:35:54.454005938Z" level=info msg="Container 528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece: CDI devices from CRI Config.CDIDevices: []" Aug 13 00:35:54.458900 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount401964988.mount: Deactivated successfully. 
Aug 13 00:35:54.461800 containerd[1566]: time="2025-08-13T00:35:54.461769630Z" level=info msg="CreateContainer within sandbox \"230d113b4df1f68e70958145e5135cc4cfcd3bc2be29f4bda65870dc45f867b5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\"" Aug 13 00:35:54.462580 containerd[1566]: time="2025-08-13T00:35:54.462531583Z" level=info msg="StartContainer for \"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\"" Aug 13 00:35:54.463866 containerd[1566]: time="2025-08-13T00:35:54.463780968Z" level=info msg="connecting to shim 528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece" address="unix:///run/containerd/s/a08de796faf2184a21b77366176b9bb9cb8e0ce550c45c49fbff38611eb162f0" protocol=ttrpc version=3 Aug 13 00:35:54.488974 systemd[1]: Started cri-containerd-528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece.scope - libcontainer container 528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece. 
Aug 13 00:35:54.547807 containerd[1566]: time="2025-08-13T00:35:54.547768398Z" level=info msg="StartContainer for \"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\" returns successfully"
Aug 13 00:35:55.100519 containerd[1566]: time="2025-08-13T00:35:55.100481307Z" level=info msg="TaskExit event in podsandbox handler container_id:\"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\" id:\"2f0d9b05dd2d79402ca47be4287e697960555150bc1e622bd394b90073150ca5\" pid:5146 exited_at:{seconds:1755045355 nanos:83589628}"
Aug 13 00:35:55.194362 kubelet[2755]: I0813 00:35:55.189515 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7c5f9dd4c5-hhmqc" podStartSLOduration=25.851336171 podStartE2EDuration="37.180397153s" podCreationTimestamp="2025-08-13 00:35:18 +0000 UTC" firstStartedPulling="2025-08-13 00:35:43.010368415 +0000 UTC m=+43.387955261" lastFinishedPulling="2025-08-13 00:35:54.339429386 +0000 UTC m=+54.717016243" observedRunningTime="2025-08-13 00:35:55.152760355 +0000 UTC m=+55.530347201" watchObservedRunningTime="2025-08-13 00:35:55.180397153 +0000 UTC m=+55.557984020"
Aug 13 00:35:55.897472 containerd[1566]: time="2025-08-13T00:35:55.897405079Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:55.903519 containerd[1566]: time="2025-08-13T00:35:55.898306515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190"
Aug 13 00:35:55.903519 containerd[1566]: time="2025-08-13T00:35:55.899226888Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:55.903841 containerd[1566]: time="2025-08-13T00:35:55.900901006Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.560807771s"
Aug 13 00:35:55.903841 containerd[1566]: time="2025-08-13T00:35:55.903768196Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\""
Aug 13 00:35:55.904956 containerd[1566]: time="2025-08-13T00:35:55.903927550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:55.915051 containerd[1566]: time="2025-08-13T00:35:55.914998120Z" level=info msg="CreateContainer within sandbox \"bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Aug 13 00:35:55.939658 containerd[1566]: time="2025-08-13T00:35:55.939603797Z" level=info msg="Container b6b5a87a083da76cffb404dc686056be7df1f9a48a674348dcd0fd32a045993f: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:55.970891 containerd[1566]: time="2025-08-13T00:35:55.970840657Z" level=info msg="CreateContainer within sandbox \"bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"b6b5a87a083da76cffb404dc686056be7df1f9a48a674348dcd0fd32a045993f\""
Aug 13 00:35:55.971573 containerd[1566]: time="2025-08-13T00:35:55.971551889Z" level=info msg="StartContainer for \"b6b5a87a083da76cffb404dc686056be7df1f9a48a674348dcd0fd32a045993f\""
Aug 13 00:35:55.973364 containerd[1566]: time="2025-08-13T00:35:55.973330693Z" level=info msg="connecting to shim b6b5a87a083da76cffb404dc686056be7df1f9a48a674348dcd0fd32a045993f" address="unix:///run/containerd/s/32a258ea25386327a8ee6e8350cc96fe2d898c359b4e404c865c64352b703da3" protocol=ttrpc version=3
Aug 13 00:35:55.999844 systemd[1]: Started cri-containerd-b6b5a87a083da76cffb404dc686056be7df1f9a48a674348dcd0fd32a045993f.scope - libcontainer container b6b5a87a083da76cffb404dc686056be7df1f9a48a674348dcd0fd32a045993f.
Aug 13 00:35:56.045856 containerd[1566]: time="2025-08-13T00:35:56.045817080Z" level=info msg="StartContainer for \"b6b5a87a083da76cffb404dc686056be7df1f9a48a674348dcd0fd32a045993f\" returns successfully"
Aug 13 00:35:56.058737 containerd[1566]: time="2025-08-13T00:35:56.058648373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\""
Aug 13 00:35:57.824995 containerd[1566]: time="2025-08-13T00:35:57.824923060Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:57.826238 containerd[1566]: time="2025-08-13T00:35:57.826187235Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784"
Aug 13 00:35:57.827610 containerd[1566]: time="2025-08-13T00:35:57.827425781Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:57.834324 containerd[1566]: time="2025-08-13T00:35:57.834281059Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 13 00:35:57.835353 containerd[1566]: time="2025-08-13T00:35:57.835309631Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 1.776577664s"
Aug 13 00:35:57.835468 containerd[1566]: time="2025-08-13T00:35:57.835445518Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\""
Aug 13 00:35:57.847319 containerd[1566]: time="2025-08-13T00:35:57.847274847Z" level=info msg="CreateContainer within sandbox \"bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Aug 13 00:35:57.858013 containerd[1566]: time="2025-08-13T00:35:57.857980927Z" level=info msg="Container f6d8d0ff97af5c9dcd72a21ec1cf2b0b72869805103a6e8ec8126a724655d866: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:35:57.867567 containerd[1566]: time="2025-08-13T00:35:57.867514011Z" level=info msg="CreateContainer within sandbox \"bd63123cd8cfb8219aad8b87039fe5789a05ef811f34c278a65531cc564ae9ce\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f6d8d0ff97af5c9dcd72a21ec1cf2b0b72869805103a6e8ec8126a724655d866\""
Aug 13 00:35:57.868139 containerd[1566]: time="2025-08-13T00:35:57.868124100Z" level=info msg="StartContainer for \"f6d8d0ff97af5c9dcd72a21ec1cf2b0b72869805103a6e8ec8126a724655d866\""
Aug 13 00:35:57.870185 containerd[1566]: time="2025-08-13T00:35:57.870166686Z" level=info msg="connecting to shim f6d8d0ff97af5c9dcd72a21ec1cf2b0b72869805103a6e8ec8126a724655d866" address="unix:///run/containerd/s/32a258ea25386327a8ee6e8350cc96fe2d898c359b4e404c865c64352b703da3" protocol=ttrpc version=3
Aug 13 00:35:57.909963 systemd[1]: Started cri-containerd-f6d8d0ff97af5c9dcd72a21ec1cf2b0b72869805103a6e8ec8126a724655d866.scope - libcontainer container f6d8d0ff97af5c9dcd72a21ec1cf2b0b72869805103a6e8ec8126a724655d866.
Aug 13 00:35:57.977679 containerd[1566]: time="2025-08-13T00:35:57.977626090Z" level=info msg="StartContainer for \"f6d8d0ff97af5c9dcd72a21ec1cf2b0b72869805103a6e8ec8126a724655d866\" returns successfully"
Aug 13 00:35:58.052356 kubelet[2755]: I0813 00:35:58.052255 2755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-m42j7" podStartSLOduration=27.509368199 podStartE2EDuration="40.052204114s" podCreationTimestamp="2025-08-13 00:35:18 +0000 UTC" firstStartedPulling="2025-08-13 00:35:45.293418553 +0000 UTC m=+45.671005411" lastFinishedPulling="2025-08-13 00:35:57.836254469 +0000 UTC m=+58.213841326" observedRunningTime="2025-08-13 00:35:58.047158351 +0000 UTC m=+58.424745239" watchObservedRunningTime="2025-08-13 00:35:58.052204114 +0000 UTC m=+58.429791001"
Aug 13 00:35:58.957909 kubelet[2755]: I0813 00:35:58.954340 2755 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Aug 13 00:35:58.958057 kubelet[2755]: I0813 00:35:58.957938 2755 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Aug 13 00:36:10.278833 containerd[1566]: time="2025-08-13T00:36:10.278684398Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\" id:\"5157985e45379b1fef370c0d274c304772d6bc0a6bb3e782fb1d375e44e2a3a1\" pid:5255 exited_at:{seconds:1755045370 nanos:277402680}"
Aug 13 00:36:24.149453 containerd[1566]: time="2025-08-13T00:36:24.149387617Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"d3f3c999abf05409c200b4ee4d2af81d44e1caf0f2f0cb527d02aea49718480e\" pid:5285 exited_at:{seconds:1755045384 nanos:148665699}"
Aug 13 00:36:25.090425 containerd[1566]: time="2025-08-13T00:36:25.090379491Z" level=info msg="TaskExit event in podsandbox handler container_id:\"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\" id:\"d46f84949e201b7b366774c01d6966a0768f854e012e319829f0321e8aed2768\" pid:5311 exited_at:{seconds:1755045385 nanos:89567957}"
Aug 13 00:36:32.230983 containerd[1566]: time="2025-08-13T00:36:32.230948255Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"774b7d6133a6a5139685f81845bcd466c7026f805ef207ff940a64a03d85dc05\" pid:5335 exited_at:{seconds:1755045392 nanos:230707469}"
Aug 13 00:36:33.512242 containerd[1566]: time="2025-08-13T00:36:33.512199165Z" level=info msg="TaskExit event in podsandbox handler container_id:\"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\" id:\"e5b3246422fd336f3d12ee82b1f9c3d6c868eb8da798ec9c3cd0878474d36778\" pid:5358 exited_at:{seconds:1755045393 nanos:511828198}"
Aug 13 00:36:36.146419 systemd[1]: Started sshd@7-46.62.157.78:22-139.178.89.65:33656.service - OpenSSH per-connection server daemon (139.178.89.65:33656).
Aug 13 00:36:37.176037 sshd[5371]: Accepted publickey for core from 139.178.89.65 port 33656 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:36:37.179248 sshd-session[5371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:37.186912 systemd-logind[1533]: New session 8 of user core.
Aug 13 00:36:37.192904 systemd[1]: Started session-8.scope - Session 8 of User core.
Aug 13 00:36:38.345596 sshd[5375]: Connection closed by 139.178.89.65 port 33656
Aug 13 00:36:38.346258 sshd-session[5371]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:38.353591 systemd-logind[1533]: Session 8 logged out. Waiting for processes to exit.
Aug 13 00:36:38.354281 systemd[1]: sshd@7-46.62.157.78:22-139.178.89.65:33656.service: Deactivated successfully.
Aug 13 00:36:38.357452 systemd[1]: session-8.scope: Deactivated successfully.
Aug 13 00:36:38.360013 systemd-logind[1533]: Removed session 8.
Aug 13 00:36:40.262817 containerd[1566]: time="2025-08-13T00:36:40.262731430Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\" id:\"a2f7156db4d23f509baf04680224a453344ca8833e417f0e4e2e3086778636ef\" pid:5399 exited_at:{seconds:1755045400 nanos:261110508}"
Aug 13 00:36:43.511385 systemd[1]: Started sshd@8-46.62.157.78:22-139.178.89.65:41550.service - OpenSSH per-connection server daemon (139.178.89.65:41550).
Aug 13 00:36:44.549813 sshd[5412]: Accepted publickey for core from 139.178.89.65 port 41550 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:36:44.556356 sshd-session[5412]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:44.562105 systemd-logind[1533]: New session 9 of user core.
Aug 13 00:36:44.567836 systemd[1]: Started session-9.scope - Session 9 of User core.
Aug 13 00:36:45.527286 sshd[5414]: Connection closed by 139.178.89.65 port 41550
Aug 13 00:36:45.529360 sshd-session[5412]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:45.533524 systemd[1]: sshd@8-46.62.157.78:22-139.178.89.65:41550.service: Deactivated successfully.
Aug 13 00:36:45.535122 systemd[1]: session-9.scope: Deactivated successfully.
Aug 13 00:36:45.538276 systemd-logind[1533]: Session 9 logged out. Waiting for processes to exit.
Aug 13 00:36:45.539782 systemd-logind[1533]: Removed session 9.
Aug 13 00:36:45.696918 systemd[1]: Started sshd@9-46.62.157.78:22-139.178.89.65:41556.service - OpenSSH per-connection server daemon (139.178.89.65:41556).
Aug 13 00:36:46.679222 sshd[5427]: Accepted publickey for core from 139.178.89.65 port 41556 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:36:46.680658 sshd-session[5427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:46.686337 systemd-logind[1533]: New session 10 of user core.
Aug 13 00:36:46.690869 systemd[1]: Started session-10.scope - Session 10 of User core.
Aug 13 00:36:47.461935 sshd[5429]: Connection closed by 139.178.89.65 port 41556
Aug 13 00:36:47.462506 sshd-session[5427]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:47.465647 systemd[1]: sshd@9-46.62.157.78:22-139.178.89.65:41556.service: Deactivated successfully.
Aug 13 00:36:47.468549 systemd[1]: session-10.scope: Deactivated successfully.
Aug 13 00:36:47.472010 systemd-logind[1533]: Session 10 logged out. Waiting for processes to exit.
Aug 13 00:36:47.473266 systemd-logind[1533]: Removed session 10.
Aug 13 00:36:47.629848 systemd[1]: Started sshd@10-46.62.157.78:22-139.178.89.65:41558.service - OpenSSH per-connection server daemon (139.178.89.65:41558).
Aug 13 00:36:48.619218 sshd[5439]: Accepted publickey for core from 139.178.89.65 port 41558 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:36:48.621880 sshd-session[5439]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:48.627664 systemd-logind[1533]: New session 11 of user core.
Aug 13 00:36:48.635212 systemd[1]: Started session-11.scope - Session 11 of User core.
Aug 13 00:36:49.436862 sshd[5441]: Connection closed by 139.178.89.65 port 41558
Aug 13 00:36:49.437751 sshd-session[5439]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:49.442379 systemd[1]: sshd@10-46.62.157.78:22-139.178.89.65:41558.service: Deactivated successfully.
Aug 13 00:36:49.444722 systemd[1]: session-11.scope: Deactivated successfully.
Aug 13 00:36:49.445473 systemd-logind[1533]: Session 11 logged out. Waiting for processes to exit.
Aug 13 00:36:49.447432 systemd-logind[1533]: Removed session 11.
Aug 13 00:36:54.017193 systemd[1]: Started sshd@11-46.62.157.78:22-147.182.195.186:44082.service - OpenSSH per-connection server daemon (147.182.195.186:44082).
Aug 13 00:36:54.090595 sshd[5459]: banner exchange: Connection from 147.182.195.186 port 44082: invalid format
Aug 13 00:36:54.091535 systemd[1]: sshd@11-46.62.157.78:22-147.182.195.186:44082.service: Deactivated successfully.
Aug 13 00:36:54.274398 containerd[1566]: time="2025-08-13T00:36:54.274183010Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"15b0a5b3de506a56fbe5c1817c1aa8060f221213a043b7e5d08e271ad0ed2744\" pid:5472 exited_at:{seconds:1755045414 nanos:248638193}"
Aug 13 00:36:54.608601 systemd[1]: Started sshd@12-46.62.157.78:22-139.178.89.65:46590.service - OpenSSH per-connection server daemon (139.178.89.65:46590).
Aug 13 00:36:55.190339 containerd[1566]: time="2025-08-13T00:36:55.190280063Z" level=info msg="TaskExit event in podsandbox handler container_id:\"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\" id:\"6de7e88801c025ceb10b5cc80d7f95cc84de47c43e188693bcfdca78eeca4635\" pid:5500 exited_at:{seconds:1755045415 nanos:190033527}"
Aug 13 00:36:55.628221 sshd[5486]: Accepted publickey for core from 139.178.89.65 port 46590 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:36:55.629582 sshd-session[5486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:36:55.635180 systemd-logind[1533]: New session 12 of user core.
Aug 13 00:36:55.638902 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 13 00:36:56.472092 sshd[5508]: Connection closed by 139.178.89.65 port 46590
Aug 13 00:36:56.474488 sshd-session[5486]: pam_unix(sshd:session): session closed for user core
Aug 13 00:36:56.478990 systemd-logind[1533]: Session 12 logged out. Waiting for processes to exit.
Aug 13 00:36:56.479411 systemd[1]: sshd@12-46.62.157.78:22-139.178.89.65:46590.service: Deactivated successfully.
Aug 13 00:36:56.481114 systemd[1]: session-12.scope: Deactivated successfully.
Aug 13 00:36:56.482551 systemd-logind[1533]: Removed session 12.
Aug 13 00:37:01.642274 systemd[1]: Started sshd@13-46.62.157.78:22-139.178.89.65:48364.service - OpenSSH per-connection server daemon (139.178.89.65:48364).
Aug 13 00:37:02.646351 sshd[5528]: Accepted publickey for core from 139.178.89.65 port 48364 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:37:02.648219 sshd-session[5528]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:37:02.653649 systemd-logind[1533]: New session 13 of user core.
Aug 13 00:37:02.660892 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 13 00:37:03.384326 sshd[5530]: Connection closed by 139.178.89.65 port 48364
Aug 13 00:37:03.384897 sshd-session[5528]: pam_unix(sshd:session): session closed for user core
Aug 13 00:37:03.389013 systemd-logind[1533]: Session 13 logged out. Waiting for processes to exit.
Aug 13 00:37:03.389068 systemd[1]: sshd@13-46.62.157.78:22-139.178.89.65:48364.service: Deactivated successfully.
Aug 13 00:37:03.390814 systemd[1]: session-13.scope: Deactivated successfully.
Aug 13 00:37:03.392191 systemd-logind[1533]: Removed session 13.
Aug 13 00:37:08.558501 systemd[1]: Started sshd@14-46.62.157.78:22-139.178.89.65:48376.service - OpenSSH per-connection server daemon (139.178.89.65:48376).
Aug 13 00:37:09.538968 sshd[5547]: Accepted publickey for core from 139.178.89.65 port 48376 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:37:09.540254 sshd-session[5547]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:37:09.547145 systemd-logind[1533]: New session 14 of user core.
Aug 13 00:37:09.553333 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 13 00:37:10.251104 containerd[1566]: time="2025-08-13T00:37:10.244084557Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\" id:\"cf1dd450ee39dc11ff95deac3148ccb34c53d1a37618ef765d994b97340432da\" pid:5563 exited_at:{seconds:1755045430 nanos:243572494}"
Aug 13 00:37:10.350001 sshd[5549]: Connection closed by 139.178.89.65 port 48376
Aug 13 00:37:10.352729 sshd-session[5547]: pam_unix(sshd:session): session closed for user core
Aug 13 00:37:10.362368 systemd[1]: sshd@14-46.62.157.78:22-139.178.89.65:48376.service: Deactivated successfully.
Aug 13 00:37:10.366273 systemd[1]: session-14.scope: Deactivated successfully.
Aug 13 00:37:10.374606 systemd-logind[1533]: Session 14 logged out. Waiting for processes to exit.
Aug 13 00:37:10.375534 systemd-logind[1533]: Removed session 14.
Aug 13 00:37:10.515238 systemd[1]: Started sshd@15-46.62.157.78:22-139.178.89.65:56208.service - OpenSSH per-connection server daemon (139.178.89.65:56208).
Aug 13 00:37:11.524731 sshd[5585]: Accepted publickey for core from 139.178.89.65 port 56208 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:37:11.526378 sshd-session[5585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:37:11.536187 systemd-logind[1533]: New session 15 of user core.
Aug 13 00:37:11.539892 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 13 00:37:12.436084 sshd[5601]: Connection closed by 139.178.89.65 port 56208
Aug 13 00:37:12.439972 sshd-session[5585]: pam_unix(sshd:session): session closed for user core
Aug 13 00:37:12.447171 systemd[1]: sshd@15-46.62.157.78:22-139.178.89.65:56208.service: Deactivated successfully.
Aug 13 00:37:12.448637 systemd[1]: session-15.scope: Deactivated successfully.
Aug 13 00:37:12.449484 systemd-logind[1533]: Session 15 logged out. Waiting for processes to exit.
Aug 13 00:37:12.451065 systemd-logind[1533]: Removed session 15.
Aug 13 00:37:12.607012 systemd[1]: Started sshd@16-46.62.157.78:22-139.178.89.65:56222.service - OpenSSH per-connection server daemon (139.178.89.65:56222).
Aug 13 00:37:13.601494 sshd[5618]: Accepted publickey for core from 139.178.89.65 port 56222 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:37:13.603459 sshd-session[5618]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:37:13.609201 systemd-logind[1533]: New session 16 of user core.
Aug 13 00:37:13.611982 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 13 00:37:14.868720 sshd[5620]: Connection closed by 139.178.89.65 port 56222
Aug 13 00:37:14.873764 sshd-session[5618]: pam_unix(sshd:session): session closed for user core
Aug 13 00:37:14.882971 systemd[1]: sshd@16-46.62.157.78:22-139.178.89.65:56222.service: Deactivated successfully.
Aug 13 00:37:14.886028 systemd[1]: session-16.scope: Deactivated successfully.
Aug 13 00:37:14.888276 systemd-logind[1533]: Session 16 logged out. Waiting for processes to exit.
Aug 13 00:37:14.890782 systemd-logind[1533]: Removed session 16.
Aug 13 00:37:15.039080 systemd[1]: Started sshd@17-46.62.157.78:22-139.178.89.65:56238.service - OpenSSH per-connection server daemon (139.178.89.65:56238).
Aug 13 00:37:16.049051 sshd[5637]: Accepted publickey for core from 139.178.89.65 port 56238 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:37:16.050366 sshd-session[5637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:37:16.056011 systemd-logind[1533]: New session 17 of user core.
Aug 13 00:37:16.061023 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 13 00:37:17.383465 sshd[5640]: Connection closed by 139.178.89.65 port 56238
Aug 13 00:37:17.383483 sshd-session[5637]: pam_unix(sshd:session): session closed for user core
Aug 13 00:37:17.389949 systemd[1]: sshd@17-46.62.157.78:22-139.178.89.65:56238.service: Deactivated successfully.
Aug 13 00:37:17.393413 systemd[1]: session-17.scope: Deactivated successfully.
Aug 13 00:37:17.394399 systemd-logind[1533]: Session 17 logged out. Waiting for processes to exit.
Aug 13 00:37:17.396257 systemd-logind[1533]: Removed session 17.
Aug 13 00:37:17.552864 systemd[1]: Started sshd@18-46.62.157.78:22-139.178.89.65:56254.service - OpenSSH per-connection server daemon (139.178.89.65:56254).
Aug 13 00:37:18.567391 sshd[5651]: Accepted publickey for core from 139.178.89.65 port 56254 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:37:18.570876 sshd-session[5651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:37:18.580764 systemd-logind[1533]: New session 18 of user core.
Aug 13 00:37:18.586038 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 13 00:37:19.324854 sshd[5653]: Connection closed by 139.178.89.65 port 56254
Aug 13 00:37:19.325885 sshd-session[5651]: pam_unix(sshd:session): session closed for user core
Aug 13 00:37:19.329072 systemd[1]: sshd@18-46.62.157.78:22-139.178.89.65:56254.service: Deactivated successfully.
Aug 13 00:37:19.332002 systemd[1]: session-18.scope: Deactivated successfully.
Aug 13 00:37:19.335261 systemd-logind[1533]: Session 18 logged out. Waiting for processes to exit.
Aug 13 00:37:19.337011 systemd-logind[1533]: Removed session 18.
Aug 13 00:37:24.428414 containerd[1566]: time="2025-08-13T00:37:24.427157829Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"8f9a016c8e969e127284c21f0783c4ed5ae57fc1a220f9ad270986b9b55040da\" pid:5681 exited_at:{seconds:1755045444 nanos:418040574}"
Aug 13 00:37:24.495319 systemd[1]: Started sshd@19-46.62.157.78:22-139.178.89.65:43718.service - OpenSSH per-connection server daemon (139.178.89.65:43718).
Aug 13 00:37:25.103572 containerd[1566]: time="2025-08-13T00:37:25.103521010Z" level=info msg="TaskExit event in podsandbox handler container_id:\"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\" id:\"ab89553f639cab4fad8ef460c3b891b6936e7ed89f94c5c78bb64859071d2c31\" pid:5705 exited_at:{seconds:1755045445 nanos:102608979}"
Aug 13 00:37:25.534823 sshd[5691]: Accepted publickey for core from 139.178.89.65 port 43718 ssh2: RSA SHA256:1H+nw3+OvquQwvofT9eHeWTIjHLC1/XQgTD8TxM6A/E
Aug 13 00:37:25.536957 sshd-session[5691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 13 00:37:25.543740 systemd-logind[1533]: New session 19 of user core.
Aug 13 00:37:25.550888 systemd[1]: Started session-19.scope - Session 19 of User core.
Aug 13 00:37:26.923641 sshd[5715]: Connection closed by 139.178.89.65 port 43718
Aug 13 00:37:26.927602 sshd-session[5691]: pam_unix(sshd:session): session closed for user core
Aug 13 00:37:26.936079 systemd[1]: sshd@19-46.62.157.78:22-139.178.89.65:43718.service: Deactivated successfully.
Aug 13 00:37:26.941382 systemd[1]: session-19.scope: Deactivated successfully.
Aug 13 00:37:26.943257 systemd-logind[1533]: Session 19 logged out. Waiting for processes to exit.
Aug 13 00:37:26.944921 systemd-logind[1533]: Removed session 19.
Aug 13 00:37:32.368757 containerd[1566]: time="2025-08-13T00:37:32.368653004Z" level=info msg="TaskExit event in podsandbox handler container_id:\"56d12797a748bb1a986012ab1f49af30791c1956292486f5b484a3acda6f6fb9\" id:\"5740933b0c5e5420d27d2ec6882f09a8f3d98b5d059510b2f524cde0f5f9b62f\" pid:5740 exited_at:{seconds:1755045452 nanos:368217252}"
Aug 13 00:37:33.508055 containerd[1566]: time="2025-08-13T00:37:33.508018043Z" level=info msg="TaskExit event in podsandbox handler container_id:\"528721e6df5ae35426fe54163a2581e5a6f4b231a459ebae411bff7f8b2b9ece\" id:\"967bb3e5321a3a69234186cdc106f3b132cd19002e9efb3fe765384be400e519\" pid:5764 exited_at:{seconds:1755045453 nanos:507554578}"
Aug 13 00:37:40.190036 containerd[1566]: time="2025-08-13T00:37:40.189931057Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d850cb77cc2e92ab01511a3d35671e3b898ff4bbb69e0b77351df33b829565\" id:\"21b43d35ce8a9d9b2e52531ac4e08986a497348d89ec6bcb8e91e10dc3fe567e\" pid:5788 exited_at:{seconds:1755045460 nanos:189461769}"
Aug 13 00:37:42.642400 systemd[1]: cri-containerd-4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992.scope: Deactivated successfully.
Aug 13 00:37:42.643053 systemd[1]: cri-containerd-4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992.scope: Consumed 3.874s CPU time, 84.7M memory peak, 109.8M read from disk.
Aug 13 00:37:42.722953 containerd[1566]: time="2025-08-13T00:37:42.722907758Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992\" id:\"4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992\" pid:2589 exit_status:1 exited_at:{seconds:1755045462 nanos:704837677}"
Aug 13 00:37:42.729429 containerd[1566]: time="2025-08-13T00:37:42.729377243Z" level=info msg="received exit event container_id:\"4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992\" id:\"4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992\" pid:2589 exit_status:1 exited_at:{seconds:1755045462 nanos:704837677}"
Aug 13 00:37:42.818019 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992-rootfs.mount: Deactivated successfully.
Aug 13 00:37:43.121177 kubelet[2755]: E0813 00:37:43.121125 2755 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40914->10.0.0.2:2379: read: connection timed out"
Aug 13 00:37:43.606126 kubelet[2755]: I0813 00:37:43.601954 2755 scope.go:117] "RemoveContainer" containerID="4ab6297dd59f6c49f1a77074912a03e8f5b7e924a845fd9c82139c6e841e0992"
Aug 13 00:37:43.621059 systemd[1]: cri-containerd-a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7.scope: Deactivated successfully.
Aug 13 00:37:43.621354 systemd[1]: cri-containerd-a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7.scope: Consumed 13.921s CPU time, 124.9M memory peak, 78.8M read from disk.
Aug 13 00:37:43.628216 containerd[1566]: time="2025-08-13T00:37:43.628182294Z" level=info msg="received exit event container_id:\"a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7\" id:\"a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7\" pid:3103 exit_status:1 exited_at:{seconds:1755045463 nanos:626938223}"
Aug 13 00:37:43.628506 containerd[1566]: time="2025-08-13T00:37:43.628354073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7\" id:\"a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7\" pid:3103 exit_status:1 exited_at:{seconds:1755045463 nanos:626938223}"
Aug 13 00:37:43.663903 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7-rootfs.mount: Deactivated successfully.
Aug 13 00:37:43.685257 containerd[1566]: time="2025-08-13T00:37:43.685223351Z" level=info msg="CreateContainer within sandbox \"1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Aug 13 00:37:43.789245 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1770001718.mount: Deactivated successfully.
Aug 13 00:37:43.793991 containerd[1566]: time="2025-08-13T00:37:43.793945423Z" level=info msg="Container 2f77cac824438804206dfd5f022ca1f209ba822ba9edd54a2a465fbf65f7f8d0: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:37:43.808746 containerd[1566]: time="2025-08-13T00:37:43.806905496Z" level=info msg="CreateContainer within sandbox \"1f894da901fa8db159c519924efae66b9b8d971788c99a4d11443bd2659a722c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"2f77cac824438804206dfd5f022ca1f209ba822ba9edd54a2a465fbf65f7f8d0\""
Aug 13 00:37:43.810242 containerd[1566]: time="2025-08-13T00:37:43.810194112Z" level=info msg="StartContainer for \"2f77cac824438804206dfd5f022ca1f209ba822ba9edd54a2a465fbf65f7f8d0\""
Aug 13 00:37:43.811120 containerd[1566]: time="2025-08-13T00:37:43.811085749Z" level=info msg="connecting to shim 2f77cac824438804206dfd5f022ca1f209ba822ba9edd54a2a465fbf65f7f8d0" address="unix:///run/containerd/s/dbb53f5b91826f42a329c612249a4e4aa43461556d910cf9e2de4b869beed3ea" protocol=ttrpc version=3
Aug 13 00:37:43.866904 systemd[1]: Started cri-containerd-2f77cac824438804206dfd5f022ca1f209ba822ba9edd54a2a465fbf65f7f8d0.scope - libcontainer container 2f77cac824438804206dfd5f022ca1f209ba822ba9edd54a2a465fbf65f7f8d0.
Aug 13 00:37:43.934732 containerd[1566]: time="2025-08-13T00:37:43.934665476Z" level=info msg="StartContainer for \"2f77cac824438804206dfd5f022ca1f209ba822ba9edd54a2a465fbf65f7f8d0\" returns successfully"
Aug 13 00:37:44.591457 kubelet[2755]: I0813 00:37:44.591389 2755 scope.go:117] "RemoveContainer" containerID="a1ea8c4007047c27c0328391a21aa376a4240f62f1919b6b806e361e6aa4a9a7"
Aug 13 00:37:44.599454 containerd[1566]: time="2025-08-13T00:37:44.598935469Z" level=info msg="CreateContainer within sandbox \"579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Aug 13 00:37:44.610236 containerd[1566]: time="2025-08-13T00:37:44.610202033Z" level=info msg="Container d4acc0bb78b4ed8ee039f5802d36c1c7c9d9a024030958c17d5aa8d3ec915bf6: CDI devices from CRI Config.CDIDevices: []"
Aug 13 00:37:44.623633 containerd[1566]: time="2025-08-13T00:37:44.623572173Z" level=info msg="CreateContainer within sandbox \"579a5f4d58f286c13d0754cd8cd47fff5b94c28bf10e617bfc659d41180903d8\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"d4acc0bb78b4ed8ee039f5802d36c1c7c9d9a024030958c17d5aa8d3ec915bf6\""
Aug 13 00:37:44.627604 containerd[1566]: time="2025-08-13T00:37:44.627564397Z" level=info msg="StartContainer for \"d4acc0bb78b4ed8ee039f5802d36c1c7c9d9a024030958c17d5aa8d3ec915bf6\""
Aug 13 00:37:44.634537 containerd[1566]: time="2025-08-13T00:37:44.634363155Z" level=info msg="connecting to shim d4acc0bb78b4ed8ee039f5802d36c1c7c9d9a024030958c17d5aa8d3ec915bf6" address="unix:///run/containerd/s/8ecf781fceb1f87cf0735f04c73eba9ec24eb0ca56743392b8c9070afe3ca95a" protocol=ttrpc version=3
Aug 13 00:37:44.669941 systemd[1]: Started cri-containerd-d4acc0bb78b4ed8ee039f5802d36c1c7c9d9a024030958c17d5aa8d3ec915bf6.scope - libcontainer container d4acc0bb78b4ed8ee039f5802d36c1c7c9d9a024030958c17d5aa8d3ec915bf6.
Aug 13 00:37:44.710190 containerd[1566]: time="2025-08-13T00:37:44.710143086Z" level=info msg="StartContainer for \"d4acc0bb78b4ed8ee039f5802d36c1c7c9d9a024030958c17d5aa8d3ec915bf6\" returns successfully"
Aug 13 00:37:46.874633 kubelet[2755]: E0813 00:37:46.840564 2755 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40742->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4372-1-0-b-5ba4a9a74b.185b2c8b3cf6aa38 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4372-1-0-b-5ba4a9a74b,UID:fef85c3d0fdc44ae8154ece24a43b583,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-b-5ba4a9a74b,},FirstTimestamp:2025-08-13 00:37:36.32581484 +0000 UTC m=+156.703401726,LastTimestamp:2025-08-13 00:37:36.32581484 +0000 UTC m=+156.703401726,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-b-5ba4a9a74b,}"