Jul 15 05:13:41.808301 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Jul 15 03:28:48 -00 2025 Jul 15 05:13:41.808321 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb Jul 15 05:13:41.808328 kernel: BIOS-provided physical RAM map: Jul 15 05:13:41.808334 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Jul 15 05:13:41.808338 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Jul 15 05:13:41.808343 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Jul 15 05:13:41.808350 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable Jul 15 05:13:41.808355 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved Jul 15 05:13:41.808360 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Jul 15 05:13:41.808364 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved Jul 15 05:13:41.808369 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Jul 15 05:13:41.808386 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Jul 15 05:13:41.808391 kernel: NX (Execute Disable) protection: active Jul 15 05:13:41.808396 kernel: APIC: Static calls initialized Jul 15 05:13:41.808403 kernel: SMBIOS 2.8 present. Jul 15 05:13:41.808409 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017 Jul 15 05:13:41.808414 kernel: DMI: Memory slots populated: 1/1 Jul 15 05:13:41.808419 kernel: Hypervisor detected: KVM Jul 15 05:13:41.808424 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Jul 15 05:13:41.808429 kernel: kvm-clock: using sched offset of 3938850233 cycles Jul 15 05:13:41.808434 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Jul 15 05:13:41.808439 kernel: tsc: Detected 2399.998 MHz processor Jul 15 05:13:41.808446 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Jul 15 05:13:41.808452 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Jul 15 05:13:41.808457 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000 Jul 15 05:13:41.808462 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs Jul 15 05:13:41.808467 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Jul 15 05:13:41.808472 kernel: Using GB pages for direct mapping Jul 15 05:13:41.808477 kernel: ACPI: Early table checksum verification disabled Jul 15 05:13:41.808482 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS ) Jul 15 05:13:41.808488 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 15 05:13:41.808495 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jul 15 05:13:41.808500 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 15 05:13:41.808505 kernel: ACPI: FACS 0x000000007CFE0000 000040 Jul 15 05:13:41.808510 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jul 15 05:13:41.808515 kernel: ACPI: HPET 0x000000007CFE25F7 
000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 15 05:13:41.808520 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 15 05:13:41.808525 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 15 05:13:41.808530 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576] Jul 15 05:13:41.808535 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482] Jul 15 05:13:41.808544 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f] Jul 15 05:13:41.808549 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6] Jul 15 05:13:41.808555 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e] Jul 15 05:13:41.808560 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a] Jul 15 05:13:41.808565 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692] Jul 15 05:13:41.808572 kernel: No NUMA configuration found Jul 15 05:13:41.808578 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff] Jul 15 05:13:41.808583 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff] Jul 15 05:13:41.808588 kernel: Zone ranges: Jul 15 05:13:41.808593 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Jul 15 05:13:41.808599 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff] Jul 15 05:13:41.808604 kernel: Normal empty Jul 15 05:13:41.808609 kernel: Device empty Jul 15 05:13:41.808614 kernel: Movable zone start for each node Jul 15 05:13:41.808619 kernel: Early memory node ranges Jul 15 05:13:41.808627 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Jul 15 05:13:41.808633 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff] Jul 15 05:13:41.808638 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff] Jul 15 05:13:41.808643 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Jul 15 05:13:41.808649 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Jul 15 05:13:41.808654 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges Jul 15 05:13:41.808659 kernel: ACPI: PM-Timer IO Port: 0x608 Jul 15 05:13:41.808664 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Jul 15 05:13:41.808670 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Jul 15 05:13:41.808677 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Jul 15 05:13:41.808682 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Jul 15 05:13:41.808688 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Jul 15 05:13:41.808695 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Jul 15 05:13:41.808703 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Jul 15 05:13:41.808711 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jul 15 05:13:41.808718 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Jul 15 05:13:41.808726 kernel: CPU topo: Max. logical packages: 1 Jul 15 05:13:41.808733 kernel: CPU topo: Max. logical dies: 1 Jul 15 05:13:41.808742 kernel: CPU topo: Max. dies per package: 1 Jul 15 05:13:41.808751 kernel: CPU topo: Max. threads per core: 1 Jul 15 05:13:41.808757 kernel: CPU topo: Num. cores per package: 2 Jul 15 05:13:41.808762 kernel: CPU topo: Num. 
threads per package: 2 Jul 15 05:13:41.808767 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs Jul 15 05:13:41.808775 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() Jul 15 05:13:41.808783 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices Jul 15 05:13:41.808790 kernel: Booting paravirtualized kernel on KVM Jul 15 05:13:41.808798 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jul 15 05:13:41.808805 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 Jul 15 05:13:41.808816 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 Jul 15 05:13:41.808822 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 Jul 15 05:13:41.808827 kernel: pcpu-alloc: [0] 0 1 Jul 15 05:13:41.808833 kernel: kvm-guest: PV spinlocks disabled, no host support Jul 15 05:13:41.808839 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb Jul 15 05:13:41.808844 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jul 15 05:13:41.808849 kernel: random: crng init done Jul 15 05:13:41.808855 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jul 15 05:13:41.808862 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jul 15 05:13:41.808867 kernel: Fallback order for Node 0: 0 Jul 15 05:13:41.808873 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866 Jul 15 05:13:41.808878 kernel: Policy zone: DMA32 Jul 15 05:13:41.808883 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jul 15 05:13:41.808888 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jul 15 05:13:41.808894 kernel: ftrace: allocating 40097 entries in 157 pages Jul 15 05:13:41.808899 kernel: ftrace: allocated 157 pages with 5 groups Jul 15 05:13:41.808904 kernel: Dynamic Preempt: voluntary Jul 15 05:13:41.808913 kernel: rcu: Preemptible hierarchical RCU implementation. Jul 15 05:13:41.808922 kernel: rcu: RCU event tracing is enabled. Jul 15 05:13:41.808930 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jul 15 05:13:41.808937 kernel: Trampoline variant of Tasks RCU enabled. Jul 15 05:13:41.808945 kernel: Rude variant of Tasks RCU enabled. Jul 15 05:13:41.808953 kernel: Tracing variant of Tasks RCU enabled. Jul 15 05:13:41.808961 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jul 15 05:13:41.808966 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jul 15 05:13:41.808971 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 15 05:13:41.808980 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 15 05:13:41.808988 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 15 05:13:41.808996 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Jul 15 05:13:41.809003 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. 
Jul 15 05:13:41.809010 kernel: Console: colour VGA+ 80x25 Jul 15 05:13:41.809017 kernel: printk: legacy console [tty0] enabled Jul 15 05:13:41.809024 kernel: printk: legacy console [ttyS0] enabled Jul 15 05:13:41.809031 kernel: ACPI: Core revision 20240827 Jul 15 05:13:41.809039 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Jul 15 05:13:41.809055 kernel: APIC: Switch to symmetric I/O mode setup Jul 15 05:13:41.809064 kernel: x2apic enabled Jul 15 05:13:41.809072 kernel: APIC: Switched APIC routing to: physical x2apic Jul 15 05:13:41.809082 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jul 15 05:13:41.809090 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jul 15 05:13:41.809098 kernel: Calibrating delay loop (skipped) preset value.. 4799.99 BogoMIPS (lpj=2399998) Jul 15 05:13:41.809106 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Jul 15 05:13:41.809123 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Jul 15 05:13:41.809131 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Jul 15 05:13:41.809141 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jul 15 05:13:41.809149 kernel: Spectre V2 : Mitigation: Retpolines Jul 15 05:13:41.809156 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT Jul 15 05:13:41.809164 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Jul 15 05:13:41.809172 kernel: RETBleed: Mitigation: untrained return thunk Jul 15 05:13:41.809179 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jul 15 05:13:41.809187 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jul 15 05:13:41.809198 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! Jul 15 05:13:41.809207 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. Jul 15 05:13:41.809216 kernel: x86/bugs: return thunk changed Jul 15 05:13:41.809224 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode Jul 15 05:13:41.809231 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jul 15 05:13:41.809239 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jul 15 05:13:41.809246 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jul 15 05:13:41.809255 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jul 15 05:13:41.809263 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Jul 15 05:13:41.809273 kernel: Freeing SMP alternatives memory: 32K Jul 15 05:13:41.809281 kernel: pid_max: default: 32768 minimum: 301 Jul 15 05:13:41.809289 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jul 15 05:13:41.809297 kernel: landlock: Up and running. Jul 15 05:13:41.809304 kernel: SELinux: Initializing. Jul 15 05:13:41.809312 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jul 15 05:13:41.809319 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jul 15 05:13:41.809328 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0) Jul 15 05:13:41.809335 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Jul 15 05:13:41.809345 kernel: ... 
version: 0 Jul 15 05:13:41.809353 kernel: ... bit width: 48 Jul 15 05:13:41.809360 kernel: ... generic registers: 6 Jul 15 05:13:41.809368 kernel: ... value mask: 0000ffffffffffff Jul 15 05:13:41.809679 kernel: ... max period: 00007fffffffffff Jul 15 05:13:41.809689 kernel: ... fixed-purpose events: 0 Jul 15 05:13:41.809697 kernel: ... event mask: 000000000000003f Jul 15 05:13:41.809705 kernel: signal: max sigframe size: 1776 Jul 15 05:13:41.809712 kernel: rcu: Hierarchical SRCU implementation. Jul 15 05:13:41.809724 kernel: rcu: Max phase no-delay instances is 400. Jul 15 05:13:41.809732 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jul 15 05:13:41.809740 kernel: smp: Bringing up secondary CPUs ... Jul 15 05:13:41.809749 kernel: smpboot: x86: Booting SMP configuration: Jul 15 05:13:41.809757 kernel: .... node #0, CPUs: #1 Jul 15 05:13:41.809765 kernel: smp: Brought up 1 node, 2 CPUs Jul 15 05:13:41.809773 kernel: smpboot: Total of 2 processors activated (9599.99 BogoMIPS) Jul 15 05:13:41.809781 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54608K init, 2360K bss, 125140K reserved, 0K cma-reserved) Jul 15 05:13:41.809790 kernel: devtmpfs: initialized Jul 15 05:13:41.809799 kernel: x86/mm: Memory block size: 128MB Jul 15 05:13:41.809807 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jul 15 05:13:41.809814 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jul 15 05:13:41.809821 kernel: pinctrl core: initialized pinctrl subsystem Jul 15 05:13:41.809829 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jul 15 05:13:41.809836 kernel: audit: initializing netlink subsys (disabled) Jul 15 05:13:41.809844 kernel: audit: type=2000 audit(1752556419.877:1): state=initialized audit_enabled=0 res=1 Jul 15 05:13:41.809852 kernel: thermal_sys: Registered thermal governor 'step_wise' Jul 15 05:13:41.809859 kernel: thermal_sys: Registered thermal governor 'user_space' Jul 15 05:13:41.809868 kernel: cpuidle: using governor menu Jul 15 05:13:41.809877 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jul 15 05:13:41.809884 kernel: dca service started, version 1.12.1 Jul 15 05:13:41.809892 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] Jul 15 05:13:41.809899 kernel: PCI: Using configuration type 1 for base access Jul 15 05:13:41.809906 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Jul 15 05:13:41.809913 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jul 15 05:13:41.809921 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jul 15 05:13:41.809929 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jul 15 05:13:41.809939 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jul 15 05:13:41.809947 kernel: ACPI: Added _OSI(Module Device) Jul 15 05:13:41.809954 kernel: ACPI: Added _OSI(Processor Device) Jul 15 05:13:41.809963 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jul 15 05:13:41.809970 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jul 15 05:13:41.809978 kernel: ACPI: Interpreter enabled Jul 15 05:13:41.809986 kernel: ACPI: PM: (supports S0 S5) Jul 15 05:13:41.809994 kernel: ACPI: Using IOAPIC for interrupt routing Jul 15 05:13:41.810001 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jul 15 05:13:41.810011 kernel: PCI: Using E820 reservations for host bridge windows Jul 15 05:13:41.810020 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Jul 15 05:13:41.810028 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jul 15 05:13:41.810245 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jul 15 05:13:41.810365 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Jul 15 05:13:41.810526 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Jul 15 05:13:41.810538 kernel: PCI host bridge to bus 0000:00 Jul 15 05:13:41.810651 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jul 15 05:13:41.810757 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Jul 15 05:13:41.810869 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jul 15 05:13:41.810987 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window] Jul 15 05:13:41.811091 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jul 15 05:13:41.811213 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] Jul 15 05:13:41.811325 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jul 15 05:13:41.811496 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint Jul 15 05:13:41.811642 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint Jul 15 05:13:41.811762 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref] Jul 15 05:13:41.811887 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref] Jul 15 05:13:41.812019 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff] Jul 15 05:13:41.812153 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref] Jul 15 05:13:41.812293 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jul 15 05:13:41.812462 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.812574 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff] Jul 15 05:13:41.812686 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jul 15 05:13:41.812795 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] Jul 15 05:13:41.812901 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] Jul 15 05:13:41.813018 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.813151 kernel: pci 0000:00:02.1: BAR 0 [mem 
0xfea12000-0xfea12fff] Jul 15 05:13:41.813274 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jul 15 05:13:41.813420 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] Jul 15 05:13:41.813542 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Jul 15 05:13:41.813665 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.813781 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff] Jul 15 05:13:41.813892 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jul 15 05:13:41.814009 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] Jul 15 05:13:41.814137 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Jul 15 05:13:41.814272 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.814397 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff] Jul 15 05:13:41.814515 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jul 15 05:13:41.814626 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] Jul 15 05:13:41.814735 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Jul 15 05:13:41.814865 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.814978 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff] Jul 15 05:13:41.815089 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jul 15 05:13:41.815229 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff] Jul 15 05:13:41.815332 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Jul 15 05:13:41.815477 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.815591 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff] Jul 15 05:13:41.815704 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jul 15 05:13:41.815814 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] Jul 15 05:13:41.815927 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Jul 15 05:13:41.816047 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.816173 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff] Jul 15 05:13:41.816288 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jul 15 05:13:41.816418 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] Jul 15 05:13:41.816536 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Jul 15 05:13:41.816661 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.816774 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff] Jul 15 05:13:41.816887 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jul 15 05:13:41.816999 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] Jul 15 05:13:41.817121 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Jul 15 05:13:41.817245 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 15 05:13:41.817362 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff] Jul 15 05:13:41.817501 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jul 15 05:13:41.817612 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] Jul 15 05:13:41.817726 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Jul 15 05:13:41.817846 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint Jul 15 05:13:41.817958 
kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Jul 15 05:13:41.818078 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint Jul 15 05:13:41.818210 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f] Jul 15 05:13:41.818323 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff] Jul 15 05:13:41.818461 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint Jul 15 05:13:41.818574 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] Jul 15 05:13:41.818715 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jul 15 05:13:41.818833 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff] Jul 15 05:13:41.818951 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] Jul 15 05:13:41.819069 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref] Jul 15 05:13:41.819202 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jul 15 05:13:41.819335 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jul 15 05:13:41.819482 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit] Jul 15 05:13:41.819600 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jul 15 05:13:41.819730 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Jul 15 05:13:41.819851 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff] Jul 15 05:13:41.819974 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref] Jul 15 05:13:41.820089 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jul 15 05:13:41.820224 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jul 15 05:13:41.820326 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] Jul 15 05:13:41.820455 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jul 15 05:13:41.820585 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jul 15 05:13:41.820709 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref] Jul 15 05:13:41.820822 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jul 15 05:13:41.820954 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Jul 15 05:13:41.821074 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff] Jul 15 05:13:41.821216 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref] Jul 15 05:13:41.821327 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jul 15 05:13:41.821340 kernel: acpiphp: Slot [0] registered Jul 15 05:13:41.821499 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jul 15 05:13:41.821616 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff] Jul 15 05:13:41.821716 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref] Jul 15 05:13:41.821835 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref] Jul 15 05:13:41.821955 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jul 15 05:13:41.821968 kernel: acpiphp: Slot [0-2] registered Jul 15 05:13:41.822097 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jul 15 05:13:41.822108 kernel: acpiphp: Slot [0-3] registered Jul 15 05:13:41.822222 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jul 15 05:13:41.822233 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Jul 15 05:13:41.822242 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Jul 15 05:13:41.822251 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jul 15 05:13:41.822259 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Jul 15 
05:13:41.822268 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Jul 15 05:13:41.822276 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Jul 15 05:13:41.822290 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Jul 15 05:13:41.822296 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Jul 15 05:13:41.822302 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Jul 15 05:13:41.822308 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Jul 15 05:13:41.822314 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Jul 15 05:13:41.822319 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Jul 15 05:13:41.822325 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Jul 15 05:13:41.822331 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Jul 15 05:13:41.822336 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Jul 15 05:13:41.822344 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Jul 15 05:13:41.822350 kernel: iommu: Default domain type: Translated Jul 15 05:13:41.822356 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jul 15 05:13:41.822361 kernel: PCI: Using ACPI for IRQ routing Jul 15 05:13:41.822367 kernel: PCI: pci_cache_line_size set to 64 bytes Jul 15 05:13:41.822389 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Jul 15 05:13:41.822395 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff] Jul 15 05:13:41.822496 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Jul 15 05:13:41.822588 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Jul 15 05:13:41.822675 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jul 15 05:13:41.822682 kernel: vgaarb: loaded Jul 15 05:13:41.822688 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Jul 15 05:13:41.822694 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Jul 15 05:13:41.822702 kernel: clocksource: Switched to clocksource kvm-clock Jul 15 05:13:41.822710 kernel: VFS: Disk quotas dquot_6.6.0 Jul 15 05:13:41.822719 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 15 05:13:41.822728 kernel: pnp: PnP ACPI init Jul 15 05:13:41.822849 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved Jul 15 05:13:41.822859 kernel: pnp: PnP ACPI: found 5 devices Jul 15 05:13:41.822866 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jul 15 05:13:41.822871 kernel: NET: Registered PF_INET protocol family Jul 15 05:13:41.822877 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 15 05:13:41.822883 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jul 15 05:13:41.822889 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 15 05:13:41.822895 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jul 15 05:13:41.822903 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jul 15 05:13:41.822909 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jul 15 05:13:41.822915 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jul 15 05:13:41.822920 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jul 15 05:13:41.822926 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 15 05:13:41.822932 kernel: NET: Registered PF_XDP protocol family Jul 15 05:13:41.823021 kernel: pci 
0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jul 15 05:13:41.823124 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jul 15 05:13:41.823213 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jul 15 05:13:41.823303 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned Jul 15 05:13:41.823427 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned Jul 15 05:13:41.823518 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned Jul 15 05:13:41.823605 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jul 15 05:13:41.823691 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] Jul 15 05:13:41.823778 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] Jul 15 05:13:41.823865 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jul 15 05:13:41.823951 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] Jul 15 05:13:41.824038 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] Jul 15 05:13:41.824136 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jul 15 05:13:41.824223 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] Jul 15 05:13:41.824309 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] Jul 15 05:13:41.824408 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jul 15 05:13:41.824496 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] Jul 15 05:13:41.824582 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] Jul 15 05:13:41.824669 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jul 15 05:13:41.824755 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff] Jul 15 05:13:41.824845 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] Jul 15 05:13:41.824932 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jul 15 05:13:41.825025 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] Jul 15 05:13:41.825118 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] Jul 15 05:13:41.825206 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jul 15 05:13:41.825293 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] Jul 15 05:13:41.825397 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] Jul 15 05:13:41.825512 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] Jul 15 05:13:41.825625 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jul 15 05:13:41.825746 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] Jul 15 05:13:41.825864 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] Jul 15 05:13:41.825959 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] Jul 15 05:13:41.826057 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jul 15 05:13:41.826189 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] Jul 15 05:13:41.826303 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] Jul 15 05:13:41.826409 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] Jul 15 05:13:41.826498 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Jul 15 05:13:41.826592 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Jul 15 05:13:41.826701 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Jul 15 05:13:41.826804 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window] 
Jul 15 05:13:41.826905 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Jul 15 05:13:41.827011 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] Jul 15 05:13:41.827139 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff] Jul 15 05:13:41.827253 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref] Jul 15 05:13:41.827367 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff] Jul 15 05:13:41.827501 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Jul 15 05:13:41.827602 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff] Jul 15 05:13:41.827698 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Jul 15 05:13:41.827804 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff] Jul 15 05:13:41.827897 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Jul 15 05:13:41.827996 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff] Jul 15 05:13:41.828088 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Jul 15 05:13:41.828204 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff] Jul 15 05:13:41.828298 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Jul 15 05:13:41.828421 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Jul 15 05:13:41.828514 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff] Jul 15 05:13:41.828605 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Jul 15 05:13:41.828706 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Jul 15 05:13:41.828798 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff] Jul 15 05:13:41.828890 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Jul 15 05:13:41.828997 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Jul 15 05:13:41.829088 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff] Jul 15 05:13:41.829193 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Jul 15 05:13:41.829205 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jul 15 05:13:41.829214 kernel: PCI: CLS 0 bytes, default 64 Jul 15 05:13:41.829222 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x229835b7123, max_idle_ns: 440795242976 ns Jul 15 05:13:41.829229 kernel: Initialise system trusted keyrings Jul 15 05:13:41.829238 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jul 15 05:13:41.829248 kernel: Key type asymmetric registered Jul 15 05:13:41.829256 kernel: Asymmetric key parser 'x509' registered Jul 15 05:13:41.829264 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jul 15 05:13:41.829271 kernel: io scheduler mq-deadline registered Jul 15 05:13:41.829279 kernel: io scheduler kyber registered Jul 15 05:13:41.829287 kernel: io scheduler bfq registered Jul 15 05:13:41.829419 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jul 15 05:13:41.829519 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jul 15 05:13:41.829617 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jul 15 05:13:41.829716 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jul 15 05:13:41.829814 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jul 15 05:13:41.829910 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jul 15 05:13:41.830009 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jul 15 05:13:41.830107 kernel: pcieport 0000:00:02.3: AER: enabled with 
IRQ 27 Jul 15 05:13:41.830223 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jul 15 05:13:41.830337 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jul 15 05:13:41.830470 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jul 15 05:13:41.830588 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jul 15 05:13:41.830697 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 Jul 15 05:13:41.830805 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jul 15 05:13:41.830917 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jul 15 05:13:41.831035 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jul 15 05:13:41.831046 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jul 15 05:13:41.831177 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jul 15 05:13:41.831293 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jul 15 05:13:41.831305 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jul 15 05:13:41.831314 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jul 15 05:13:41.831323 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 15 05:13:41.831331 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jul 15 05:13:41.831340 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jul 15 05:13:41.831348 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jul 15 05:13:41.831356 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jul 15 05:13:41.831367 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jul 15 05:13:41.831518 kernel: rtc_cmos 00:03: RTC can wake from S4 Jul 15 05:13:41.831629 kernel: rtc_cmos 00:03: registered as rtc0 Jul 15 05:13:41.831773 kernel: rtc_cmos 00:03: setting system clock to 2025-07-15T05:13:41 UTC (1752556421) Jul 15 05:13:41.831882 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jul 15 05:13:41.831893 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jul 15 05:13:41.831902 kernel: NET: Registered PF_INET6 protocol family Jul 15 05:13:41.831916 kernel: Segment Routing with IPv6 Jul 15 05:13:41.831925 kernel: In-situ OAM (IOAM) with IPv6 Jul 15 05:13:41.831934 kernel: NET: Registered PF_PACKET protocol family Jul 15 05:13:41.831942 kernel: Key type dns_resolver registered Jul 15 05:13:41.831951 kernel: IPI shorthand broadcast: enabled Jul 15 05:13:41.831960 kernel: sched_clock: Marking stable (2699005066, 132389467)->(2836948112, -5553579) Jul 15 05:13:41.831969 kernel: registered taskstats version 1 Jul 15 05:13:41.831978 kernel: Loading compiled-in X.509 certificates Jul 15 05:13:41.831986 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: a24478b628e55368911ce1800a2bd6bc158938c7' Jul 15 05:13:41.831997 kernel: Demotion targets for Node 0: null Jul 15 05:13:41.832006 kernel: Key type .fscrypt registered Jul 15 05:13:41.832015 kernel: Key type fscrypt-provisioning registered Jul 15 05:13:41.832024 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 15 05:13:41.832032 kernel: ima: Allocated hash algorithm: sha1 Jul 15 05:13:41.832041 kernel: ima: No architecture policies found Jul 15 05:13:41.832050 kernel: clk: Disabling unused clocks Jul 15 05:13:41.832059 kernel: Warning: unable to open an initial console. 
Jul 15 05:13:41.832068 kernel: Freeing unused kernel image (initmem) memory: 54608K Jul 15 05:13:41.832079 kernel: Write protecting the kernel read-only data: 24576k Jul 15 05:13:41.832088 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jul 15 05:13:41.832097 kernel: Run /init as init process Jul 15 05:13:41.832105 kernel: with arguments: Jul 15 05:13:41.832126 kernel: /init Jul 15 05:13:41.832134 kernel: with environment: Jul 15 05:13:41.832143 kernel: HOME=/ Jul 15 05:13:41.832151 kernel: TERM=linux Jul 15 05:13:41.832160 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 15 05:13:41.832170 systemd[1]: Successfully made /usr/ read-only. Jul 15 05:13:41.832184 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 15 05:13:41.832194 systemd[1]: Detected virtualization kvm. Jul 15 05:13:41.832204 systemd[1]: Detected architecture x86-64. Jul 15 05:13:41.832213 systemd[1]: Running in initrd. Jul 15 05:13:41.832221 systemd[1]: No hostname configured, using default hostname. Jul 15 05:13:41.832231 systemd[1]: Hostname set to . Jul 15 05:13:41.832242 systemd[1]: Initializing machine ID from VM UUID. Jul 15 05:13:41.832251 systemd[1]: Queued start job for default target initrd.target. Jul 15 05:13:41.832260 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 05:13:41.832269 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 05:13:41.832279 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 15 05:13:41.832289 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 15 05:13:41.832298 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 15 05:13:41.832308 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 15 05:13:41.832320 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 15 05:13:41.832332 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 15 05:13:41.832342 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 15 05:13:41.832350 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 15 05:13:41.832359 systemd[1]: Reached target paths.target - Path Units. Jul 15 05:13:41.832368 systemd[1]: Reached target slices.target - Slice Units. Jul 15 05:13:41.832394 systemd[1]: Reached target swap.target - Swaps. Jul 15 05:13:41.832405 systemd[1]: Reached target timers.target - Timer Units. Jul 15 05:13:41.832415 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 15 05:13:41.832424 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 15 05:13:41.832433 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 15 05:13:41.832443 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 15 05:13:41.832453 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. 
Jul 15 05:13:41.832461 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 15 05:13:41.832470 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 05:13:41.832480 systemd[1]: Reached target sockets.target - Socket Units. Jul 15 05:13:41.832491 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 15 05:13:41.832500 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 15 05:13:41.832510 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 15 05:13:41.832519 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 15 05:13:41.832528 systemd[1]: Starting systemd-fsck-usr.service... Jul 15 05:13:41.832537 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 15 05:13:41.832546 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 15 05:13:41.832556 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:13:41.832567 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 15 05:13:41.832578 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 05:13:41.832588 systemd[1]: Finished systemd-fsck-usr.service. Jul 15 05:13:41.832597 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 15 05:13:41.832638 systemd-journald[216]: Collecting audit messages is disabled. Jul 15 05:13:41.832663 systemd-journald[216]: Journal started Jul 15 05:13:41.832683 systemd-journald[216]: Runtime Journal (/run/log/journal/2615e164b30448099add170c9e06ed87) is 4.8M, max 38.6M, 33.7M free. Jul 15 05:13:41.802824 systemd-modules-load[217]: Inserted module 'overlay' Jul 15 05:13:41.862628 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 15 05:13:41.862649 kernel: Bridge firewalling registered Jul 15 05:13:41.862658 systemd[1]: Started systemd-journald.service - Journal Service. Jul 15 05:13:41.836012 systemd-modules-load[217]: Inserted module 'br_netfilter' Jul 15 05:13:41.870475 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 15 05:13:41.871228 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:13:41.877224 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 15 05:13:41.879896 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 15 05:13:41.884606 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 15 05:13:41.887030 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 05:13:41.890636 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 15 05:13:41.896128 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 15 05:13:41.897269 systemd-tmpfiles[235]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 15 05:13:41.903590 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 05:13:41.904265 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jul 15 05:13:41.907824 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 15 05:13:41.910463 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 15 05:13:41.914778 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 15 05:13:41.928722 dracut-cmdline[254]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb Jul 15 05:13:41.942723 systemd-resolved[255]: Positive Trust Anchors: Jul 15 05:13:41.942738 systemd-resolved[255]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 15 05:13:41.942767 systemd-resolved[255]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 15 05:13:41.946172 systemd-resolved[255]: Defaulting to hostname 'linux'. Jul 15 05:13:41.947640 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 15 05:13:41.948405 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 15 05:13:41.998399 kernel: SCSI subsystem initialized Jul 15 05:13:42.005396 kernel: Loading iSCSI transport class v2.0-870. Jul 15 05:13:42.013404 kernel: iscsi: registered transport (tcp) Jul 15 05:13:42.028498 kernel: iscsi: registered transport (qla4xxx) Jul 15 05:13:42.028559 kernel: QLogic iSCSI HBA Driver Jul 15 05:13:42.044039 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 15 05:13:42.056849 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 15 05:13:42.058909 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 15 05:13:42.094746 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 15 05:13:42.096396 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jul 15 05:13:42.151403 kernel: raid6: avx2x4 gen() 55725 MB/s Jul 15 05:13:42.168391 kernel: raid6: avx2x2 gen() 56698 MB/s Jul 15 05:13:42.185441 kernel: raid6: avx2x1 gen() 43863 MB/s Jul 15 05:13:42.185501 kernel: raid6: using algorithm avx2x2 gen() 56698 MB/s Jul 15 05:13:42.203515 kernel: raid6: .... xor() 37211 MB/s, rmw enabled Jul 15 05:13:42.203555 kernel: raid6: using avx2x2 recovery algorithm Jul 15 05:13:42.219420 kernel: xor: automatically using best checksumming function avx Jul 15 05:13:42.343566 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 15 05:13:42.357901 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 15 05:13:42.359738 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Jul 15 05:13:42.384960 systemd-udevd[465]: Using default interface naming scheme 'v255'. Jul 15 05:13:42.390772 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 15 05:13:42.393041 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 15 05:13:42.417681 dracut-pre-trigger[472]: rd.md=0: removing MD RAID activation Jul 15 05:13:42.443196 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 15 05:13:42.444828 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 15 05:13:42.512221 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 15 05:13:42.519911 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 15 05:13:42.577088 kernel: ACPI: bus type USB registered Jul 15 05:13:42.577149 kernel: usbcore: registered new interface driver usbfs Jul 15 05:13:42.578096 kernel: usbcore: registered new interface driver hub Jul 15 05:13:42.579060 kernel: usbcore: registered new device driver usb Jul 15 05:13:42.614606 kernel: cryptd: max_cpu_qlen set to 1000 Jul 15 05:13:42.620396 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jul 15 05:13:42.627409 kernel: scsi host0: Virtio SCSI HBA Jul 15 05:13:42.630457 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 05:13:42.631391 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 05:13:42.680975 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jul 15 05:13:42.681169 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jul 15 05:13:42.681193 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jul 15 05:13:42.681312 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 05:13:42.681434 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jul 15 05:13:42.681540 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jul 15 05:13:42.681641 kernel: hub 1-0:1.0: USB hub found Jul 15 05:13:42.681766 kernel: hub 1-0:1.0: 4 ports detected Jul 15 05:13:42.681872 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jul 15 05:13:42.682113 kernel: hub 2-0:1.0: USB hub found Jul 15 05:13:42.682269 kernel: hub 2-0:1.0: 4 ports detected Jul 15 05:13:42.682454 kernel: libata version 3.00 loaded. Jul 15 05:13:42.631530 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:13:42.677564 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:13:42.686908 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jul 15 05:13:42.699435 kernel: ahci 0000:00:1f.2: version 3.0 Jul 15 05:13:42.701393 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jul 15 05:13:42.704865 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jul 15 05:13:42.705043 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jul 15 05:13:42.705159 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jul 15 05:13:42.709550 kernel: scsi host1: ahci Jul 15 05:13:42.709736 kernel: scsi host2: ahci Jul 15 05:13:42.711454 kernel: scsi host3: ahci Jul 15 05:13:42.713390 kernel: scsi host4: ahci Jul 15 05:13:42.713558 kernel: scsi host5: ahci Jul 15 05:13:42.713677 kernel: scsi host6: ahci Jul 15 05:13:42.715072 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 49 lpm-pol 0 Jul 15 05:13:42.719924 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 49 lpm-pol 0 Jul 15 05:13:42.719956 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 49 lpm-pol 0 Jul 15 05:13:42.719965 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 49 lpm-pol 0 Jul 15 05:13:42.719974 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 49 lpm-pol 0 Jul 15 05:13:42.719981 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 49 lpm-pol 0 Jul 15 05:13:42.725433 kernel: AES CTR mode by8 optimization enabled Jul 15 05:13:42.729386 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jul 15 05:13:42.734411 kernel: sd 0:0:0:0: Power-on or device reset occurred Jul 15 05:13:42.735529 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jul 15 05:13:42.735670 kernel: sd 0:0:0:0: [sda] Write Protect is off Jul 15 05:13:42.735784 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Jul 15 05:13:42.735895 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jul 15 05:13:42.741507 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 15 05:13:42.741558 kernel: GPT:17805311 != 80003071 Jul 15 05:13:42.741567 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 15 05:13:42.741575 kernel: GPT:17805311 != 80003071 Jul 15 05:13:42.741582 kernel: GPT: Use GNU Parted to correct GPT errors. Jul 15 05:13:42.741590 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 05:13:42.741598 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jul 15 05:13:42.793775 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jul 15 05:13:42.893402 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jul 15 05:13:43.033403 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jul 15 05:13:43.033490 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 15 05:13:43.033512 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jul 15 05:13:43.033531 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jul 15 05:13:43.033550 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jul 15 05:13:43.037351 kernel: ata1.00: applying bridge limits Jul 15 05:13:43.037413 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jul 15 05:13:43.037432 kernel: ata1.00: configured for UDMA/100 Jul 15 05:13:43.041002 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jul 15 05:13:43.042073 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jul 15 05:13:43.042113 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jul 15 05:13:43.072395 kernel: usbcore: registered new interface driver usbhid Jul 15 05:13:43.073078 kernel: usbhid: USB HID core driver Jul 15 05:13:43.084207 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 Jul 15 05:13:43.084247 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jul 15 05:13:43.101219 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Jul 15 05:13:43.101426 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 15 05:13:43.119397 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Jul 15 05:13:43.143693 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jul 15 05:13:43.150173 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jul 15 05:13:43.157396 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 15 05:13:43.162894 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Jul 15 05:13:43.163357 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jul 15 05:13:43.165832 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 15 05:13:43.187155 disk-uuid[637]: Primary Header is updated. Jul 15 05:13:43.187155 disk-uuid[637]: Secondary Entries is updated. Jul 15 05:13:43.187155 disk-uuid[637]: Secondary Header is updated. Jul 15 05:13:43.188510 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 05:13:43.313025 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 15 05:13:43.313901 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 15 05:13:43.314451 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 15 05:13:43.314853 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 15 05:13:43.317281 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 15 05:13:43.336830 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jul 15 05:13:44.216905 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 05:13:44.220591 disk-uuid[638]: The operation has completed successfully. Jul 15 05:13:44.301638 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 15 05:13:44.301753 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. 
Jul 15 05:13:44.318694 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 15 05:13:44.334694 sh[667]: Success Jul 15 05:13:44.351742 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 15 05:13:44.351809 kernel: device-mapper: uevent: version 1.0.3 Jul 15 05:13:44.351820 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 15 05:13:44.361396 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Jul 15 05:13:44.401731 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jul 15 05:13:44.405482 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 15 05:13:44.413408 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jul 15 05:13:44.429677 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 15 05:13:44.429731 kernel: BTRFS: device fsid eb96c768-dac4-4ca9-ae1d-82815d4ce00b devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (679) Jul 15 05:13:44.431398 kernel: BTRFS info (device dm-0): first mount of filesystem eb96c768-dac4-4ca9-ae1d-82815d4ce00b Jul 15 05:13:44.433184 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:13:44.433198 kernel: BTRFS info (device dm-0): using free-space-tree Jul 15 05:13:44.442980 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 15 05:13:44.444447 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 15 05:13:44.445871 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 15 05:13:44.447037 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 15 05:13:44.449524 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 15 05:13:44.473404 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (712) Jul 15 05:13:44.473458 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:13:44.475517 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:13:44.477407 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 05:13:44.484877 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:13:44.485435 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 15 05:13:44.486807 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jul 15 05:13:44.569165 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 15 05:13:44.570793 ignition[777]: Ignition 2.21.0 Jul 15 05:13:44.570805 ignition[777]: Stage: fetch-offline Jul 15 05:13:44.572287 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 15 05:13:44.570829 ignition[777]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:13:44.570836 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:13:44.570893 ignition[777]: parsed url from cmdline: "" Jul 15 05:13:44.575424 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Jul 15 05:13:44.570896 ignition[777]: no config URL provided Jul 15 05:13:44.570901 ignition[777]: reading system config file "/usr/lib/ignition/user.ign" Jul 15 05:13:44.570907 ignition[777]: no config at "/usr/lib/ignition/user.ign" Jul 15 05:13:44.570911 ignition[777]: failed to fetch config: resource requires networking Jul 15 05:13:44.571016 ignition[777]: Ignition finished successfully Jul 15 05:13:44.602793 systemd-networkd[853]: lo: Link UP Jul 15 05:13:44.602802 systemd-networkd[853]: lo: Gained carrier Jul 15 05:13:44.604962 systemd-networkd[853]: Enumeration completed Jul 15 05:13:44.605136 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 15 05:13:44.605622 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:13:44.605626 systemd-networkd[853]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 05:13:44.606196 systemd[1]: Reached target network.target - Network. Jul 15 05:13:44.606667 systemd-networkd[853]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:13:44.606671 systemd-networkd[853]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 05:13:44.607894 systemd-networkd[853]: eth0: Link UP Jul 15 05:13:44.607898 systemd-networkd[853]: eth0: Gained carrier Jul 15 05:13:44.607906 systemd-networkd[853]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:13:44.608930 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jul 15 05:13:44.612619 systemd-networkd[853]: eth1: Link UP Jul 15 05:13:44.612623 systemd-networkd[853]: eth1: Gained carrier Jul 15 05:13:44.612634 systemd-networkd[853]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Jul 15 05:13:44.628658 ignition[857]: Ignition 2.21.0 Jul 15 05:13:44.628669 ignition[857]: Stage: fetch Jul 15 05:13:44.628769 ignition[857]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:13:44.632996 systemd-networkd[853]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 15 05:13:44.628777 ignition[857]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:13:44.628830 ignition[857]: parsed url from cmdline: "" Jul 15 05:13:44.628833 ignition[857]: no config URL provided Jul 15 05:13:44.628837 ignition[857]: reading system config file "/usr/lib/ignition/user.ign" Jul 15 05:13:44.628843 ignition[857]: no config at "/usr/lib/ignition/user.ign" Jul 15 05:13:44.628876 ignition[857]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jul 15 05:13:44.629025 ignition[857]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Jul 15 05:13:44.690468 systemd-networkd[853]: eth0: DHCPv4 address 95.217.135.169/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 15 05:13:44.829853 ignition[857]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Jul 15 05:13:44.838032 ignition[857]: GET result: OK Jul 15 05:13:44.838100 ignition[857]: parsing config with SHA512: 6d558d8e25eeec6530e1050591dd23dc1322f66d98a0cca8d247adb579233a96c3feb197183a7bfc66ff6ae70e56f387d7e5b186c298de20f5e15971cbb1fd77 Jul 15 05:13:44.841302 unknown[857]: fetched base config from "system" Jul 15 05:13:44.841559 ignition[857]: fetch: fetch complete Jul 15 05:13:44.841312 unknown[857]: fetched base config from "system" Jul 15 05:13:44.841564 ignition[857]: fetch: fetch passed Jul 15 05:13:44.841317 unknown[857]: fetched user config from "hetzner" Jul 15 05:13:44.841600 ignition[857]: Ignition finished successfully Jul 15 05:13:44.847295 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jul 15 05:13:44.849790 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jul 15 05:13:44.888580 ignition[865]: Ignition 2.21.0 Jul 15 05:13:44.888607 ignition[865]: Stage: kargs Jul 15 05:13:44.888838 ignition[865]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:13:44.888857 ignition[865]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:13:44.893189 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jul 15 05:13:44.890223 ignition[865]: kargs: kargs passed Jul 15 05:13:44.890294 ignition[865]: Ignition finished successfully Jul 15 05:13:44.898175 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jul 15 05:13:44.939510 ignition[871]: Ignition 2.21.0 Jul 15 05:13:44.939552 ignition[871]: Stage: disks Jul 15 05:13:44.939787 ignition[871]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:13:44.939805 ignition[871]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:13:44.942353 ignition[871]: disks: disks passed Jul 15 05:13:44.943881 ignition[871]: Ignition finished successfully Jul 15 05:13:44.950744 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jul 15 05:13:44.952506 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jul 15 05:13:44.952963 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jul 15 05:13:44.953894 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 15 05:13:44.954808 systemd[1]: Reached target sysinit.target - System Initialization. 
Jul 15 05:13:44.955663 systemd[1]: Reached target basic.target - Basic System. Jul 15 05:13:44.957278 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jul 15 05:13:44.983211 systemd-fsck[880]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jul 15 05:13:44.985873 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jul 15 05:13:44.987245 systemd[1]: Mounting sysroot.mount - /sysroot... Jul 15 05:13:45.091429 kernel: EXT4-fs (sda9): mounted filesystem 277c3938-5262-4ab1-8fa3-62fde82f8257 r/w with ordered data mode. Quota mode: none. Jul 15 05:13:45.091456 systemd[1]: Mounted sysroot.mount - /sysroot. Jul 15 05:13:45.092258 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jul 15 05:13:45.094485 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 15 05:13:45.096813 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jul 15 05:13:45.099653 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jul 15 05:13:45.100728 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jul 15 05:13:45.100751 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jul 15 05:13:45.109396 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jul 15 05:13:45.111480 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jul 15 05:13:45.128538 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (888) Jul 15 05:13:45.128571 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:13:45.137428 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:13:45.137463 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 05:13:45.149734 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 15 05:13:45.170576 coreos-metadata[890]: Jul 15 05:13:45.170 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jul 15 05:13:45.171977 coreos-metadata[890]: Jul 15 05:13:45.171 INFO Fetch successful Jul 15 05:13:45.172592 coreos-metadata[890]: Jul 15 05:13:45.172 INFO wrote hostname ci-4396-0-0-n-85c8113064 to /sysroot/etc/hostname Jul 15 05:13:45.175245 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 15 05:13:45.181451 initrd-setup-root[917]: cut: /sysroot/etc/passwd: No such file or directory Jul 15 05:13:45.185740 initrd-setup-root[924]: cut: /sysroot/etc/group: No such file or directory Jul 15 05:13:45.189748 initrd-setup-root[931]: cut: /sysroot/etc/shadow: No such file or directory Jul 15 05:13:45.193873 initrd-setup-root[938]: cut: /sysroot/etc/gshadow: No such file or directory Jul 15 05:13:45.265244 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jul 15 05:13:45.267121 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jul 15 05:13:45.268934 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jul 15 05:13:45.286403 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:13:45.300512 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jul 15 05:13:45.309641 ignition[1007]: INFO : Ignition 2.21.0 Jul 15 05:13:45.309641 ignition[1007]: INFO : Stage: mount Jul 15 05:13:45.311661 ignition[1007]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 15 05:13:45.311661 ignition[1007]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:13:45.311661 ignition[1007]: INFO : mount: mount passed Jul 15 05:13:45.311661 ignition[1007]: INFO : Ignition finished successfully Jul 15 05:13:45.311890 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jul 15 05:13:45.313343 systemd[1]: Starting ignition-files.service - Ignition (files)... Jul 15 05:13:45.427977 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jul 15 05:13:45.429739 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 15 05:13:45.463451 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1017) Jul 15 05:13:45.468447 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:13:45.468499 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:13:45.471549 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 05:13:45.479360 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 15 05:13:45.520781 ignition[1033]: INFO : Ignition 2.21.0 Jul 15 05:13:45.520781 ignition[1033]: INFO : Stage: files Jul 15 05:13:45.522283 ignition[1033]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 15 05:13:45.522283 ignition[1033]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:13:45.524442 ignition[1033]: DEBUG : files: compiled without relabeling support, skipping Jul 15 05:13:45.525384 ignition[1033]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jul 15 05:13:45.525384 ignition[1033]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jul 15 05:13:45.527662 ignition[1033]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jul 15 05:13:45.528392 ignition[1033]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jul 15 05:13:45.528392 ignition[1033]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jul 15 05:13:45.528087 unknown[1033]: wrote ssh authorized keys file for user: core Jul 15 05:13:45.530226 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jul 15 05:13:45.530226 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 Jul 15 05:13:45.715673 systemd-networkd[853]: eth0: Gained IPv6LL Jul 15 05:13:45.824593 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jul 15 05:13:46.163597 systemd-networkd[853]: eth1: Gained IPv6LL Jul 15 05:13:48.105260 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" Jul 15 05:13:48.106426 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jul 15 05:13:48.106426 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jul 15 05:13:48.106426 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file 
"/sysroot/home/core/nginx.yaml" Jul 15 05:13:48.108645 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jul 15 05:13:48.108645 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 15 05:13:48.108645 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 15 05:13:48.108645 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 15 05:13:48.108645 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 15 05:13:48.112182 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jul 15 05:13:48.112182 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jul 15 05:13:48.112182 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jul 15 05:13:48.114455 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jul 15 05:13:48.114455 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jul 15 05:13:48.114455 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 Jul 15 05:13:48.586507 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jul 15 05:13:52.155146 ignition[1033]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" Jul 15 05:13:52.155146 ignition[1033]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jul 15 05:13:52.159672 ignition[1033]: INFO : files: op(f): [started] setting preset to enabled for 
"prepare-helm.service" Jul 15 05:13:52.183547 ignition[1033]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jul 15 05:13:52.183547 ignition[1033]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jul 15 05:13:52.183547 ignition[1033]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jul 15 05:13:52.183547 ignition[1033]: INFO : files: files passed Jul 15 05:13:52.183547 ignition[1033]: INFO : Ignition finished successfully Jul 15 05:13:52.164122 systemd[1]: Finished ignition-files.service - Ignition (files). Jul 15 05:13:52.174630 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jul 15 05:13:52.182767 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jul 15 05:13:52.206587 systemd[1]: ignition-quench.service: Deactivated successfully. Jul 15 05:13:52.206822 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jul 15 05:13:52.217651 initrd-setup-root-after-ignition[1063]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 15 05:13:52.217651 initrd-setup-root-after-ignition[1063]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jul 15 05:13:52.220994 initrd-setup-root-after-ignition[1067]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 15 05:13:52.220817 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 15 05:13:52.221888 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jul 15 05:13:52.223882 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jul 15 05:13:52.281818 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jul 15 05:13:52.282058 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jul 15 05:13:52.284137 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jul 15 05:13:52.285551 systemd[1]: Reached target initrd.target - Initrd Default Target. Jul 15 05:13:52.287130 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jul 15 05:13:52.289614 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jul 15 05:13:52.319191 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 15 05:13:52.323341 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jul 15 05:13:52.352005 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jul 15 05:13:52.354366 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 15 05:13:52.355564 systemd[1]: Stopped target timers.target - Timer Units. Jul 15 05:13:52.357196 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jul 15 05:13:52.357474 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 15 05:13:52.359127 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jul 15 05:13:52.360311 systemd[1]: Stopped target basic.target - Basic System. Jul 15 05:13:52.361928 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jul 15 05:13:52.363335 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. 
Jul 15 05:13:52.364915 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jul 15 05:13:52.366412 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jul 15 05:13:52.368096 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jul 15 05:13:52.369680 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jul 15 05:13:52.371460 systemd[1]: Stopped target sysinit.target - System Initialization. Jul 15 05:13:52.372839 systemd[1]: Stopped target local-fs.target - Local File Systems. Jul 15 05:13:52.374507 systemd[1]: Stopped target swap.target - Swaps. Jul 15 05:13:52.375840 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jul 15 05:13:52.376043 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jul 15 05:13:52.377892 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jul 15 05:13:52.379494 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 15 05:13:52.381445 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jul 15 05:13:52.381788 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 05:13:52.385366 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jul 15 05:13:52.385562 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jul 15 05:13:52.393433 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jul 15 05:13:52.393593 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 15 05:13:52.398093 systemd[1]: ignition-files.service: Deactivated successfully. Jul 15 05:13:52.398201 systemd[1]: Stopped ignition-files.service - Ignition (files). Jul 15 05:13:52.399047 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jul 15 05:13:52.399116 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 15 05:13:52.402255 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jul 15 05:13:52.404533 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jul 15 05:13:52.404975 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jul 15 05:13:52.405093 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jul 15 05:13:52.405914 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jul 15 05:13:52.406010 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jul 15 05:13:52.412700 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jul 15 05:13:52.412791 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jul 15 05:13:52.423397 ignition[1087]: INFO : Ignition 2.21.0 Jul 15 05:13:52.423397 ignition[1087]: INFO : Stage: umount Jul 15 05:13:52.423397 ignition[1087]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 15 05:13:52.423397 ignition[1087]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:13:52.430294 ignition[1087]: INFO : umount: umount passed Jul 15 05:13:52.431209 ignition[1087]: INFO : Ignition finished successfully Jul 15 05:13:52.432525 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jul 15 05:13:52.433173 systemd[1]: ignition-mount.service: Deactivated successfully. Jul 15 05:13:52.433317 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jul 15 05:13:52.434801 systemd[1]: ignition-disks.service: Deactivated successfully. 
Jul 15 05:13:52.434844 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jul 15 05:13:52.436608 systemd[1]: ignition-kargs.service: Deactivated successfully. Jul 15 05:13:52.436650 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jul 15 05:13:52.437434 systemd[1]: ignition-fetch.service: Deactivated successfully. Jul 15 05:13:52.437468 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jul 15 05:13:52.438290 systemd[1]: Stopped target network.target - Network. Jul 15 05:13:52.439084 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jul 15 05:13:52.439122 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jul 15 05:13:52.439977 systemd[1]: Stopped target paths.target - Path Units. Jul 15 05:13:52.440804 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jul 15 05:13:52.442491 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 05:13:52.443125 systemd[1]: Stopped target slices.target - Slice Units. Jul 15 05:13:52.444071 systemd[1]: Stopped target sockets.target - Socket Units. Jul 15 05:13:52.444892 systemd[1]: iscsid.socket: Deactivated successfully. Jul 15 05:13:52.444925 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jul 15 05:13:52.445817 systemd[1]: iscsiuio.socket: Deactivated successfully. Jul 15 05:13:52.445854 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 15 05:13:52.446827 systemd[1]: ignition-setup.service: Deactivated successfully. Jul 15 05:13:52.446870 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jul 15 05:13:52.447695 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jul 15 05:13:52.447738 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jul 15 05:13:52.448651 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jul 15 05:13:52.449535 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jul 15 05:13:52.452001 systemd[1]: sysroot-boot.service: Deactivated successfully. Jul 15 05:13:52.452102 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jul 15 05:13:52.452702 systemd[1]: systemd-resolved.service: Deactivated successfully. Jul 15 05:13:52.452920 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jul 15 05:13:52.456066 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jul 15 05:13:52.456702 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jul 15 05:13:52.456779 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jul 15 05:13:52.457947 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jul 15 05:13:52.457989 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 05:13:52.460485 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jul 15 05:13:52.460705 systemd[1]: systemd-networkd.service: Deactivated successfully. Jul 15 05:13:52.460794 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jul 15 05:13:52.462415 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jul 15 05:13:52.462822 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jul 15 05:13:52.463765 systemd[1]: systemd-networkd.socket: Deactivated successfully. 
Jul 15 05:13:52.463801 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jul 15 05:13:52.465496 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jul 15 05:13:52.467508 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jul 15 05:13:52.467581 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 15 05:13:52.468099 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jul 15 05:13:52.468136 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jul 15 05:13:52.470469 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jul 15 05:13:52.470510 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jul 15 05:13:52.471294 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 15 05:13:52.472946 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jul 15 05:13:52.479701 systemd[1]: systemd-udevd.service: Deactivated successfully. Jul 15 05:13:52.479955 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 15 05:13:52.481056 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jul 15 05:13:52.481119 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jul 15 05:13:52.481892 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jul 15 05:13:52.481923 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 05:13:52.482783 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jul 15 05:13:52.482829 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jul 15 05:13:52.484174 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jul 15 05:13:52.484213 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jul 15 05:13:52.485560 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jul 15 05:13:52.485607 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 15 05:13:52.488473 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jul 15 05:13:52.488968 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jul 15 05:13:52.489011 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jul 15 05:13:52.491423 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jul 15 05:13:52.491484 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 15 05:13:52.493139 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jul 15 05:13:52.493177 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 05:13:52.494271 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jul 15 05:13:52.494308 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 05:13:52.495464 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 05:13:52.495507 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:13:52.501526 systemd[1]: network-cleanup.service: Deactivated successfully. Jul 15 05:13:52.501621 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jul 15 05:13:52.503521 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. 
Jul 15 05:13:52.503601 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jul 15 05:13:52.504830 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jul 15 05:13:52.507484 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jul 15 05:13:52.539397 systemd[1]: Switching root. Jul 15 05:13:52.595833 systemd-journald[216]: Journal stopped Jul 15 05:13:53.603091 systemd-journald[216]: Received SIGTERM from PID 1 (systemd). Jul 15 05:13:53.603142 kernel: SELinux: policy capability network_peer_controls=1 Jul 15 05:13:53.603158 kernel: SELinux: policy capability open_perms=1 Jul 15 05:13:53.603170 kernel: SELinux: policy capability extended_socket_class=1 Jul 15 05:13:53.603178 kernel: SELinux: policy capability always_check_network=0 Jul 15 05:13:53.603187 kernel: SELinux: policy capability cgroup_seclabel=1 Jul 15 05:13:53.603195 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jul 15 05:13:53.603203 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jul 15 05:13:53.603211 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jul 15 05:13:53.603222 kernel: SELinux: policy capability userspace_initial_context=0 Jul 15 05:13:53.603230 kernel: audit: type=1403 audit(1752556432.733:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jul 15 05:13:53.603246 systemd[1]: Successfully loaded SELinux policy in 58.832ms. Jul 15 05:13:53.603277 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.144ms. Jul 15 05:13:53.603287 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 15 05:13:53.603297 systemd[1]: Detected virtualization kvm. Jul 15 05:13:53.603306 systemd[1]: Detected architecture x86-64. Jul 15 05:13:53.603316 systemd[1]: Detected first boot. Jul 15 05:13:53.603331 systemd[1]: Hostname set to <ci-4396-0-0-n-85c8113064>. Jul 15 05:13:53.603343 systemd[1]: Initializing machine ID from VM UUID. Jul 15 05:13:53.603356 zram_generator::config[1131]: No configuration found. Jul 15 05:13:53.603366 kernel: Guest personality initialized and is inactive Jul 15 05:13:53.606455 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Jul 15 05:13:53.606476 kernel: Initialized host personality Jul 15 05:13:53.606489 kernel: NET: Registered PF_VSOCK protocol family Jul 15 05:13:53.606499 systemd[1]: Populated /etc with preset unit settings. Jul 15 05:13:53.606512 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jul 15 05:13:53.606524 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jul 15 05:13:53.606533 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jul 15 05:13:53.606542 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jul 15 05:13:53.606551 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jul 15 05:13:53.606563 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jul 15 05:13:53.606573 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jul 15 05:13:53.606581 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jul 15 05:13:53.606590 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. 
Jul 15 05:13:53.606601 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jul 15 05:13:53.606610 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jul 15 05:13:53.606619 systemd[1]: Created slice user.slice - User and Session Slice. Jul 15 05:13:53.606628 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 05:13:53.606643 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 05:13:53.606656 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jul 15 05:13:53.606668 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jul 15 05:13:53.606677 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jul 15 05:13:53.606686 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 15 05:13:53.606695 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Jul 15 05:13:53.606704 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 15 05:13:53.606713 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 15 05:13:53.606724 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jul 15 05:13:53.606734 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jul 15 05:13:53.606743 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jul 15 05:13:53.606756 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jul 15 05:13:53.606770 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 15 05:13:53.606783 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 15 05:13:53.606791 systemd[1]: Reached target slices.target - Slice Units. Jul 15 05:13:53.606801 systemd[1]: Reached target swap.target - Swaps. Jul 15 05:13:53.606809 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jul 15 05:13:53.606820 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jul 15 05:13:53.606830 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jul 15 05:13:53.606839 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 15 05:13:53.606848 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 15 05:13:53.606857 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 05:13:53.606866 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jul 15 05:13:53.606874 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jul 15 05:13:53.606883 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jul 15 05:13:53.606892 systemd[1]: Mounting media.mount - External Media Directory... Jul 15 05:13:53.606903 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:53.606912 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jul 15 05:13:53.606922 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jul 15 05:13:53.606931 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Jul 15 05:13:53.606940 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jul 15 05:13:53.606949 systemd[1]: Reached target machines.target - Containers. Jul 15 05:13:53.606958 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jul 15 05:13:53.606966 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 15 05:13:53.606977 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 15 05:13:53.606986 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jul 15 05:13:53.606997 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 15 05:13:53.607009 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 15 05:13:53.607018 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 15 05:13:53.607027 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jul 15 05:13:53.607040 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 15 05:13:53.607053 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jul 15 05:13:53.607065 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jul 15 05:13:53.607076 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jul 15 05:13:53.607084 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jul 15 05:13:53.607093 systemd[1]: Stopped systemd-fsck-usr.service. Jul 15 05:13:53.607102 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 15 05:13:53.607114 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 15 05:13:53.607127 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 15 05:13:53.607136 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 15 05:13:53.607144 kernel: fuse: init (API version 7.41) Jul 15 05:13:53.607153 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jul 15 05:13:53.607164 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jul 15 05:13:53.607173 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 15 05:13:53.607183 systemd[1]: verity-setup.service: Deactivated successfully. Jul 15 05:13:53.607192 kernel: loop: module loaded Jul 15 05:13:53.607200 systemd[1]: Stopped verity-setup.service. Jul 15 05:13:53.607209 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:53.607219 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jul 15 05:13:53.607228 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jul 15 05:13:53.607236 systemd[1]: Mounted media.mount - External Media Directory. Jul 15 05:13:53.607245 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Jul 15 05:13:53.607266 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jul 15 05:13:53.607275 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jul 15 05:13:53.607284 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jul 15 05:13:53.607293 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 05:13:53.607304 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jul 15 05:13:53.607313 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jul 15 05:13:53.607322 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 15 05:13:53.607331 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 15 05:13:53.607341 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 15 05:13:53.607350 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 15 05:13:53.607360 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jul 15 05:13:53.611355 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jul 15 05:13:53.611444 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 15 05:13:53.611461 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 15 05:13:53.611470 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 15 05:13:53.611480 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 15 05:13:53.611489 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jul 15 05:13:53.611502 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jul 15 05:13:53.611512 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 15 05:13:53.611553 systemd-journald[1215]: Collecting audit messages is disabled. Jul 15 05:13:53.611572 kernel: ACPI: bus type drm_connector registered Jul 15 05:13:53.611582 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jul 15 05:13:53.611592 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jul 15 05:13:53.611605 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jul 15 05:13:53.611618 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 15 05:13:53.611631 systemd-journald[1215]: Journal started Jul 15 05:13:53.611649 systemd-journald[1215]: Runtime Journal (/run/log/journal/2615e164b30448099add170c9e06ed87) is 4.8M, max 38.6M, 33.7M free. Jul 15 05:13:53.290435 systemd[1]: Queued start job for default target multi-user.target. Jul 15 05:13:53.309330 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jul 15 05:13:53.309845 systemd[1]: systemd-journald.service: Deactivated successfully. Jul 15 05:13:53.616645 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jul 15 05:13:53.629516 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jul 15 05:13:53.629580 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 15 05:13:53.633396 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
Jul 15 05:13:53.638571 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 15 05:13:53.640392 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jul 15 05:13:53.645128 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 15 05:13:53.647392 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 15 05:13:53.656389 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jul 15 05:13:53.662395 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 15 05:13:53.666939 systemd[1]: Started systemd-journald.service - Journal Service. Jul 15 05:13:53.669280 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 15 05:13:53.670568 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 15 05:13:53.676238 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jul 15 05:13:53.677453 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jul 15 05:13:53.682423 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jul 15 05:13:53.709927 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jul 15 05:13:53.713468 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jul 15 05:13:53.715123 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jul 15 05:13:53.724667 kernel: loop0: detected capacity change from 0 to 114000 Jul 15 05:13:53.737135 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 15 05:13:53.740199 systemd-journald[1215]: Time spent on flushing to /var/log/journal/2615e164b30448099add170c9e06ed87 is 28.976ms for 1166 entries. Jul 15 05:13:53.740199 systemd-journald[1215]: System Journal (/var/log/journal/2615e164b30448099add170c9e06ed87) is 8M, max 584.8M, 576.8M free. Jul 15 05:13:53.792355 systemd-journald[1215]: Received client request to flush runtime journal. Jul 15 05:13:53.792688 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jul 15 05:13:53.792704 kernel: loop1: detected capacity change from 0 to 8 Jul 15 05:13:53.746570 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 15 05:13:53.751829 systemd-tmpfiles[1238]: ACLs are not supported, ignoring. Jul 15 05:13:53.751839 systemd-tmpfiles[1238]: ACLs are not supported, ignoring. Jul 15 05:13:53.758964 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jul 15 05:13:53.764464 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 05:13:53.773824 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jul 15 05:13:53.797449 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jul 15 05:13:53.806405 kernel: loop2: detected capacity change from 0 to 229808 Jul 15 05:13:53.821862 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jul 15 05:13:53.824670 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jul 15 05:13:53.852482 kernel: loop3: detected capacity change from 0 to 146488 Jul 15 05:13:53.854197 systemd-tmpfiles[1278]: ACLs are not supported, ignoring. Jul 15 05:13:53.854490 systemd-tmpfiles[1278]: ACLs are not supported, ignoring. Jul 15 05:13:53.858026 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 15 05:13:53.890434 kernel: loop4: detected capacity change from 0 to 114000 Jul 15 05:13:53.907401 kernel: loop5: detected capacity change from 0 to 8 Jul 15 05:13:53.909407 kernel: loop6: detected capacity change from 0 to 229808 Jul 15 05:13:53.931405 kernel: loop7: detected capacity change from 0 to 146488 Jul 15 05:13:53.951958 (sd-merge)[1282]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Jul 15 05:13:53.953205 (sd-merge)[1282]: Merged extensions into '/usr'. Jul 15 05:13:53.958507 systemd[1]: Reload requested from client PID 1237 ('systemd-sysext') (unit systemd-sysext.service)... Jul 15 05:13:53.958581 systemd[1]: Reloading... Jul 15 05:13:54.050437 zram_generator::config[1305]: No configuration found. Jul 15 05:13:54.140978 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:13:54.166528 ldconfig[1233]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jul 15 05:13:54.202777 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jul 15 05:13:54.202949 systemd[1]: Reloading finished in 243 ms. Jul 15 05:13:54.216408 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jul 15 05:13:54.218339 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jul 15 05:13:54.228511 systemd[1]: Starting ensure-sysext.service... Jul 15 05:13:54.231554 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 15 05:13:54.245587 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jul 15 05:13:54.248835 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 15 05:13:54.258362 systemd[1]: Reload requested from client PID 1351 ('systemctl') (unit ensure-sysext.service)... Jul 15 05:13:54.258477 systemd[1]: Reloading... Jul 15 05:13:54.263321 systemd-tmpfiles[1352]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jul 15 05:13:54.263427 systemd-tmpfiles[1352]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jul 15 05:13:54.263668 systemd-tmpfiles[1352]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jul 15 05:13:54.263858 systemd-tmpfiles[1352]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jul 15 05:13:54.264563 systemd-tmpfiles[1352]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jul 15 05:13:54.264749 systemd-tmpfiles[1352]: ACLs are not supported, ignoring. Jul 15 05:13:54.264800 systemd-tmpfiles[1352]: ACLs are not supported, ignoring. Jul 15 05:13:54.274621 systemd-udevd[1355]: Using default interface naming scheme 'v255'. Jul 15 05:13:54.276813 systemd-tmpfiles[1352]: Detected autofs mount point /boot during canonicalization of boot. 
Jul 15 05:13:54.276913 systemd-tmpfiles[1352]: Skipping /boot Jul 15 05:13:54.284951 systemd-tmpfiles[1352]: Detected autofs mount point /boot during canonicalization of boot. Jul 15 05:13:54.285028 systemd-tmpfiles[1352]: Skipping /boot Jul 15 05:13:54.344422 zram_generator::config[1392]: No configuration found. Jul 15 05:13:54.462813 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:13:54.528245 kernel: mousedev: PS/2 mouse device common for all mice Jul 15 05:13:54.571228 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jul 15 05:13:54.576808 systemd[1]: Reloading finished in 318 ms. Jul 15 05:13:54.588300 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 15 05:13:54.591408 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 05:13:54.607819 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 Jul 15 05:13:54.618403 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jul 15 05:13:54.621409 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jul 15 05:13:54.623398 kernel: Console: switching to colour dummy device 80x25 Jul 15 05:13:54.625025 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jul 15 05:13:54.625056 kernel: [drm] features: -context_init Jul 15 05:13:54.625069 kernel: ACPI: button: Power Button [PWRF] Jul 15 05:13:54.631242 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jul 15 05:13:54.640409 kernel: [drm] number of scanouts: 1 Jul 15 05:13:54.640448 kernel: [drm] number of cap sets: 0 Jul 15 05:13:54.650407 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jul 15 05:13:54.656731 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 15 05:13:54.658207 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:54.660514 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jul 15 05:13:54.660712 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jul 15 05:13:54.660897 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 05:13:54.664448 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jul 15 05:13:54.664637 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 15 05:13:54.665902 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 15 05:13:54.668592 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 15 05:13:54.670575 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 15 05:13:54.671579 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 15 05:13:54.672442 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
Jul 15 05:13:54.672500 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 15 05:13:54.674791 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jul 15 05:13:54.678645 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 15 05:13:54.682733 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 15 05:13:54.688873 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jul 15 05:13:54.688951 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:54.691943 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:54.692555 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 15 05:13:54.692683 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 15 05:13:54.692737 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 15 05:13:54.692790 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:54.700595 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jul 15 05:13:54.701017 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 15 05:13:54.701205 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 15 05:13:54.707324 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 15 05:13:54.708480 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 15 05:13:54.708958 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 15 05:13:54.714626 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:54.714836 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 15 05:13:54.718553 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 15 05:13:54.721517 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 15 05:13:54.721685 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 15 05:13:54.721709 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 15 05:13:54.721771 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jul 15 05:13:54.722158 systemd[1]: Finished ensure-sysext.service. 
Jul 15 05:13:54.729626 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jul 15 05:13:54.751279 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 15 05:13:54.751512 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 15 05:13:54.753054 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 15 05:13:54.761201 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jul 15 05:13:54.763293 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 15 05:13:54.763880 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 15 05:13:54.764340 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 15 05:13:54.775409 kernel: EDAC MC: Ver: 3.0.0 Jul 15 05:13:54.779725 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jul 15 05:13:54.780825 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 15 05:13:54.780994 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 15 05:13:54.784140 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jul 15 05:13:54.814441 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jul 15 05:13:54.819638 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jul 15 05:13:54.819852 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jul 15 05:13:54.823674 augenrules[1529]: No rules Jul 15 05:13:54.824936 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 05:13:54.826107 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 05:13:54.838787 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:13:54.863982 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jul 15 05:13:54.906622 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 05:13:54.906847 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:13:54.911091 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 15 05:13:54.916445 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:13:54.965735 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:13:55.038880 systemd-networkd[1477]: lo: Link UP Jul 15 05:13:55.038893 systemd-networkd[1477]: lo: Gained carrier Jul 15 05:13:55.041190 systemd-networkd[1477]: Enumeration completed Jul 15 05:13:55.041282 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 15 05:13:55.042504 systemd-networkd[1477]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:13:55.042512 systemd-networkd[1477]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 05:13:55.043108 systemd-networkd[1477]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Jul 15 05:13:55.043113 systemd-networkd[1477]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 05:13:55.043455 systemd-networkd[1477]: eth0: Link UP Jul 15 05:13:55.043595 systemd-networkd[1477]: eth0: Gained carrier Jul 15 05:13:55.043611 systemd-networkd[1477]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:13:55.043851 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jul 15 05:13:55.047795 systemd-networkd[1477]: eth1: Link UP Jul 15 05:13:55.048688 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jul 15 05:13:55.048853 systemd-networkd[1477]: eth1: Gained carrier Jul 15 05:13:55.048867 systemd-networkd[1477]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:13:55.051918 systemd-resolved[1479]: Positive Trust Anchors: Jul 15 05:13:55.051927 systemd-resolved[1479]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 15 05:13:55.051957 systemd-resolved[1479]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 15 05:13:55.060639 systemd-resolved[1479]: Using system hostname 'ci-4396-0-0-n-85c8113064'. Jul 15 05:13:55.065708 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 15 05:13:55.066536 systemd[1]: Reached target network.target - Network. Jul 15 05:13:55.066593 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 15 05:13:55.067455 systemd-networkd[1477]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 15 05:13:55.072666 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jul 15 05:13:55.072899 systemd[1]: Reached target sysinit.target - System Initialization. Jul 15 05:13:55.073492 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jul 15 05:13:55.073586 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jul 15 05:13:55.073634 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jul 15 05:13:55.073683 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jul 15 05:13:55.073725 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jul 15 05:13:55.073739 systemd[1]: Reached target paths.target - Path Units. Jul 15 05:13:55.073784 systemd[1]: Reached target time-set.target - System Time Set. Jul 15 05:13:55.073957 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jul 15 05:13:55.074349 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jul 15 05:13:55.074474 systemd[1]: Reached target timers.target - Timer Units. 
Jul 15 05:13:55.075765 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jul 15 05:13:55.077129 systemd[1]: Starting docker.socket - Docker Socket for the API... Jul 15 05:13:55.078824 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jul 15 05:13:55.079110 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jul 15 05:13:55.079231 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jul 15 05:13:55.091361 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jul 15 05:13:55.091899 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jul 15 05:13:55.093115 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 15 05:13:55.093397 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jul 15 05:13:55.094603 systemd[1]: Reached target sockets.target - Socket Units. Jul 15 05:13:55.094742 systemd[1]: Reached target basic.target - Basic System. Jul 15 05:13:55.094913 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jul 15 05:13:55.094939 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jul 15 05:13:55.096175 systemd[1]: Starting containerd.service - containerd container runtime... Jul 15 05:13:55.097488 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jul 15 05:13:55.103620 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jul 15 05:13:55.104700 systemd-networkd[1477]: eth0: DHCPv4 address 95.217.135.169/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 15 05:13:55.105411 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 15 05:13:55.107066 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 15 05:13:55.110765 systemd-timesyncd[1498]: Network configuration changed, trying to establish connection. Jul 15 05:13:55.111395 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 15 05:13:55.111511 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 15 05:13:55.120509 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jul 15 05:13:55.124597 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 15 05:13:55.128259 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jul 15 05:13:55.133613 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jul 15 05:13:55.136497 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 15 05:13:55.140220 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 15 05:13:55.147485 jq[1565]: false Jul 15 05:13:55.147983 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 15 05:13:55.150365 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jul 15 05:13:55.155501 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 15 05:13:55.160512 systemd[1]: Starting update-engine.service - Update Engine... 
Jul 15 05:13:55.164510 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jul 15 05:13:55.168088 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 15 05:13:55.171221 extend-filesystems[1566]: Found /dev/sda6 Jul 15 05:13:55.168497 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jul 15 05:13:55.168683 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 15 05:13:55.185092 google_oslogin_nss_cache[1567]: oslogin_cache_refresh[1567]: Refreshing passwd entry cache Jul 15 05:13:55.185314 oslogin_cache_refresh[1567]: Refreshing passwd entry cache Jul 15 05:13:55.189671 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jul 15 05:13:55.189892 google_oslogin_nss_cache[1567]: oslogin_cache_refresh[1567]: Failure getting users, quitting Jul 15 05:13:55.189925 oslogin_cache_refresh[1567]: Failure getting users, quitting Jul 15 05:13:55.189972 google_oslogin_nss_cache[1567]: oslogin_cache_refresh[1567]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 15 05:13:55.189991 oslogin_cache_refresh[1567]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 15 05:13:55.190058 google_oslogin_nss_cache[1567]: oslogin_cache_refresh[1567]: Refreshing group entry cache Jul 15 05:13:55.190080 oslogin_cache_refresh[1567]: Refreshing group entry cache Jul 15 05:13:55.190441 extend-filesystems[1566]: Found /dev/sda9 Jul 15 05:13:55.190688 google_oslogin_nss_cache[1567]: oslogin_cache_refresh[1567]: Failure getting groups, quitting Jul 15 05:13:55.190728 oslogin_cache_refresh[1567]: Failure getting groups, quitting Jul 15 05:13:55.190772 google_oslogin_nss_cache[1567]: oslogin_cache_refresh[1567]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 15 05:13:55.190802 oslogin_cache_refresh[1567]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 15 05:13:55.193309 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 15 05:13:55.196001 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jul 15 05:13:55.196240 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jul 15 05:13:55.198648 extend-filesystems[1566]: Checking size of /dev/sda9 Jul 15 05:13:55.201305 jq[1581]: true Jul 15 05:13:55.224756 (ntainerd)[1607]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 15 05:13:55.225973 update_engine[1580]: I20250715 05:13:55.225692 1580 main.cc:92] Flatcar Update Engine starting Jul 15 05:13:55.228639 systemd[1]: motdgen.service: Deactivated successfully. Jul 15 05:13:55.228855 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Jul 15 05:13:55.231775 tar[1586]: linux-amd64/LICENSE Jul 15 05:13:55.232361 tar[1586]: linux-amd64/helm Jul 15 05:13:55.243172 extend-filesystems[1566]: Resized partition /dev/sda9 Jul 15 05:13:55.249757 jq[1603]: true Jul 15 05:13:55.250988 extend-filesystems[1613]: resize2fs 1.47.2 (1-Jan-2025) Jul 15 05:13:55.253933 coreos-metadata[1562]: Jul 15 05:13:55.253 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jul 15 05:13:55.259146 coreos-metadata[1562]: Jul 15 05:13:55.257 INFO Fetch successful Jul 15 05:13:55.259146 coreos-metadata[1562]: Jul 15 05:13:55.257 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jul 15 05:13:55.259146 coreos-metadata[1562]: Jul 15 05:13:55.257 INFO Fetch successful Jul 15 05:13:55.261397 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Jul 15 05:13:55.265733 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 15 05:13:55.265003 dbus-daemon[1563]: [system] SELinux support is enabled Jul 15 05:13:55.270629 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 15 05:13:55.270653 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jul 15 05:13:55.270741 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 15 05:13:55.270751 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 15 05:13:55.286023 update_engine[1580]: I20250715 05:13:55.285867 1580 update_check_scheduler.cc:74] Next update check in 7m34s Jul 15 05:13:55.286121 systemd[1]: Started update-engine.service - Update Engine. Jul 15 05:13:55.290534 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 15 05:13:55.350762 systemd-logind[1575]: New seat seat0. Jul 15 05:13:55.360541 systemd-logind[1575]: Watching system buttons on /dev/input/event3 (Power Button) Jul 15 05:13:55.360561 systemd-logind[1575]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jul 15 05:13:55.360708 systemd[1]: Started systemd-logind.service - User Login Management. Jul 15 05:13:55.415681 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jul 15 05:13:55.416032 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 15 05:13:55.418935 bash[1642]: Updated "/home/core/.ssh/authorized_keys" Jul 15 05:13:55.425717 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 15 05:13:55.429691 systemd[1]: Starting sshkeys.service... Jul 15 05:13:55.440884 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jul 15 05:13:55.464225 extend-filesystems[1613]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jul 15 05:13:55.464225 extend-filesystems[1613]: old_desc_blocks = 1, new_desc_blocks = 5 Jul 15 05:13:55.464225 extend-filesystems[1613]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jul 15 05:13:55.463944 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 15 05:13:55.464528 extend-filesystems[1566]: Resized filesystem in /dev/sda9 Jul 15 05:13:55.464150 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Jul 15 05:13:55.478607 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 15 05:13:55.481464 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jul 15 05:13:55.499911 containerd[1607]: time="2025-07-15T05:13:55Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 15 05:13:55.516404 containerd[1607]: time="2025-07-15T05:13:55.509362565Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552623833Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.24µs" Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552659833Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552678833Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552852693Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552872033Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552895483Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552949813Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 05:13:55.553125 containerd[1607]: time="2025-07-15T05:13:55.552960073Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 05:13:55.553357 containerd[1607]: time="2025-07-15T05:13:55.553236183Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 05:13:55.553357 containerd[1607]: time="2025-07-15T05:13:55.553259163Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 05:13:55.553357 containerd[1607]: time="2025-07-15T05:13:55.553272023Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 05:13:55.553357 containerd[1607]: time="2025-07-15T05:13:55.553290573Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 15 05:13:55.553437 containerd[1607]: time="2025-07-15T05:13:55.553416703Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 15 05:13:55.564405 containerd[1607]: time="2025-07-15T05:13:55.563996327Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 
15 05:13:55.564405 containerd[1607]: time="2025-07-15T05:13:55.564062348Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 05:13:55.564405 containerd[1607]: time="2025-07-15T05:13:55.564075158Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 15 05:13:55.564405 containerd[1607]: time="2025-07-15T05:13:55.564106918Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 15 05:13:55.565361 containerd[1607]: time="2025-07-15T05:13:55.564362788Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 15 05:13:55.565361 containerd[1607]: time="2025-07-15T05:13:55.564658298Z" level=info msg="metadata content store policy set" policy=shared Jul 15 05:13:55.570489 coreos-metadata[1651]: Jul 15 05:13:55.570 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jul 15 05:13:55.573370 coreos-metadata[1651]: Jul 15 05:13:55.573 INFO Fetch successful Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574640932Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574710812Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574726612Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574740222Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574789412Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574802402Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574823612Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574836032Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574852802Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574864052Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574874282Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.574888202Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.575007102Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 
Jul 15 05:13:55.575343 containerd[1607]: time="2025-07-15T05:13:55.575023882Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575041262Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575054122Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575066262Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575076382Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575087942Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575097382Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575108202Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575118312Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575128332Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575193962Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 15 05:13:55.575614 containerd[1607]: time="2025-07-15T05:13:55.575209562Z" level=info msg="Start snapshots syncer" Jul 15 05:13:55.576718 containerd[1607]: time="2025-07-15T05:13:55.576425373Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 15 05:13:55.576684 unknown[1651]: wrote ssh authorized keys file for user: core Jul 15 05:13:55.577287 containerd[1607]: time="2025-07-15T05:13:55.577087063Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.577582453Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578422684Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578552324Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578581714Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578589804Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578597604Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578607274Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578618434Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578629844Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578647454Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 15 05:13:55.578725 containerd[1607]: 
time="2025-07-15T05:13:55.578654804Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 15 05:13:55.578725 containerd[1607]: time="2025-07-15T05:13:55.578665744Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579505924Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579525864Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579532914Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579539884Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579545324Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579589544Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579597944Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579610944Z" level=info msg="runtime interface created" Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579615054Z" level=info msg="created NRI interface" Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579620934Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579629344Z" level=info msg="Connect containerd service" Jul 15 05:13:55.579669 containerd[1607]: time="2025-07-15T05:13:55.579646064Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 15 05:13:55.583305 containerd[1607]: time="2025-07-15T05:13:55.581639485Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 15 05:13:55.620074 update-ssh-keys[1660]: Updated "/home/core/.ssh/authorized_keys" Jul 15 05:13:55.621602 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 15 05:13:55.626798 systemd[1]: Finished sshkeys.service. 
Jul 15 05:13:55.688514 sshd_keygen[1601]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 15 05:13:55.693902 locksmithd[1619]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715048660Z" level=info msg="Start subscribing containerd event" Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715098800Z" level=info msg="Start recovering state" Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715202330Z" level=info msg="Start event monitor" Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715216250Z" level=info msg="Start cni network conf syncer for default" Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715235180Z" level=info msg="Start streaming server" Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715243070Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715250040Z" level=info msg="runtime interface starting up..." Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715255960Z" level=info msg="starting plugins..." Jul 15 05:13:55.715543 containerd[1607]: time="2025-07-15T05:13:55.715269480Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 15 05:13:55.715963 containerd[1607]: time="2025-07-15T05:13:55.715941301Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 15 05:13:55.716153 containerd[1607]: time="2025-07-15T05:13:55.716138471Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 15 05:13:55.716828 systemd[1]: Started containerd.service - containerd container runtime. Jul 15 05:13:55.717898 containerd[1607]: time="2025-07-15T05:13:55.717526551Z" level=info msg="containerd successfully booted in 0.219889s" Jul 15 05:13:55.733406 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 15 05:13:55.736296 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 15 05:13:55.752128 systemd[1]: issuegen.service: Deactivated successfully. Jul 15 05:13:55.752448 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 15 05:13:55.756604 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 15 05:13:55.773228 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 15 05:13:55.775654 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 15 05:13:55.777687 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jul 15 05:13:55.777952 systemd[1]: Reached target getty.target - Login Prompts. Jul 15 05:13:55.818568 tar[1586]: linux-amd64/README.md Jul 15 05:13:55.832255 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 15 05:13:56.339548 systemd-networkd[1477]: eth0: Gained IPv6LL Jul 15 05:13:56.340112 systemd-timesyncd[1498]: Network configuration changed, trying to establish connection. Jul 15 05:13:56.342915 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 15 05:13:56.343690 systemd[1]: Reached target network-online.target - Network is Online. Jul 15 05:13:56.346032 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:13:56.349602 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 15 05:13:56.377777 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Jul 15 05:13:57.043522 systemd-networkd[1477]: eth1: Gained IPv6LL Jul 15 05:13:57.044342 systemd-timesyncd[1498]: Network configuration changed, trying to establish connection. Jul 15 05:13:57.149767 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:13:57.150198 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 15 05:13:57.151467 systemd[1]: Startup finished in 2.747s (kernel) + 11.115s (initrd) + 4.475s (userspace) = 18.339s. Jul 15 05:13:57.155971 (kubelet)[1714]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:13:57.635810 kubelet[1714]: E0715 05:13:57.635724 1714 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:13:57.638961 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:13:57.639136 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:13:57.639627 systemd[1]: kubelet.service: Consumed 798ms CPU time, 266.2M memory peak. Jul 15 05:13:59.331752 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 15 05:13:59.334958 systemd[1]: Started sshd@0-95.217.135.169:22-139.178.89.65:49674.service - OpenSSH per-connection server daemon (139.178.89.65:49674). Jul 15 05:14:00.354480 sshd[1725]: Accepted publickey for core from 139.178.89.65 port 49674 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:14:00.356724 sshd-session[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:00.369876 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 15 05:14:00.372203 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 15 05:14:00.376606 systemd-logind[1575]: New session 1 of user core. Jul 15 05:14:00.389204 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 15 05:14:00.392040 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 15 05:14:00.416572 (systemd)[1730]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 15 05:14:00.420553 systemd-logind[1575]: New session c1 of user core. Jul 15 05:14:00.567302 systemd[1730]: Queued start job for default target default.target. Jul 15 05:14:00.573665 systemd[1730]: Created slice app.slice - User Application Slice. Jul 15 05:14:00.573689 systemd[1730]: Reached target paths.target - Paths. Jul 15 05:14:00.573725 systemd[1730]: Reached target timers.target - Timers. Jul 15 05:14:00.575131 systemd[1730]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 15 05:14:00.609014 systemd[1730]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 15 05:14:00.609234 systemd[1730]: Reached target sockets.target - Sockets. Jul 15 05:14:00.609327 systemd[1730]: Reached target basic.target - Basic System. Jul 15 05:14:00.609492 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 15 05:14:00.609514 systemd[1730]: Reached target default.target - Main User Target. Jul 15 05:14:00.609584 systemd[1730]: Startup finished in 178ms. Jul 15 05:14:00.617511 systemd[1]: Started session-1.scope - Session 1 of User core. 
Jul 15 05:14:01.310729 systemd[1]: Started sshd@1-95.217.135.169:22-139.178.89.65:49684.service - OpenSSH per-connection server daemon (139.178.89.65:49684). Jul 15 05:14:02.293215 sshd[1741]: Accepted publickey for core from 139.178.89.65 port 49684 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:14:02.297501 sshd-session[1741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:02.306270 systemd-logind[1575]: New session 2 of user core. Jul 15 05:14:02.316702 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 15 05:14:02.972886 sshd[1744]: Connection closed by 139.178.89.65 port 49684 Jul 15 05:14:02.973604 sshd-session[1741]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:02.977812 systemd-logind[1575]: Session 2 logged out. Waiting for processes to exit. Jul 15 05:14:02.978411 systemd[1]: sshd@1-95.217.135.169:22-139.178.89.65:49684.service: Deactivated successfully. Jul 15 05:14:02.980165 systemd[1]: session-2.scope: Deactivated successfully. Jul 15 05:14:02.982412 systemd-logind[1575]: Removed session 2. Jul 15 05:14:03.151945 systemd[1]: Started sshd@2-95.217.135.169:22-139.178.89.65:49698.service - OpenSSH per-connection server daemon (139.178.89.65:49698). Jul 15 05:14:04.128501 sshd[1750]: Accepted publickey for core from 139.178.89.65 port 49698 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:14:04.131454 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:04.138620 systemd-logind[1575]: New session 3 of user core. Jul 15 05:14:04.154677 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 15 05:14:04.798846 sshd[1753]: Connection closed by 139.178.89.65 port 49698 Jul 15 05:14:04.799862 sshd-session[1750]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:04.804030 systemd[1]: sshd@2-95.217.135.169:22-139.178.89.65:49698.service: Deactivated successfully. Jul 15 05:14:04.806583 systemd[1]: session-3.scope: Deactivated successfully. Jul 15 05:14:04.808247 systemd-logind[1575]: Session 3 logged out. Waiting for processes to exit. Jul 15 05:14:04.811269 systemd-logind[1575]: Removed session 3. Jul 15 05:14:04.966474 systemd[1]: Started sshd@3-95.217.135.169:22-139.178.89.65:49704.service - OpenSSH per-connection server daemon (139.178.89.65:49704). Jul 15 05:14:05.955212 sshd[1759]: Accepted publickey for core from 139.178.89.65 port 49704 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:14:05.957204 sshd-session[1759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:05.963092 systemd-logind[1575]: New session 4 of user core. Jul 15 05:14:05.970625 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 15 05:14:06.628778 sshd[1762]: Connection closed by 139.178.89.65 port 49704 Jul 15 05:14:06.629773 sshd-session[1759]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:06.636744 systemd-logind[1575]: Session 4 logged out. Waiting for processes to exit. Jul 15 05:14:06.637936 systemd[1]: sshd@3-95.217.135.169:22-139.178.89.65:49704.service: Deactivated successfully. Jul 15 05:14:06.641138 systemd[1]: session-4.scope: Deactivated successfully. Jul 15 05:14:06.643759 systemd-logind[1575]: Removed session 4. Jul 15 05:14:06.802849 systemd[1]: Started sshd@4-95.217.135.169:22-139.178.89.65:49710.service - OpenSSH per-connection server daemon (139.178.89.65:49710). 
Jul 15 05:14:07.777615 sshd[1768]: Accepted publickey for core from 139.178.89.65 port 49710 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:14:07.779602 sshd-session[1768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:07.780593 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 15 05:14:07.783284 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:14:07.788619 systemd-logind[1575]: New session 5 of user core. Jul 15 05:14:07.791795 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 15 05:14:07.931300 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:14:07.936621 (kubelet)[1780]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:14:07.967538 kubelet[1780]: E0715 05:14:07.967496 1780 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:14:07.971597 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:14:07.971751 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:14:07.972020 systemd[1]: kubelet.service: Consumed 157ms CPU time, 108.9M memory peak. Jul 15 05:14:08.311065 sudo[1787]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 15 05:14:08.311692 sudo[1787]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:14:08.329557 sudo[1787]: pam_unix(sudo:session): session closed for user root Jul 15 05:14:08.488268 sshd[1774]: Connection closed by 139.178.89.65 port 49710 Jul 15 05:14:08.488956 sshd-session[1768]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:08.493199 systemd-logind[1575]: Session 5 logged out. Waiting for processes to exit. Jul 15 05:14:08.493916 systemd[1]: sshd@4-95.217.135.169:22-139.178.89.65:49710.service: Deactivated successfully. Jul 15 05:14:08.497274 systemd[1]: session-5.scope: Deactivated successfully. Jul 15 05:14:08.499687 systemd-logind[1575]: Removed session 5. Jul 15 05:14:08.663639 systemd[1]: Started sshd@5-95.217.135.169:22-139.178.89.65:41884.service - OpenSSH per-connection server daemon (139.178.89.65:41884). Jul 15 05:14:09.665436 sshd[1793]: Accepted publickey for core from 139.178.89.65 port 41884 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:14:09.667904 sshd-session[1793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:09.675441 systemd-logind[1575]: New session 6 of user core. Jul 15 05:14:09.679628 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jul 15 05:14:10.187896 sudo[1798]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 15 05:14:10.188497 sudo[1798]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:14:10.196209 sudo[1798]: pam_unix(sudo:session): session closed for user root Jul 15 05:14:10.206351 sudo[1797]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 15 05:14:10.206975 sudo[1797]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:14:10.225009 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 05:14:10.286802 augenrules[1820]: No rules Jul 15 05:14:10.288807 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 05:14:10.289156 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 05:14:10.292133 sudo[1797]: pam_unix(sudo:session): session closed for user root Jul 15 05:14:10.450283 sshd[1796]: Connection closed by 139.178.89.65 port 41884 Jul 15 05:14:10.451056 sshd-session[1793]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:10.455611 systemd-logind[1575]: Session 6 logged out. Waiting for processes to exit. Jul 15 05:14:10.455906 systemd[1]: sshd@5-95.217.135.169:22-139.178.89.65:41884.service: Deactivated successfully. Jul 15 05:14:10.457824 systemd[1]: session-6.scope: Deactivated successfully. Jul 15 05:14:10.459596 systemd-logind[1575]: Removed session 6. Jul 15 05:14:10.618324 systemd[1]: Started sshd@6-95.217.135.169:22-139.178.89.65:41896.service - OpenSSH per-connection server daemon (139.178.89.65:41896). Jul 15 05:14:11.591930 sshd[1829]: Accepted publickey for core from 139.178.89.65 port 41896 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:14:11.594923 sshd-session[1829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:14:11.602016 systemd-logind[1575]: New session 7 of user core. Jul 15 05:14:11.612637 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 15 05:14:12.118475 sudo[1833]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 15 05:14:12.119217 sudo[1833]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:14:12.402211 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 15 05:14:12.415745 (dockerd)[1851]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 15 05:14:12.586344 dockerd[1851]: time="2025-07-15T05:14:12.586267867Z" level=info msg="Starting up" Jul 15 05:14:12.589390 dockerd[1851]: time="2025-07-15T05:14:12.589319138Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 15 05:14:12.607575 dockerd[1851]: time="2025-07-15T05:14:12.607455476Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jul 15 05:14:12.626924 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport584472821-merged.mount: Deactivated successfully. Jul 15 05:14:12.661855 dockerd[1851]: time="2025-07-15T05:14:12.661695788Z" level=info msg="Loading containers: start." Jul 15 05:14:12.678742 kernel: Initializing XFRM netlink socket Jul 15 05:14:12.877767 systemd-timesyncd[1498]: Network configuration changed, trying to establish connection. 
Jul 15 05:14:13.890659 systemd-resolved[1479]: Clock change detected. Flushing caches. Jul 15 05:14:13.890950 systemd-timesyncd[1498]: Contacted time server 51.75.67.47:123 (2.flatcar.pool.ntp.org). Jul 15 05:14:13.891046 systemd-timesyncd[1498]: Initial clock synchronization to Tue 2025-07-15 05:14:13.889608 UTC. Jul 15 05:14:13.902309 systemd-networkd[1477]: docker0: Link UP Jul 15 05:14:13.905782 dockerd[1851]: time="2025-07-15T05:14:13.905743864Z" level=info msg="Loading containers: done." Jul 15 05:14:13.918035 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck230211217-merged.mount: Deactivated successfully. Jul 15 05:14:13.921833 dockerd[1851]: time="2025-07-15T05:14:13.921781391Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 15 05:14:13.921920 dockerd[1851]: time="2025-07-15T05:14:13.921866351Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jul 15 05:14:13.921967 dockerd[1851]: time="2025-07-15T05:14:13.921949391Z" level=info msg="Initializing buildkit" Jul 15 05:14:13.943238 dockerd[1851]: time="2025-07-15T05:14:13.943183430Z" level=info msg="Completed buildkit initialization" Jul 15 05:14:13.951604 dockerd[1851]: time="2025-07-15T05:14:13.951562963Z" level=info msg="Daemon has completed initialization" Jul 15 05:14:13.951823 dockerd[1851]: time="2025-07-15T05:14:13.951620104Z" level=info msg="API listen on /run/docker.sock" Jul 15 05:14:13.951903 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 15 05:14:14.665719 containerd[1607]: time="2025-07-15T05:14:14.665289461Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.2\"" Jul 15 05:14:15.312486 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount897378708.mount: Deactivated successfully. 
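Once dockerd reports "API listen on /run/docker.sock", the daemon can be probed over that Unix socket. A small stdlib-only sketch, assuming the socket path from the log line above and the Engine API's GET /_ping liveness endpoint (the exact response body is not shown in this log):

```python
import http.client
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """HTTP over the Unix domain socket dockerd logged above."""

    def __init__(self, path: str):
        super().__init__("localhost")
        self._path = path

    def connect(self):
        sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        sock.connect(self._path)
        self.sock = sock

conn = UnixHTTPConnection("/run/docker.sock")
conn.request("GET", "/_ping")               # Engine API liveness check
resp = conn.getresponse()
print(resp.status, resp.read().decode())    # typically: 200 OK
```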
Jul 15 05:14:16.240194 containerd[1607]: time="2025-07-15T05:14:16.240137527Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:16.241535 containerd[1607]: time="2025-07-15T05:14:16.241315607Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.2: active requests=0, bytes read=30079193" Jul 15 05:14:16.242546 containerd[1607]: time="2025-07-15T05:14:16.242517578Z" level=info msg="ImageCreate event name:\"sha256:ee794efa53d856b7e291320be3cd6390fa2e113c3f258a21290bc27fc214233e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:16.244554 containerd[1607]: time="2025-07-15T05:14:16.244529018Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e8ae58675899e946fabe38425f2b3bfd33120b7930d05b5898de97c81a7f6137\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:16.245214 containerd[1607]: time="2025-07-15T05:14:16.245186919Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.2\" with image id \"sha256:ee794efa53d856b7e291320be3cd6390fa2e113c3f258a21290bc27fc214233e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e8ae58675899e946fabe38425f2b3bfd33120b7930d05b5898de97c81a7f6137\", size \"30075899\" in 1.579850348s" Jul 15 05:14:16.245251 containerd[1607]: time="2025-07-15T05:14:16.245217129Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.2\" returns image reference \"sha256:ee794efa53d856b7e291320be3cd6390fa2e113c3f258a21290bc27fc214233e\"" Jul 15 05:14:16.246130 containerd[1607]: time="2025-07-15T05:14:16.246113469Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.2\"" Jul 15 05:14:17.524130 containerd[1607]: time="2025-07-15T05:14:17.524071841Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:17.525193 containerd[1607]: time="2025-07-15T05:14:17.525067802Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.2: active requests=0, bytes read=26018968" Jul 15 05:14:17.525899 containerd[1607]: time="2025-07-15T05:14:17.525879922Z" level=info msg="ImageCreate event name:\"sha256:ff4f56c76b82d6cda0555115a0fe479d5dd612264b85efb9cc14b1b4b937bdf2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:17.527859 containerd[1607]: time="2025-07-15T05:14:17.527842093Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:2236e72a4be5dcc9c04600353ff8849db1557f5364947c520ff05471ae719081\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:17.528383 containerd[1607]: time="2025-07-15T05:14:17.528366793Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.2\" with image id \"sha256:ff4f56c76b82d6cda0555115a0fe479d5dd612264b85efb9cc14b1b4b937bdf2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:2236e72a4be5dcc9c04600353ff8849db1557f5364947c520ff05471ae719081\", size \"27646507\" in 1.282233844s" Jul 15 05:14:17.528448 containerd[1607]: time="2025-07-15T05:14:17.528437793Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.2\" returns image reference \"sha256:ff4f56c76b82d6cda0555115a0fe479d5dd612264b85efb9cc14b1b4b937bdf2\"" Jul 15 05:14:17.528805 
containerd[1607]: time="2025-07-15T05:14:17.528784923Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.2\"" Jul 15 05:14:18.613031 containerd[1607]: time="2025-07-15T05:14:18.612972855Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:18.614037 containerd[1607]: time="2025-07-15T05:14:18.614001885Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.2: active requests=0, bytes read=20155077" Jul 15 05:14:18.615117 containerd[1607]: time="2025-07-15T05:14:18.615084096Z" level=info msg="ImageCreate event name:\"sha256:cfed1ff7489289d4e8d796b0d95fd251990403510563cf843912f42ab9718a7b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:18.617192 containerd[1607]: time="2025-07-15T05:14:18.617174567Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:304c28303133be7d927973bc9bd6c83945b3735c59d283c25b63d5b9ed53bca3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:18.617771 containerd[1607]: time="2025-07-15T05:14:18.617752557Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.2\" with image id \"sha256:cfed1ff7489289d4e8d796b0d95fd251990403510563cf843912f42ab9718a7b\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:304c28303133be7d927973bc9bd6c83945b3735c59d283c25b63d5b9ed53bca3\", size \"21782634\" in 1.088944064s" Jul 15 05:14:18.618071 containerd[1607]: time="2025-07-15T05:14:18.617824457Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.2\" returns image reference \"sha256:cfed1ff7489289d4e8d796b0d95fd251990403510563cf843912f42ab9718a7b\"" Jul 15 05:14:18.618306 containerd[1607]: time="2025-07-15T05:14:18.618184997Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.2\"" Jul 15 05:14:19.209556 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 15 05:14:19.212174 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:14:19.366807 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:14:19.375015 (kubelet)[2134]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:14:19.418153 kubelet[2134]: E0715 05:14:19.418115 2134 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:14:19.424044 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:14:19.424193 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:14:19.424718 systemd[1]: kubelet.service: Consumed 156ms CPU time, 110.4M memory peak. Jul 15 05:14:19.607352 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount833379893.mount: Deactivated successfully. 
Jul 15 05:14:19.916127 containerd[1607]: time="2025-07-15T05:14:19.915899968Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:19.916786 containerd[1607]: time="2025-07-15T05:14:19.916757248Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.2: active requests=0, bytes read=31892774" Jul 15 05:14:19.917395 containerd[1607]: time="2025-07-15T05:14:19.917348978Z" level=info msg="ImageCreate event name:\"sha256:661d404f36f01cd854403fd3540f18dcf0342d22bd9c6516bb9de234ac183b19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:19.918916 containerd[1607]: time="2025-07-15T05:14:19.918845669Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4796ef3e43efa5ed2a5b015c18f81d3c2fe3aea36f555ea643cc01827eb65e51\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:19.919349 containerd[1607]: time="2025-07-15T05:14:19.919315189Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.2\" with image id \"sha256:661d404f36f01cd854403fd3540f18dcf0342d22bd9c6516bb9de234ac183b19\", repo tag \"registry.k8s.io/kube-proxy:v1.33.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:4796ef3e43efa5ed2a5b015c18f81d3c2fe3aea36f555ea643cc01827eb65e51\", size \"31891765\" in 1.300931162s" Jul 15 05:14:19.919349 containerd[1607]: time="2025-07-15T05:14:19.919342669Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.2\" returns image reference \"sha256:661d404f36f01cd854403fd3540f18dcf0342d22bd9c6516bb9de234ac183b19\"" Jul 15 05:14:19.919913 containerd[1607]: time="2025-07-15T05:14:19.919850439Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jul 15 05:14:20.416622 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2900891397.mount: Deactivated successfully. 
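Each "Pulled image" entry above pairs the bytes read with the wall-clock pull time, so effective pull throughput falls out directly: kube-proxy moved 31,892,774 bytes in about 1.30 s, roughly 23 MiB/s. A short sketch of that arithmetic using only figures quoted in the log lines above:

```python
# (bytes read, pull seconds) copied from the containerd log lines above.
pulls = {
    "registry.k8s.io/kube-proxy:v1.33.2":     (31_892_774, 1.300931162),
    "registry.k8s.io/kube-scheduler:v1.33.2": (20_155_077, 1.088944064),
    "registry.k8s.io/kube-apiserver:v1.33.2": (30_079_193, 1.579850348),
}

for image, (bytes_read, seconds) in pulls.items():
    mib_per_s = bytes_read / seconds / (1024 * 1024)
    print(f"{image}: {mib_per_s:.1f} MiB/s")
```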
Jul 15 05:14:21.185408 containerd[1607]: time="2025-07-15T05:14:21.185337586Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:21.186530 containerd[1607]: time="2025-07-15T05:14:21.186373747Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942332" Jul 15 05:14:21.187641 containerd[1607]: time="2025-07-15T05:14:21.187613827Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:21.189850 containerd[1607]: time="2025-07-15T05:14:21.189824528Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:21.191816 containerd[1607]: time="2025-07-15T05:14:21.191796499Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.27191553s" Jul 15 05:14:21.192218 containerd[1607]: time="2025-07-15T05:14:21.191886749Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Jul 15 05:14:21.192937 containerd[1607]: time="2025-07-15T05:14:21.192915109Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 15 05:14:21.630131 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount620777022.mount: Deactivated successfully. 
Jul 15 05:14:21.635827 containerd[1607]: time="2025-07-15T05:14:21.635765014Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:14:21.636574 containerd[1607]: time="2025-07-15T05:14:21.636553594Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Jul 15 05:14:21.637688 containerd[1607]: time="2025-07-15T05:14:21.637159514Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:14:21.638786 containerd[1607]: time="2025-07-15T05:14:21.638751955Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:14:21.639265 containerd[1607]: time="2025-07-15T05:14:21.639249085Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 446.306756ms" Jul 15 05:14:21.639321 containerd[1607]: time="2025-07-15T05:14:21.639311375Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jul 15 05:14:21.640591 containerd[1607]: time="2025-07-15T05:14:21.640554176Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jul 15 05:14:22.069555 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount588552075.mount: Deactivated successfully. 
Jul 15 05:14:23.547480 containerd[1607]: time="2025-07-15T05:14:23.547436010Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:23.548584 containerd[1607]: time="2025-07-15T05:14:23.548361660Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58247215" Jul 15 05:14:23.549239 containerd[1607]: time="2025-07-15T05:14:23.549213031Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:23.551263 containerd[1607]: time="2025-07-15T05:14:23.551235792Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:23.551882 containerd[1607]: time="2025-07-15T05:14:23.551856712Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 1.911266166s" Jul 15 05:14:23.551936 containerd[1607]: time="2025-07-15T05:14:23.551882822Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Jul 15 05:14:26.258039 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:14:26.258189 systemd[1]: kubelet.service: Consumed 156ms CPU time, 110.4M memory peak. Jul 15 05:14:26.260146 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:14:26.280117 systemd[1]: Reload requested from client PID 2285 ('systemctl') (unit session-7.scope)... Jul 15 05:14:26.280221 systemd[1]: Reloading... Jul 15 05:14:26.394038 zram_generator::config[2329]: No configuration found. Jul 15 05:14:26.466477 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:14:26.560040 systemd[1]: Reloading finished in 279 ms. Jul 15 05:14:26.613757 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 15 05:14:26.613837 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 15 05:14:26.614183 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:14:26.614242 systemd[1]: kubelet.service: Consumed 114ms CPU time, 97.7M memory peak. Jul 15 05:14:26.616328 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:14:26.759227 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:14:26.771344 (kubelet)[2383]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 05:14:26.806699 kubelet[2383]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:14:26.806699 kubelet[2383]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. Jul 15 05:14:26.806699 kubelet[2383]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:14:26.806699 kubelet[2383]: I0715 05:14:26.806554 2383 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 05:14:27.106253 kubelet[2383]: I0715 05:14:27.105911 2383 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jul 15 05:14:27.106253 kubelet[2383]: I0715 05:14:27.105939 2383 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 05:14:27.107177 kubelet[2383]: I0715 05:14:27.107161 2383 server.go:956] "Client rotation is on, will bootstrap in background" Jul 15 05:14:27.143717 kubelet[2383]: E0715 05:14:27.143640 2383 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://95.217.135.169:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 95.217.135.169:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jul 15 05:14:27.143717 kubelet[2383]: I0715 05:14:27.143723 2383 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 05:14:27.153440 kubelet[2383]: I0715 05:14:27.153399 2383 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 05:14:27.159489 kubelet[2383]: I0715 05:14:27.159450 2383 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 15 05:14:27.161339 kubelet[2383]: I0715 05:14:27.161293 2383 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 05:14:27.163793 kubelet[2383]: I0715 05:14:27.161327 2383 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396-0-0-n-85c8113064","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 05:14:27.163793 kubelet[2383]: I0715 05:14:27.163775 2383 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 05:14:27.163793 kubelet[2383]: I0715 05:14:27.163787 2383 container_manager_linux.go:303] "Creating device plugin manager" Jul 15 05:14:27.165057 kubelet[2383]: I0715 05:14:27.165007 2383 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:14:27.170495 kubelet[2383]: I0715 05:14:27.170431 2383 kubelet.go:480] "Attempting to sync node with API server" Jul 15 05:14:27.170495 kubelet[2383]: I0715 05:14:27.170477 2383 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 05:14:27.170495 kubelet[2383]: I0715 05:14:27.170510 2383 kubelet.go:386] "Adding apiserver pod source" Jul 15 05:14:27.170686 kubelet[2383]: I0715 05:14:27.170529 2383 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 05:14:27.177789 kubelet[2383]: E0715 05:14:27.177321 2383 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://95.217.135.169:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-85c8113064&limit=500&resourceVersion=0\": dial tcp 95.217.135.169:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jul 15 05:14:27.177789 kubelet[2383]: E0715 05:14:27.177740 2383 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://95.217.135.169:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 95.217.135.169:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jul 15 05:14:27.179487 kubelet[2383]: I0715 05:14:27.179455 2383 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 15 05:14:27.180185 kubelet[2383]: I0715 05:14:27.179966 2383 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jul 15 05:14:27.180689 kubelet[2383]: W0715 05:14:27.180653 2383 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 15 05:14:27.183519 kubelet[2383]: I0715 05:14:27.183494 2383 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 15 05:14:27.183578 kubelet[2383]: I0715 05:14:27.183561 2383 server.go:1289] "Started kubelet" Jul 15 05:14:27.186139 kubelet[2383]: I0715 05:14:27.185989 2383 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 05:14:27.193800 kubelet[2383]: E0715 05:14:27.189492 2383 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://95.217.135.169:6443/api/v1/namespaces/default/events\": dial tcp 95.217.135.169:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4396-0-0-n-85c8113064.185254d24b497bd0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4396-0-0-n-85c8113064,UID:ci-4396-0-0-n-85c8113064,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4396-0-0-n-85c8113064,},FirstTimestamp:2025-07-15 05:14:27.183516624 +0000 UTC m=+0.408184521,LastTimestamp:2025-07-15 05:14:27.183516624 +0000 UTC m=+0.408184521,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396-0-0-n-85c8113064,}" Jul 15 05:14:27.193800 kubelet[2383]: I0715 05:14:27.193612 2383 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 05:14:27.194875 kubelet[2383]: I0715 05:14:27.194862 2383 server.go:317] "Adding debug handlers to kubelet server" Jul 15 05:14:27.196536 kubelet[2383]: I0715 05:14:27.196522 2383 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 15 05:14:27.196855 kubelet[2383]: E0715 05:14:27.196828 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:27.197407 kubelet[2383]: I0715 05:14:27.197384 2383 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 15 05:14:27.197503 kubelet[2383]: I0715 05:14:27.197484 2383 reconciler.go:26] "Reconciler: start to sync state" Jul 15 05:14:27.203744 kubelet[2383]: I0715 05:14:27.203136 2383 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 15 05:14:27.203744 kubelet[2383]: I0715 05:14:27.203405 2383 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 05:14:27.203899 kubelet[2383]: E0715 05:14:27.203875 2383 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://95.217.135.169:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 95.217.135.169:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jul 15 
05:14:27.203966 kubelet[2383]: E0715 05:14:27.203939 2383 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.135.169:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-85c8113064?timeout=10s\": dial tcp 95.217.135.169:6443: connect: connection refused" interval="200ms" Jul 15 05:14:27.203999 kubelet[2383]: I0715 05:14:27.203979 2383 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 05:14:27.206573 kubelet[2383]: I0715 05:14:27.206547 2383 factory.go:223] Registration of the systemd container factory successfully Jul 15 05:14:27.206714 kubelet[2383]: I0715 05:14:27.206647 2383 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 05:14:27.208935 kubelet[2383]: I0715 05:14:27.208924 2383 factory.go:223] Registration of the containerd container factory successfully Jul 15 05:14:27.215166 kubelet[2383]: I0715 05:14:27.214931 2383 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jul 15 05:14:27.216474 kubelet[2383]: I0715 05:14:27.216273 2383 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jul 15 05:14:27.216474 kubelet[2383]: I0715 05:14:27.216286 2383 status_manager.go:230] "Starting to sync pod status with apiserver" Jul 15 05:14:27.216474 kubelet[2383]: I0715 05:14:27.216303 2383 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jul 15 05:14:27.216474 kubelet[2383]: I0715 05:14:27.216310 2383 kubelet.go:2436] "Starting kubelet main sync loop" Jul 15 05:14:27.216474 kubelet[2383]: E0715 05:14:27.216341 2383 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 05:14:27.224114 kubelet[2383]: E0715 05:14:27.224090 2383 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://95.217.135.169:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 95.217.135.169:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jul 15 05:14:27.235306 kubelet[2383]: E0715 05:14:27.235280 2383 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 15 05:14:27.238175 kubelet[2383]: I0715 05:14:27.238156 2383 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 15 05:14:27.238240 kubelet[2383]: I0715 05:14:27.238170 2383 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 15 05:14:27.238240 kubelet[2383]: I0715 05:14:27.238215 2383 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:14:27.240345 kubelet[2383]: I0715 05:14:27.240323 2383 policy_none.go:49] "None policy: Start" Jul 15 05:14:27.240394 kubelet[2383]: I0715 05:14:27.240345 2383 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 15 05:14:27.240394 kubelet[2383]: I0715 05:14:27.240365 2383 state_mem.go:35] "Initializing new in-memory state store" Jul 15 05:14:27.245403 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
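The container-manager NodeConfig dumped above carries the kubelet's hard eviction thresholds: memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, imagefs.inodesFree < 5%. A rough sketch of how such thresholds translate into a pressure decision; the thresholds are copied from the logged config, while the sample node statistics below are invented purely for illustration:

```python
# Thresholds as logged in the NodeConfig above; node stats passed in
# below are illustrative values, not taken from this machine.
HARD_EVICTION = {
    "memory.available":   ("quantity", 100 * 1024 * 1024),  # 100Mi
    "nodefs.available":   ("percentage", 0.10),
    "nodefs.inodesFree":  ("percentage", 0.05),
    "imagefs.available":  ("percentage", 0.15),
    "imagefs.inodesFree": ("percentage", 0.05),
}

def under_pressure(signal: str, available: float, capacity: float) -> bool:
    kind, threshold = HARD_EVICTION[signal]
    limit = threshold if kind == "quantity" else threshold * capacity
    return available < limit

# Example: 2 GiB free of 8 GiB RAM is comfortably above the 100Mi floor.
print(under_pressure("memory.available", 2 * 1024**3, 8 * 1024**3))  # False
```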
Jul 15 05:14:27.266815 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jul 15 05:14:27.288795 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 15 05:14:27.294242 kubelet[2383]: E0715 05:14:27.292996 2383 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jul 15 05:14:27.294242 kubelet[2383]: I0715 05:14:27.293202 2383 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 05:14:27.294242 kubelet[2383]: I0715 05:14:27.293212 2383 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 05:14:27.294242 kubelet[2383]: I0715 05:14:27.293425 2383 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 05:14:27.294942 kubelet[2383]: E0715 05:14:27.294922 2383 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jul 15 05:14:27.295018 kubelet[2383]: E0715 05:14:27.294987 2383 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:27.332309 systemd[1]: Created slice kubepods-burstable-pod4efbf38801bb6c8a1de499bd6b9057eb.slice - libcontainer container kubepods-burstable-pod4efbf38801bb6c8a1de499bd6b9057eb.slice. Jul 15 05:14:27.344570 kubelet[2383]: E0715 05:14:27.344523 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.347379 systemd[1]: Created slice kubepods-burstable-poda8fef8d5637ab08c2a1ac228075d8235.slice - libcontainer container kubepods-burstable-poda8fef8d5637ab08c2a1ac228075d8235.slice. Jul 15 05:14:27.349373 kubelet[2383]: E0715 05:14:27.349354 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.351148 systemd[1]: Created slice kubepods-burstable-pod4ae0925fe7a00d504064ac2dbebf8857.slice - libcontainer container kubepods-burstable-pod4ae0925fe7a00d504064ac2dbebf8857.slice. 
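With the systemd cgroup driver, the slices created above follow a predictable naming scheme: a per-QoS parent (kubepods-besteffort.slice, kubepods-burstable.slice) under kubepods.slice, plus one child slice per pod derived from the pod UID, e.g. kubepods-burstable-pod4efbf38801bb6c8a1de499bd6b9057eb.slice. A small sketch of that mapping; the dash-to-underscore escaping for UIDs that contain dashes is an assumption based on systemd unit-name rules and is not visible in this log, where the static-pod UIDs are plain hex:

```python
def pod_slice(qos_class: str, pod_uid: str) -> str:
    """Build the systemd slice name used for a pod's cgroup.

    qos_class: "" (guaranteed pods sit directly under kubepods.slice),
    "besteffort", or "burstable". Dashes in the UID are escaped to
    underscores for systemd unit names (assumption, see lead-in).
    """
    uid = pod_uid.replace("-", "_")
    prefix = f"kubepods-{qos_class}" if qos_class else "kubepods"
    return f"{prefix}-pod{uid}.slice"

# Matches the burstable slice created in the log above.
print(pod_slice("burstable", "4efbf38801bb6c8a1de499bd6b9057eb"))
```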
Jul 15 05:14:27.352640 kubelet[2383]: E0715 05:14:27.352614 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.395308 kubelet[2383]: I0715 05:14:27.395166 2383 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.396097 kubelet[2383]: E0715 05:14:27.395467 2383 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.217.135.169:6443/api/v1/nodes\": dial tcp 95.217.135.169:6443: connect: connection refused" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.398760 kubelet[2383]: I0715 05:14:27.398742 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399001 kubelet[2383]: I0715 05:14:27.398824 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4ae0925fe7a00d504064ac2dbebf8857-kubeconfig\") pod \"kube-scheduler-ci-4396-0-0-n-85c8113064\" (UID: \"4ae0925fe7a00d504064ac2dbebf8857\") " pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399001 kubelet[2383]: I0715 05:14:27.398839 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-ca-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399001 kubelet[2383]: I0715 05:14:27.398890 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4efbf38801bb6c8a1de499bd6b9057eb-ca-certs\") pod \"kube-apiserver-ci-4396-0-0-n-85c8113064\" (UID: \"4efbf38801bb6c8a1de499bd6b9057eb\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399001 kubelet[2383]: I0715 05:14:27.398905 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4efbf38801bb6c8a1de499bd6b9057eb-k8s-certs\") pod \"kube-apiserver-ci-4396-0-0-n-85c8113064\" (UID: \"4efbf38801bb6c8a1de499bd6b9057eb\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399001 kubelet[2383]: I0715 05:14:27.398917 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4efbf38801bb6c8a1de499bd6b9057eb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396-0-0-n-85c8113064\" (UID: \"4efbf38801bb6c8a1de499bd6b9057eb\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399103 kubelet[2383]: I0715 05:14:27.398928 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399103 kubelet[2383]: I0715 05:14:27.398970 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-k8s-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.399103 kubelet[2383]: I0715 05:14:27.398982 2383 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-kubeconfig\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.404511 kubelet[2383]: E0715 05:14:27.404472 2383 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.135.169:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-85c8113064?timeout=10s\": dial tcp 95.217.135.169:6443: connect: connection refused" interval="400ms" Jul 15 05:14:27.597927 kubelet[2383]: I0715 05:14:27.597895 2383 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.598231 kubelet[2383]: E0715 05:14:27.598203 2383 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.217.135.169:6443/api/v1/nodes\": dial tcp 95.217.135.169:6443: connect: connection refused" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:27.647472 containerd[1607]: time="2025-07-15T05:14:27.647294958Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396-0-0-n-85c8113064,Uid:4efbf38801bb6c8a1de499bd6b9057eb,Namespace:kube-system,Attempt:0,}" Jul 15 05:14:27.651947 containerd[1607]: time="2025-07-15T05:14:27.651864309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396-0-0-n-85c8113064,Uid:a8fef8d5637ab08c2a1ac228075d8235,Namespace:kube-system,Attempt:0,}" Jul 15 05:14:27.653557 containerd[1607]: time="2025-07-15T05:14:27.653487170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396-0-0-n-85c8113064,Uid:4ae0925fe7a00d504064ac2dbebf8857,Namespace:kube-system,Attempt:0,}" Jul 15 05:14:27.747075 containerd[1607]: time="2025-07-15T05:14:27.747023719Z" level=info msg="connecting to shim 066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c" address="unix:///run/containerd/s/d7b41dfbe27482c66950563c4c79fb9ae32d5497dd55d996d175714b602275fc" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:27.747910 containerd[1607]: time="2025-07-15T05:14:27.747487319Z" level=info msg="connecting to shim 5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c" address="unix:///run/containerd/s/02ee354128fa85f9cc8880d663c7ac090d23aeb08cc60f8491c98e0906779af6" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:27.754119 containerd[1607]: time="2025-07-15T05:14:27.753897122Z" level=info msg="connecting to shim 8cf322447156298f008f2d7d364816fd49c2b8d91c6dd30ab1c1e3cfedf64576" address="unix:///run/containerd/s/51cc5444b26ab15b44e74c1f9435be2d16ae706529a96338fc8af8b22d9e25c3" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:27.806070 
kubelet[2383]: E0715 05:14:27.806035 2383 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.135.169:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-85c8113064?timeout=10s\": dial tcp 95.217.135.169:6443: connect: connection refused" interval="800ms" Jul 15 05:14:27.816791 systemd[1]: Started cri-containerd-066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c.scope - libcontainer container 066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c. Jul 15 05:14:27.819160 systemd[1]: Started cri-containerd-5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c.scope - libcontainer container 5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c. Jul 15 05:14:27.820964 systemd[1]: Started cri-containerd-8cf322447156298f008f2d7d364816fd49c2b8d91c6dd30ab1c1e3cfedf64576.scope - libcontainer container 8cf322447156298f008f2d7d364816fd49c2b8d91c6dd30ab1c1e3cfedf64576. Jul 15 05:14:27.874776 containerd[1607]: time="2025-07-15T05:14:27.873999542Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396-0-0-n-85c8113064,Uid:a8fef8d5637ab08c2a1ac228075d8235,Namespace:kube-system,Attempt:0,} returns sandbox id \"066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c\"" Jul 15 05:14:27.884889 containerd[1607]: time="2025-07-15T05:14:27.884414806Z" level=info msg="CreateContainer within sandbox \"066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 15 05:14:27.894348 containerd[1607]: time="2025-07-15T05:14:27.894234100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396-0-0-n-85c8113064,Uid:4efbf38801bb6c8a1de499bd6b9057eb,Namespace:kube-system,Attempt:0,} returns sandbox id \"8cf322447156298f008f2d7d364816fd49c2b8d91c6dd30ab1c1e3cfedf64576\"" Jul 15 05:14:27.900483 containerd[1607]: time="2025-07-15T05:14:27.899764263Z" level=info msg="Container 86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:27.902307 containerd[1607]: time="2025-07-15T05:14:27.902167994Z" level=info msg="CreateContainer within sandbox \"8cf322447156298f008f2d7d364816fd49c2b8d91c6dd30ab1c1e3cfedf64576\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 15 05:14:27.919817 containerd[1607]: time="2025-07-15T05:14:27.919721781Z" level=info msg="Container 14219a6ad5b2302db4432d037ca358c00e3d2b58b703a5d2f9bcb55a920c0414: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:27.920556 containerd[1607]: time="2025-07-15T05:14:27.920531091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396-0-0-n-85c8113064,Uid:4ae0925fe7a00d504064ac2dbebf8857,Namespace:kube-system,Attempt:0,} returns sandbox id \"5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c\"" Jul 15 05:14:27.921679 containerd[1607]: time="2025-07-15T05:14:27.921636182Z" level=info msg="CreateContainer within sandbox \"066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d\"" Jul 15 05:14:27.922110 containerd[1607]: time="2025-07-15T05:14:27.922088102Z" level=info msg="StartContainer for \"86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d\"" Jul 15 05:14:27.924056 containerd[1607]: 
time="2025-07-15T05:14:27.924035733Z" level=info msg="connecting to shim 86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d" address="unix:///run/containerd/s/d7b41dfbe27482c66950563c4c79fb9ae32d5497dd55d996d175714b602275fc" protocol=ttrpc version=3 Jul 15 05:14:27.926058 containerd[1607]: time="2025-07-15T05:14:27.925979984Z" level=info msg="CreateContainer within sandbox \"5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 15 05:14:27.927209 containerd[1607]: time="2025-07-15T05:14:27.927190824Z" level=info msg="CreateContainer within sandbox \"8cf322447156298f008f2d7d364816fd49c2b8d91c6dd30ab1c1e3cfedf64576\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"14219a6ad5b2302db4432d037ca358c00e3d2b58b703a5d2f9bcb55a920c0414\"" Jul 15 05:14:27.928701 containerd[1607]: time="2025-07-15T05:14:27.928617695Z" level=info msg="StartContainer for \"14219a6ad5b2302db4432d037ca358c00e3d2b58b703a5d2f9bcb55a920c0414\"" Jul 15 05:14:27.930196 containerd[1607]: time="2025-07-15T05:14:27.930173595Z" level=info msg="connecting to shim 14219a6ad5b2302db4432d037ca358c00e3d2b58b703a5d2f9bcb55a920c0414" address="unix:///run/containerd/s/51cc5444b26ab15b44e74c1f9435be2d16ae706529a96338fc8af8b22d9e25c3" protocol=ttrpc version=3 Jul 15 05:14:27.933002 containerd[1607]: time="2025-07-15T05:14:27.932978827Z" level=info msg="Container c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:27.939260 containerd[1607]: time="2025-07-15T05:14:27.939159899Z" level=info msg="CreateContainer within sandbox \"5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2\"" Jul 15 05:14:27.942007 containerd[1607]: time="2025-07-15T05:14:27.941939520Z" level=info msg="StartContainer for \"c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2\"" Jul 15 05:14:27.943158 containerd[1607]: time="2025-07-15T05:14:27.943117701Z" level=info msg="connecting to shim c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2" address="unix:///run/containerd/s/02ee354128fa85f9cc8880d663c7ac090d23aeb08cc60f8491c98e0906779af6" protocol=ttrpc version=3 Jul 15 05:14:27.944051 systemd[1]: Started cri-containerd-86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d.scope - libcontainer container 86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d. Jul 15 05:14:27.960869 systemd[1]: Started cri-containerd-14219a6ad5b2302db4432d037ca358c00e3d2b58b703a5d2f9bcb55a920c0414.scope - libcontainer container 14219a6ad5b2302db4432d037ca358c00e3d2b58b703a5d2f9bcb55a920c0414. Jul 15 05:14:27.974078 systemd[1]: Started cri-containerd-c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2.scope - libcontainer container c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2. 
Jul 15 05:14:28.001803 kubelet[2383]: I0715 05:14:28.001725 2383 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:28.003822 kubelet[2383]: E0715 05:14:28.003774 2383 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://95.217.135.169:6443/api/v1/nodes\": dial tcp 95.217.135.169:6443: connect: connection refused" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:28.038247 containerd[1607]: time="2025-07-15T05:14:28.038183600Z" level=info msg="StartContainer for \"86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d\" returns successfully" Jul 15 05:14:28.069143 containerd[1607]: time="2025-07-15T05:14:28.069038803Z" level=info msg="StartContainer for \"c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2\" returns successfully" Jul 15 05:14:28.073456 kubelet[2383]: E0715 05:14:28.073307 2383 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://95.217.135.169:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 95.217.135.169:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jul 15 05:14:28.077064 containerd[1607]: time="2025-07-15T05:14:28.076837046Z" level=info msg="StartContainer for \"14219a6ad5b2302db4432d037ca358c00e3d2b58b703a5d2f9bcb55a920c0414\" returns successfully" Jul 15 05:14:28.204066 kubelet[2383]: E0715 05:14:28.203601 2383 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://95.217.135.169:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-85c8113064&limit=500&resourceVersion=0\": dial tcp 95.217.135.169:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jul 15 05:14:28.245545 kubelet[2383]: E0715 05:14:28.245432 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:28.246124 kubelet[2383]: E0715 05:14:28.246093 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:28.253118 kubelet[2383]: E0715 05:14:28.253088 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:28.805573 kubelet[2383]: I0715 05:14:28.805533 2383 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:29.255068 kubelet[2383]: E0715 05:14:29.254861 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:29.255068 kubelet[2383]: E0715 05:14:29.255031 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:29.736640 kubelet[2383]: E0715 05:14:29.736589 2383 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:29.915562 kubelet[2383]: I0715 05:14:29.915518 2383 
kubelet_node_status.go:78] "Successfully registered node" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:29.915562 kubelet[2383]: E0715 05:14:29.915553 2383 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4396-0-0-n-85c8113064\": node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:29.929688 kubelet[2383]: E0715 05:14:29.929609 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.030203 kubelet[2383]: E0715 05:14:30.030031 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.130563 kubelet[2383]: E0715 05:14:30.130511 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.231573 kubelet[2383]: E0715 05:14:30.231508 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.255838 kubelet[2383]: E0715 05:14:30.255795 2383 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-85c8113064\" not found" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:30.332226 kubelet[2383]: E0715 05:14:30.332079 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.433203 kubelet[2383]: E0715 05:14:30.433133 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.533474 kubelet[2383]: E0715 05:14:30.533434 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.634189 kubelet[2383]: E0715 05:14:30.633901 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.734661 kubelet[2383]: E0715 05:14:30.734587 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.835544 kubelet[2383]: E0715 05:14:30.835465 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.936748 kubelet[2383]: E0715 05:14:30.936562 2383 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:30.997863 kubelet[2383]: I0715 05:14:30.997813 2383 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:31.006980 kubelet[2383]: I0715 05:14:31.006930 2383 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:31.010506 kubelet[2383]: I0715 05:14:31.010470 2383 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" Jul 15 05:14:31.181338 kubelet[2383]: I0715 05:14:31.181253 2383 apiserver.go:52] "Watching apiserver" Jul 15 05:14:31.198199 kubelet[2383]: I0715 05:14:31.197885 2383 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 15 05:14:31.725854 systemd[1]: Reload requested from client PID 2661 ('systemctl') (unit 
session-7.scope)... Jul 15 05:14:31.725888 systemd[1]: Reloading... Jul 15 05:14:31.817764 zram_generator::config[2701]: No configuration found. Jul 15 05:14:31.914854 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:14:32.030939 systemd[1]: Reloading finished in 304 ms. Jul 15 05:14:32.063924 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:14:32.071465 systemd[1]: kubelet.service: Deactivated successfully. Jul 15 05:14:32.071758 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:14:32.071809 systemd[1]: kubelet.service: Consumed 762ms CPU time, 125.9M memory peak. Jul 15 05:14:32.073745 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:14:32.244709 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:14:32.252638 (kubelet)[2756]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 05:14:32.310813 kubelet[2756]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:14:32.310813 kubelet[2756]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 15 05:14:32.310813 kubelet[2756]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:14:32.310813 kubelet[2756]: I0715 05:14:32.310647 2756 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 05:14:32.320748 kubelet[2756]: I0715 05:14:32.320077 2756 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jul 15 05:14:32.320748 kubelet[2756]: I0715 05:14:32.320104 2756 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 05:14:32.320748 kubelet[2756]: I0715 05:14:32.320344 2756 server.go:956] "Client rotation is on, will bootstrap in background" Jul 15 05:14:32.321693 kubelet[2756]: I0715 05:14:32.321652 2756 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jul 15 05:14:32.323744 kubelet[2756]: I0715 05:14:32.323722 2756 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 05:14:32.329100 kubelet[2756]: I0715 05:14:32.329073 2756 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 05:14:32.332026 kubelet[2756]: I0715 05:14:32.331980 2756 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 15 05:14:32.332194 kubelet[2756]: I0715 05:14:32.332170 2756 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 05:14:32.332364 kubelet[2756]: I0715 05:14:32.332190 2756 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396-0-0-n-85c8113064","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 05:14:32.332449 kubelet[2756]: I0715 05:14:32.332367 2756 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 05:14:32.332449 kubelet[2756]: I0715 05:14:32.332375 2756 container_manager_linux.go:303] "Creating device plugin manager" Jul 15 05:14:32.332449 kubelet[2756]: I0715 05:14:32.332410 2756 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:14:32.332548 kubelet[2756]: I0715 05:14:32.332533 2756 kubelet.go:480] "Attempting to sync node with API server" Jul 15 05:14:32.332548 kubelet[2756]: I0715 05:14:32.332547 2756 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 05:14:32.332583 kubelet[2756]: I0715 05:14:32.332565 2756 kubelet.go:386] "Adding apiserver pod source" Jul 15 05:14:32.332583 kubelet[2756]: I0715 05:14:32.332579 2756 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 05:14:32.334506 kubelet[2756]: I0715 05:14:32.334485 2756 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 15 05:14:32.335282 kubelet[2756]: I0715 05:14:32.335216 2756 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jul 15 05:14:32.340886 kubelet[2756]: I0715 05:14:32.340867 2756 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 15 05:14:32.341067 kubelet[2756]: I0715 05:14:32.341055 2756 server.go:1289] "Started kubelet" Jul 15 05:14:32.341548 kubelet[2756]: I0715 05:14:32.341369 2756 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 15 
05:14:32.341589 kubelet[2756]: I0715 05:14:32.341565 2756 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 05:14:32.345697 kubelet[2756]: I0715 05:14:32.343733 2756 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 05:14:32.354290 kubelet[2756]: I0715 05:14:32.354248 2756 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 05:14:32.355086 kubelet[2756]: I0715 05:14:32.355049 2756 server.go:317] "Adding debug handlers to kubelet server" Jul 15 05:14:32.355886 kubelet[2756]: I0715 05:14:32.355839 2756 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 05:14:32.357883 kubelet[2756]: I0715 05:14:32.357483 2756 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 15 05:14:32.357883 kubelet[2756]: E0715 05:14:32.357609 2756 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-85c8113064\" not found" Jul 15 05:14:32.359105 kubelet[2756]: I0715 05:14:32.359094 2756 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 15 05:14:32.359232 kubelet[2756]: I0715 05:14:32.359225 2756 reconciler.go:26] "Reconciler: start to sync state" Jul 15 05:14:32.360429 kubelet[2756]: I0715 05:14:32.360396 2756 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 05:14:32.361329 kubelet[2756]: I0715 05:14:32.360713 2756 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jul 15 05:14:32.362267 kubelet[2756]: I0715 05:14:32.362256 2756 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jul 15 05:14:32.362652 kubelet[2756]: I0715 05:14:32.362641 2756 status_manager.go:230] "Starting to sync pod status with apiserver" Jul 15 05:14:32.362728 kubelet[2756]: I0715 05:14:32.362720 2756 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
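
Aside: the HardEvictionThresholds in the container manager config logged above mix an absolute quantity (memory.available < 100Mi) with percentages of capacity (nodefs.available < 10%, imagefs.available < 15%). A minimal sketch of how such a threshold could be evaluated — illustrative only, not the kubelet's eviction manager code, and the node capacities below are made up:

    // evictioncheck.go - hypothetical sketch of evaluating hard eviction
    // thresholds of the shape seen in the kubelet config above.
    package main

    import "fmt"

    // threshold mirrors the logged shape: either an absolute byte quantity
    // or a percentage of capacity.
    type threshold struct {
        signal     string
        quantity   int64   // absolute bytes; 0 means "use percentage"
        percentage float64 // fraction of capacity, e.g. 0.10
    }

    // crossed reports whether the observed available amount falls below the
    // threshold, given the resource's total capacity.
    func (t threshold) crossed(available, capacity int64) bool {
        limit := t.quantity
        if limit == 0 {
            limit = int64(t.percentage * float64(capacity))
        }
        return available < limit
    }

    func main() {
        thresholds := []threshold{
            {signal: "memory.available", quantity: 100 * 1024 * 1024}, // 100Mi
            {signal: "nodefs.available", percentage: 0.10},            // 10%
        }
        // Hypothetical observations: {available, capacity} in bytes.
        obs := map[string][2]int64{
            "memory.available": {80 * 1024 * 1024, 4 << 30},
            "nodefs.available": {10 << 30, 40 << 30},
        }
        for _, t := range thresholds {
            o := obs[t.signal]
            fmt.Printf("%s crossed=%v\n", t.signal, t.crossed(o[0], o[1]))
        }
    }
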
Jul 15 05:14:32.362768 kubelet[2756]: I0715 05:14:32.362762 2756 kubelet.go:2436] "Starting kubelet main sync loop" Jul 15 05:14:32.362838 kubelet[2756]: E0715 05:14:32.362827 2756 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 05:14:32.364997 kubelet[2756]: I0715 05:14:32.364970 2756 factory.go:223] Registration of the containerd container factory successfully Jul 15 05:14:32.364997 kubelet[2756]: I0715 05:14:32.364988 2756 factory.go:223] Registration of the systemd container factory successfully Jul 15 05:14:32.417200 kubelet[2756]: I0715 05:14:32.417169 2756 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 15 05:14:32.417200 kubelet[2756]: I0715 05:14:32.417204 2756 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 15 05:14:32.417383 kubelet[2756]: I0715 05:14:32.417240 2756 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:14:32.417432 kubelet[2756]: I0715 05:14:32.417410 2756 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 15 05:14:32.417483 kubelet[2756]: I0715 05:14:32.417443 2756 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 15 05:14:32.417483 kubelet[2756]: I0715 05:14:32.417477 2756 policy_none.go:49] "None policy: Start" Jul 15 05:14:32.417514 kubelet[2756]: I0715 05:14:32.417485 2756 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 15 05:14:32.417514 kubelet[2756]: I0715 05:14:32.417495 2756 state_mem.go:35] "Initializing new in-memory state store" Jul 15 05:14:32.417633 kubelet[2756]: I0715 05:14:32.417611 2756 state_mem.go:75] "Updated machine memory state" Jul 15 05:14:32.422224 kubelet[2756]: E0715 05:14:32.422126 2756 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jul 15 05:14:32.422431 kubelet[2756]: I0715 05:14:32.422290 2756 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 05:14:32.422431 kubelet[2756]: I0715 05:14:32.422308 2756 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 05:14:32.422846 kubelet[2756]: I0715 05:14:32.422779 2756 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 05:14:32.424125 kubelet[2756]: E0715 05:14:32.423866 2756 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jul 15 05:14:32.464169 kubelet[2756]: I0715 05:14:32.464131 2756 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.464522 kubelet[2756]: I0715 05:14:32.464324 2756 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.464522 kubelet[2756]: I0715 05:14:32.464456 2756 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.470275 kubelet[2756]: E0715 05:14:32.470222 2756 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4396-0-0-n-85c8113064\" already exists" pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.470275 kubelet[2756]: E0715 05:14:32.470280 2756 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4396-0-0-n-85c8113064\" already exists" pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.470537 kubelet[2756]: E0715 05:14:32.470317 2756 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" already exists" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.524850 kubelet[2756]: I0715 05:14:32.524810 2756 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.532725 kubelet[2756]: I0715 05:14:32.532691 2756 kubelet_node_status.go:124] "Node was previously registered" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.533194 kubelet[2756]: I0715 05:14:32.533173 2756 kubelet_node_status.go:78] "Successfully registered node" node="ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.560947 kubelet[2756]: I0715 05:14:32.560816 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4efbf38801bb6c8a1de499bd6b9057eb-ca-certs\") pod \"kube-apiserver-ci-4396-0-0-n-85c8113064\" (UID: \"4efbf38801bb6c8a1de499bd6b9057eb\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.560947 kubelet[2756]: I0715 05:14:32.560858 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4efbf38801bb6c8a1de499bd6b9057eb-k8s-certs\") pod \"kube-apiserver-ci-4396-0-0-n-85c8113064\" (UID: \"4efbf38801bb6c8a1de499bd6b9057eb\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.560947 kubelet[2756]: I0715 05:14:32.560877 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4efbf38801bb6c8a1de499bd6b9057eb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396-0-0-n-85c8113064\" (UID: \"4efbf38801bb6c8a1de499bd6b9057eb\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.560947 kubelet[2756]: I0715 05:14:32.560898 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-flexvolume-dir\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.560947 kubelet[2756]: I0715 
05:14:32.560919 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-k8s-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.562946 kubelet[2756]: I0715 05:14:32.562887 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-ca-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.562946 kubelet[2756]: I0715 05:14:32.562912 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-kubeconfig\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.563045 kubelet[2756]: I0715 05:14:32.562925 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a8fef8d5637ab08c2a1ac228075d8235-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396-0-0-n-85c8113064\" (UID: \"a8fef8d5637ab08c2a1ac228075d8235\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" Jul 15 05:14:32.563045 kubelet[2756]: I0715 05:14:32.562970 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4ae0925fe7a00d504064ac2dbebf8857-kubeconfig\") pod \"kube-scheduler-ci-4396-0-0-n-85c8113064\" (UID: \"4ae0925fe7a00d504064ac2dbebf8857\") " pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" Jul 15 05:14:33.341009 kubelet[2756]: I0715 05:14:33.340770 2756 apiserver.go:52] "Watching apiserver" Jul 15 05:14:33.360004 kubelet[2756]: I0715 05:14:33.359930 2756 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 15 05:14:33.401376 kubelet[2756]: I0715 05:14:33.401346 2756 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:33.401652 kubelet[2756]: I0715 05:14:33.401567 2756 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" Jul 15 05:14:33.410245 kubelet[2756]: E0715 05:14:33.409799 2756 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4396-0-0-n-85c8113064\" already exists" pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" Jul 15 05:14:33.419600 kubelet[2756]: E0715 05:14:33.418908 2756 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4396-0-0-n-85c8113064\" already exists" pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" Jul 15 05:14:33.449147 kubelet[2756]: I0715 05:14:33.449097 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-85c8113064" podStartSLOduration=2.449082094 podStartE2EDuration="2.449082094s" podCreationTimestamp="2025-07-15 05:14:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:14:33.447192443 +0000 UTC m=+1.184466654" watchObservedRunningTime="2025-07-15 05:14:33.449082094 +0000 UTC m=+1.186356305" Jul 15 05:14:33.449331 kubelet[2756]: I0715 05:14:33.449176 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4396-0-0-n-85c8113064" podStartSLOduration=2.449173654 podStartE2EDuration="2.449173654s" podCreationTimestamp="2025-07-15 05:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:14:33.435274988 +0000 UTC m=+1.172549189" watchObservedRunningTime="2025-07-15 05:14:33.449173654 +0000 UTC m=+1.186447855" Jul 15 05:14:33.467118 kubelet[2756]: I0715 05:14:33.467066 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4396-0-0-n-85c8113064" podStartSLOduration=2.467049381 podStartE2EDuration="2.467049381s" podCreationTimestamp="2025-07-15 05:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:14:33.466560241 +0000 UTC m=+1.203834452" watchObservedRunningTime="2025-07-15 05:14:33.467049381 +0000 UTC m=+1.204323582" Jul 15 05:14:38.150801 kubelet[2756]: I0715 05:14:38.150776 2756 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 15 05:14:38.151680 containerd[1607]: time="2025-07-15T05:14:38.151615583Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 15 05:14:38.151936 kubelet[2756]: I0715 05:14:38.151807 2756 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 15 05:14:38.965831 systemd[1]: Created slice kubepods-besteffort-pod2b5491ea_c652_4979_92f0_ad05f6964cfb.slice - libcontainer container kubepods-besteffort-pod2b5491ea_c652_4979_92f0_ad05f6964cfb.slice. 
Jul 15 05:14:39.002421 kubelet[2756]: I0715 05:14:39.002357 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/2b5491ea-c652-4979-92f0-ad05f6964cfb-kube-proxy\") pod \"kube-proxy-v8s7q\" (UID: \"2b5491ea-c652-4979-92f0-ad05f6964cfb\") " pod="kube-system/kube-proxy-v8s7q" Jul 15 05:14:39.002421 kubelet[2756]: I0715 05:14:39.002398 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2b5491ea-c652-4979-92f0-ad05f6964cfb-xtables-lock\") pod \"kube-proxy-v8s7q\" (UID: \"2b5491ea-c652-4979-92f0-ad05f6964cfb\") " pod="kube-system/kube-proxy-v8s7q" Jul 15 05:14:39.002421 kubelet[2756]: I0715 05:14:39.002414 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2b5491ea-c652-4979-92f0-ad05f6964cfb-lib-modules\") pod \"kube-proxy-v8s7q\" (UID: \"2b5491ea-c652-4979-92f0-ad05f6964cfb\") " pod="kube-system/kube-proxy-v8s7q" Jul 15 05:14:39.002611 kubelet[2756]: I0715 05:14:39.002425 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j77c4\" (UniqueName: \"kubernetes.io/projected/2b5491ea-c652-4979-92f0-ad05f6964cfb-kube-api-access-j77c4\") pod \"kube-proxy-v8s7q\" (UID: \"2b5491ea-c652-4979-92f0-ad05f6964cfb\") " pod="kube-system/kube-proxy-v8s7q" Jul 15 05:14:39.276592 containerd[1607]: time="2025-07-15T05:14:39.276040201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-v8s7q,Uid:2b5491ea-c652-4979-92f0-ad05f6964cfb,Namespace:kube-system,Attempt:0,}" Jul 15 05:14:39.302153 containerd[1607]: time="2025-07-15T05:14:39.301401941Z" level=info msg="connecting to shim 7dc88403bb3f006908ef45e48e083c913040698a9b5ebeea1251e83add4942e3" address="unix:///run/containerd/s/c5b3d2db4b40f5a4494d1d2ba5665472a8cd95f83798d8c0196a115c36bf5fbc" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:39.338053 systemd[1]: Started cri-containerd-7dc88403bb3f006908ef45e48e083c913040698a9b5ebeea1251e83add4942e3.scope - libcontainer container 7dc88403bb3f006908ef45e48e083c913040698a9b5ebeea1251e83add4942e3. Jul 15 05:14:39.404114 systemd[1]: Created slice kubepods-besteffort-poddea90dc7_b32f_44d8_accb_18ea5c8d06d7.slice - libcontainer container kubepods-besteffort-poddea90dc7_b32f_44d8_accb_18ea5c8d06d7.slice. 
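
Aside: earlier in this log the crio container factory failed to register because /var/run/crio/crio.sock does not exist, while the containerd shim sockets under /run/containerd/s/ connect fine. A rough sketch of probing such unix sockets; the crio path is copied from the log, the containerd.sock path is the usual default and is assumed here, and this is not how the kubelet or cadvisor actually registers factories:

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        paths := []string{
            "/var/run/crio/crio.sock",            // absent on this node per the log
            "/run/containerd/containerd.sock",    // assumed default containerd socket
        }
        for _, p := range paths {
            conn, err := net.DialTimeout("unix", p, time.Second)
            if err != nil {
                // e.g. "connect: no such file or directory", as in the crio entry above
                fmt.Printf("%s: %v\n", p, err)
                continue
            }
            conn.Close()
            fmt.Printf("%s: reachable\n", p)
        }
    }
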
Jul 15 05:14:39.406145 kubelet[2756]: I0715 05:14:39.406115 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dea90dc7-b32f-44d8-accb-18ea5c8d06d7-var-lib-calico\") pod \"tigera-operator-747864d56d-znzr6\" (UID: \"dea90dc7-b32f-44d8-accb-18ea5c8d06d7\") " pod="tigera-operator/tigera-operator-747864d56d-znzr6" Jul 15 05:14:39.406468 kubelet[2756]: I0715 05:14:39.406152 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h88bs\" (UniqueName: \"kubernetes.io/projected/dea90dc7-b32f-44d8-accb-18ea5c8d06d7-kube-api-access-h88bs\") pod \"tigera-operator-747864d56d-znzr6\" (UID: \"dea90dc7-b32f-44d8-accb-18ea5c8d06d7\") " pod="tigera-operator/tigera-operator-747864d56d-znzr6" Jul 15 05:14:39.441781 containerd[1607]: time="2025-07-15T05:14:39.441587040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-v8s7q,Uid:2b5491ea-c652-4979-92f0-ad05f6964cfb,Namespace:kube-system,Attempt:0,} returns sandbox id \"7dc88403bb3f006908ef45e48e083c913040698a9b5ebeea1251e83add4942e3\"" Jul 15 05:14:39.449302 containerd[1607]: time="2025-07-15T05:14:39.449246553Z" level=info msg="CreateContainer within sandbox \"7dc88403bb3f006908ef45e48e083c913040698a9b5ebeea1251e83add4942e3\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 15 05:14:39.466601 containerd[1607]: time="2025-07-15T05:14:39.464739479Z" level=info msg="Container df3ca646a2f6bb5db004a16a757405fe86492c6481e62d6753c2e190c7080999: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:39.467330 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3370495015.mount: Deactivated successfully. Jul 15 05:14:39.473108 containerd[1607]: time="2025-07-15T05:14:39.473057633Z" level=info msg="CreateContainer within sandbox \"7dc88403bb3f006908ef45e48e083c913040698a9b5ebeea1251e83add4942e3\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"df3ca646a2f6bb5db004a16a757405fe86492c6481e62d6753c2e190c7080999\"" Jul 15 05:14:39.474698 containerd[1607]: time="2025-07-15T05:14:39.474058153Z" level=info msg="StartContainer for \"df3ca646a2f6bb5db004a16a757405fe86492c6481e62d6753c2e190c7080999\"" Jul 15 05:14:39.475894 containerd[1607]: time="2025-07-15T05:14:39.475859314Z" level=info msg="connecting to shim df3ca646a2f6bb5db004a16a757405fe86492c6481e62d6753c2e190c7080999" address="unix:///run/containerd/s/c5b3d2db4b40f5a4494d1d2ba5665472a8cd95f83798d8c0196a115c36bf5fbc" protocol=ttrpc version=3 Jul 15 05:14:39.495884 systemd[1]: Started cri-containerd-df3ca646a2f6bb5db004a16a757405fe86492c6481e62d6753c2e190c7080999.scope - libcontainer container df3ca646a2f6bb5db004a16a757405fe86492c6481e62d6753c2e190c7080999. 
Jul 15 05:14:39.545843 containerd[1607]: time="2025-07-15T05:14:39.545705133Z" level=info msg="StartContainer for \"df3ca646a2f6bb5db004a16a757405fe86492c6481e62d6753c2e190c7080999\" returns successfully" Jul 15 05:14:39.713214 containerd[1607]: time="2025-07-15T05:14:39.713158513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-znzr6,Uid:dea90dc7-b32f-44d8-accb-18ea5c8d06d7,Namespace:tigera-operator,Attempt:0,}" Jul 15 05:14:39.740918 containerd[1607]: time="2025-07-15T05:14:39.740704134Z" level=info msg="connecting to shim 0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53" address="unix:///run/containerd/s/e4ccbfe6bc1d773351544be81badaf096360b6de9570aff28c73328799f7ce9c" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:39.771848 systemd[1]: Started cri-containerd-0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53.scope - libcontainer container 0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53. Jul 15 05:14:39.821176 containerd[1607]: time="2025-07-15T05:14:39.821073358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-znzr6,Uid:dea90dc7-b32f-44d8-accb-18ea5c8d06d7,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53\"" Jul 15 05:14:39.824652 containerd[1607]: time="2025-07-15T05:14:39.824613489Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 15 05:14:40.438426 kubelet[2756]: I0715 05:14:40.438335 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-v8s7q" podStartSLOduration=2.438313185 podStartE2EDuration="2.438313185s" podCreationTimestamp="2025-07-15 05:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:14:40.438123225 +0000 UTC m=+8.175397466" watchObservedRunningTime="2025-07-15 05:14:40.438313185 +0000 UTC m=+8.175587426" Jul 15 05:14:41.710042 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2927264546.mount: Deactivated successfully. Jul 15 05:14:41.801387 update_engine[1580]: I20250715 05:14:41.801259 1580 update_attempter.cc:509] Updating boot flags... 
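
Aside: the pod_startup_latency_tracker entry above for kube-proxy-v8s7q reports podStartE2EDuration=2.438313185s, which is exactly the gap between the logged podCreationTimestamp and watchObservedRunningTime. A sketch reproducing that arithmetic from the timestamps as they appear in the log (monotonic "m=+…" suffixes dropped); illustrative only, not the tracker's own code:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the kube-proxy startup-latency entry above.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2025-07-15 05:14:38 +0000 UTC")
        running, _ := time.Parse(layout, "2025-07-15 05:14:40.438313185 +0000 UTC")

        // Prints 2.438313185s, matching the logged podStartE2EDuration.
        fmt.Println("podStartE2EDuration:", running.Sub(created))
    }
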
Jul 15 05:14:42.307015 containerd[1607]: time="2025-07-15T05:14:42.306950913Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:42.308124 containerd[1607]: time="2025-07-15T05:14:42.307948314Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Jul 15 05:14:42.308958 containerd[1607]: time="2025-07-15T05:14:42.308914024Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:42.310563 containerd[1607]: time="2025-07-15T05:14:42.310537415Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:42.311026 containerd[1607]: time="2025-07-15T05:14:42.311007195Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.485652885s" Jul 15 05:14:42.311108 containerd[1607]: time="2025-07-15T05:14:42.311096165Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Jul 15 05:14:42.315248 containerd[1607]: time="2025-07-15T05:14:42.315212107Z" level=info msg="CreateContainer within sandbox \"0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 15 05:14:42.324638 containerd[1607]: time="2025-07-15T05:14:42.324593530Z" level=info msg="Container a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:42.333585 containerd[1607]: time="2025-07-15T05:14:42.333528884Z" level=info msg="CreateContainer within sandbox \"0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50\"" Jul 15 05:14:42.334720 containerd[1607]: time="2025-07-15T05:14:42.334689555Z" level=info msg="StartContainer for \"a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50\"" Jul 15 05:14:42.337094 containerd[1607]: time="2025-07-15T05:14:42.337027996Z" level=info msg="connecting to shim a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50" address="unix:///run/containerd/s/e4ccbfe6bc1d773351544be81badaf096360b6de9570aff28c73328799f7ce9c" protocol=ttrpc version=3 Jul 15 05:14:42.359792 systemd[1]: Started cri-containerd-a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50.scope - libcontainer container a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50. 
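
Aside: the PullImage result above names the same image three ways: a repo tag (quay.io/tigera/operator:v1.38.3), a repo digest (…@sha256:dbf1bad0…), and a content-addressed image id (sha256:8bde1647…). A rough sketch of splitting such references apart — plain string handling for illustration, not containerd's actual reference parser:

    package main

    import (
        "fmt"
        "strings"
    )

    // splitRef breaks an image reference into repository, tag and digest.
    // Edge cases (registries with ports, etc.) are ignored here.
    func splitRef(ref string) (repo, tag, digest string) {
        if i := strings.Index(ref, "@"); i >= 0 {
            ref, digest = ref[:i], ref[i+1:]
        }
        if i := strings.LastIndex(ref, ":"); i >= 0 && !strings.Contains(ref[i+1:], "/") {
            ref, tag = ref[:i], ref[i+1:]
        }
        return ref, tag, digest
    }

    func main() {
        // References copied from the PullImage entries above.
        for _, r := range []string{
            "quay.io/tigera/operator:v1.38.3",
            "quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121",
        } {
            repo, tag, digest := splitRef(r)
            fmt.Printf("repo=%s tag=%q digest=%q\n", repo, tag, digest)
        }
    }
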
Jul 15 05:14:42.397919 containerd[1607]: time="2025-07-15T05:14:42.397867961Z" level=info msg="StartContainer for \"a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50\" returns successfully" Jul 15 05:14:47.106791 kubelet[2756]: I0715 05:14:47.106588 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-znzr6" podStartSLOduration=5.618375726 podStartE2EDuration="8.106573942s" podCreationTimestamp="2025-07-15 05:14:39 +0000 UTC" firstStartedPulling="2025-07-15 05:14:39.823492289 +0000 UTC m=+7.560766490" lastFinishedPulling="2025-07-15 05:14:42.311690505 +0000 UTC m=+10.048964706" observedRunningTime="2025-07-15 05:14:42.447797422 +0000 UTC m=+10.185071633" watchObservedRunningTime="2025-07-15 05:14:47.106573942 +0000 UTC m=+14.843848153" Jul 15 05:14:48.082541 sudo[1833]: pam_unix(sudo:session): session closed for user root Jul 15 05:14:48.242781 sshd[1832]: Connection closed by 139.178.89.65 port 41896 Jul 15 05:14:48.241681 sshd-session[1829]: pam_unix(sshd:session): session closed for user core Jul 15 05:14:48.250418 systemd[1]: sshd@6-95.217.135.169:22-139.178.89.65:41896.service: Deactivated successfully. Jul 15 05:14:48.255839 systemd[1]: session-7.scope: Deactivated successfully. Jul 15 05:14:48.256171 systemd[1]: session-7.scope: Consumed 4.318s CPU time, 162M memory peak. Jul 15 05:14:48.257979 systemd-logind[1575]: Session 7 logged out. Waiting for processes to exit. Jul 15 05:14:48.261089 systemd-logind[1575]: Removed session 7. Jul 15 05:14:50.782986 systemd[1]: Created slice kubepods-besteffort-podbc5ac86c_c936_4be2_844a_3d41cb5e7e92.slice - libcontainer container kubepods-besteffort-podbc5ac86c_c936_4be2_844a_3d41cb5e7e92.slice. Jul 15 05:14:50.887769 kubelet[2756]: I0715 05:14:50.887724 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bc5ac86c-c936-4be2-844a-3d41cb5e7e92-tigera-ca-bundle\") pod \"calico-typha-5b9b8c5c6-9l8jm\" (UID: \"bc5ac86c-c936-4be2-844a-3d41cb5e7e92\") " pod="calico-system/calico-typha-5b9b8c5c6-9l8jm" Jul 15 05:14:50.887769 kubelet[2756]: I0715 05:14:50.887759 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx4pg\" (UniqueName: \"kubernetes.io/projected/bc5ac86c-c936-4be2-844a-3d41cb5e7e92-kube-api-access-hx4pg\") pod \"calico-typha-5b9b8c5c6-9l8jm\" (UID: \"bc5ac86c-c936-4be2-844a-3d41cb5e7e92\") " pod="calico-system/calico-typha-5b9b8c5c6-9l8jm" Jul 15 05:14:50.887769 kubelet[2756]: I0715 05:14:50.887773 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/bc5ac86c-c936-4be2-844a-3d41cb5e7e92-typha-certs\") pod \"calico-typha-5b9b8c5c6-9l8jm\" (UID: \"bc5ac86c-c936-4be2-844a-3d41cb5e7e92\") " pod="calico-system/calico-typha-5b9b8c5c6-9l8jm" Jul 15 05:14:51.092661 containerd[1607]: time="2025-07-15T05:14:51.092247132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b9b8c5c6-9l8jm,Uid:bc5ac86c-c936-4be2-844a-3d41cb5e7e92,Namespace:calico-system,Attempt:0,}" Jul 15 05:14:51.132700 systemd[1]: Created slice kubepods-besteffort-pod83694405_e571_497e_9551_df22cb488ca2.slice - libcontainer container kubepods-besteffort-pod83694405_e571_497e_9551_df22cb488ca2.slice. 
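
Aside: the systemd "Created slice" entries above follow a visible pattern: a besteffort pod's UID, with dashes swapped for underscores, becomes kubepods-besteffort-pod<uid>.slice. A tiny sketch reproducing that name for the calico-typha pod UID seen in the volume entries (illustrative; the real mapping lives in the kubelet's systemd cgroup driver):

    package main

    import (
        "fmt"
        "strings"
    )

    // besteffortSlice reconstructs the slice name seen in the "Created slice"
    // log entries: prefix + pod UID with dashes replaced by underscores.
    func besteffortSlice(podUID string) string {
        return "kubepods-besteffort-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
    }

    func main() {
        // UID taken from the calico-typha volume entries logged above.
        fmt.Println(besteffortSlice("bc5ac86c-c936-4be2-844a-3d41cb5e7e92"))
        // -> kubepods-besteffort-podbc5ac86c_c936_4be2_844a_3d41cb5e7e92.slice
    }
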
Jul 15 05:14:51.137612 containerd[1607]: time="2025-07-15T05:14:51.136852575Z" level=info msg="connecting to shim 28b4a593a1e6262336431805701735294ba9973392062cbaf1a64a3f06a8dca3" address="unix:///run/containerd/s/46ae37bfbf741a27e9f9c62f3a9357eb79d3bbfae6d829ea34c494f2afba7b0f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:51.175795 systemd[1]: Started cri-containerd-28b4a593a1e6262336431805701735294ba9973392062cbaf1a64a3f06a8dca3.scope - libcontainer container 28b4a593a1e6262336431805701735294ba9973392062cbaf1a64a3f06a8dca3. Jul 15 05:14:51.189400 kubelet[2756]: I0715 05:14:51.189353 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-cni-net-dir\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189400 kubelet[2756]: I0715 05:14:51.189385 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/83694405-e571-497e-9551-df22cb488ca2-node-certs\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189400 kubelet[2756]: I0715 05:14:51.189397 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4t9z\" (UniqueName: \"kubernetes.io/projected/83694405-e571-497e-9551-df22cb488ca2-kube-api-access-d4t9z\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189577 kubelet[2756]: I0715 05:14:51.189411 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-flexvol-driver-host\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189577 kubelet[2756]: I0715 05:14:51.189421 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-policysync\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189577 kubelet[2756]: I0715 05:14:51.189431 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-var-run-calico\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189577 kubelet[2756]: I0715 05:14:51.189441 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-xtables-lock\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189577 kubelet[2756]: I0715 05:14:51.189454 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-lib-modules\") pod \"calico-node-9zns4\" (UID: 
\"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189658 kubelet[2756]: I0715 05:14:51.189466 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-cni-log-dir\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189658 kubelet[2756]: I0715 05:14:51.189481 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-var-lib-calico\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189658 kubelet[2756]: I0715 05:14:51.189491 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/83694405-e571-497e-9551-df22cb488ca2-cni-bin-dir\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.189658 kubelet[2756]: I0715 05:14:51.189502 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83694405-e571-497e-9551-df22cb488ca2-tigera-ca-bundle\") pod \"calico-node-9zns4\" (UID: \"83694405-e571-497e-9551-df22cb488ca2\") " pod="calico-system/calico-node-9zns4" Jul 15 05:14:51.220142 containerd[1607]: time="2025-07-15T05:14:51.220104747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5b9b8c5c6-9l8jm,Uid:bc5ac86c-c936-4be2-844a-3d41cb5e7e92,Namespace:calico-system,Attempt:0,} returns sandbox id \"28b4a593a1e6262336431805701735294ba9973392062cbaf1a64a3f06a8dca3\"" Jul 15 05:14:51.222801 containerd[1607]: time="2025-07-15T05:14:51.222755143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 15 05:14:51.293045 kubelet[2756]: E0715 05:14:51.293011 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.293045 kubelet[2756]: W0715 05:14:51.293035 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.294366 kubelet[2756]: E0715 05:14:51.294341 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.294657 kubelet[2756]: E0715 05:14:51.294639 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.294754 kubelet[2756]: W0715 05:14:51.294664 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.294754 kubelet[2756]: E0715 05:14:51.294691 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.295756 kubelet[2756]: E0715 05:14:51.295735 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.295756 kubelet[2756]: W0715 05:14:51.295752 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.295823 kubelet[2756]: E0715 05:14:51.295778 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.300768 kubelet[2756]: E0715 05:14:51.299801 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.300768 kubelet[2756]: W0715 05:14:51.299813 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.300768 kubelet[2756]: E0715 05:14:51.299826 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.301390 kubelet[2756]: E0715 05:14:51.301264 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.301390 kubelet[2756]: W0715 05:14:51.301280 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.301390 kubelet[2756]: E0715 05:14:51.301293 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.305926 kubelet[2756]: E0715 05:14:51.305905 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.306052 kubelet[2756]: W0715 05:14:51.306025 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.306052 kubelet[2756]: E0715 05:14:51.306043 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.404648 kubelet[2756]: E0715 05:14:51.403943 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5fd7c" podUID="47d635b5-605e-4c77-ba36-e640d13113f8" Jul 15 05:14:51.438533 containerd[1607]: time="2025-07-15T05:14:51.438468509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9zns4,Uid:83694405-e571-497e-9551-df22cb488ca2,Namespace:calico-system,Attempt:0,}" Jul 15 05:14:51.472481 kubelet[2756]: E0715 05:14:51.472435 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.472481 kubelet[2756]: W0715 05:14:51.472471 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.472727 kubelet[2756]: E0715 05:14:51.472497 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.472920 kubelet[2756]: E0715 05:14:51.472891 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.472920 kubelet[2756]: W0715 05:14:51.472907 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.472920 kubelet[2756]: E0715 05:14:51.472919 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.473353 kubelet[2756]: E0715 05:14:51.473328 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.473353 kubelet[2756]: W0715 05:14:51.473343 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.473439 kubelet[2756]: E0715 05:14:51.473360 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.473918 kubelet[2756]: E0715 05:14:51.473865 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.473918 kubelet[2756]: W0715 05:14:51.473889 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.474247 kubelet[2756]: E0715 05:14:51.474073 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.474434 kubelet[2756]: E0715 05:14:51.474423 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.474571 kubelet[2756]: W0715 05:14:51.474485 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.474571 kubelet[2756]: E0715 05:14:51.474499 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.474855 kubelet[2756]: E0715 05:14:51.474797 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.474855 kubelet[2756]: W0715 05:14:51.474808 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.474855 kubelet[2756]: E0715 05:14:51.474817 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.475203 kubelet[2756]: E0715 05:14:51.475172 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.475282 kubelet[2756]: W0715 05:14:51.475244 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.475282 kubelet[2756]: E0715 05:14:51.475256 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.477126 kubelet[2756]: E0715 05:14:51.475651 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.477774 kubelet[2756]: W0715 05:14:51.477225 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.477774 kubelet[2756]: E0715 05:14:51.477242 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.477774 kubelet[2756]: E0715 05:14:51.477427 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.477774 kubelet[2756]: W0715 05:14:51.477435 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.477774 kubelet[2756]: E0715 05:14:51.477443 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.477774 kubelet[2756]: E0715 05:14:51.477623 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.477774 kubelet[2756]: W0715 05:14:51.477633 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.477774 kubelet[2756]: E0715 05:14:51.477645 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.478482 kubelet[2756]: E0715 05:14:51.478059 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.478482 kubelet[2756]: W0715 05:14:51.478070 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.478482 kubelet[2756]: E0715 05:14:51.478079 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.478482 kubelet[2756]: E0715 05:14:51.478235 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.478482 kubelet[2756]: W0715 05:14:51.478243 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.478482 kubelet[2756]: E0715 05:14:51.478250 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.478482 kubelet[2756]: E0715 05:14:51.478406 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.478482 kubelet[2756]: W0715 05:14:51.478413 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.478482 kubelet[2756]: E0715 05:14:51.478420 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.478949 kubelet[2756]: E0715 05:14:51.478844 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.478949 kubelet[2756]: W0715 05:14:51.478861 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.478949 kubelet[2756]: E0715 05:14:51.478871 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.479173 kubelet[2756]: E0715 05:14:51.479149 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.479327 kubelet[2756]: W0715 05:14:51.479227 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.479327 kubelet[2756]: E0715 05:14:51.479241 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.479508 kubelet[2756]: E0715 05:14:51.479496 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.479568 kubelet[2756]: W0715 05:14:51.479557 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.479614 kubelet[2756]: E0715 05:14:51.479605 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.479844 kubelet[2756]: E0715 05:14:51.479833 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.479899 kubelet[2756]: W0715 05:14:51.479890 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.479939 kubelet[2756]: E0715 05:14:51.479931 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.480118 kubelet[2756]: E0715 05:14:51.480108 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.480176 kubelet[2756]: W0715 05:14:51.480167 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.480220 kubelet[2756]: E0715 05:14:51.480212 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.480461 kubelet[2756]: E0715 05:14:51.480450 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.480507 kubelet[2756]: W0715 05:14:51.480499 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.480550 kubelet[2756]: E0715 05:14:51.480541 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.480757 kubelet[2756]: E0715 05:14:51.480747 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.480811 kubelet[2756]: W0715 05:14:51.480803 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.480852 kubelet[2756]: E0715 05:14:51.480844 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.491444 kubelet[2756]: E0715 05:14:51.491118 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.491444 kubelet[2756]: W0715 05:14:51.491136 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.491444 kubelet[2756]: E0715 05:14:51.491172 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.491444 kubelet[2756]: I0715 05:14:51.491193 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/47d635b5-605e-4c77-ba36-e640d13113f8-registration-dir\") pod \"csi-node-driver-5fd7c\" (UID: \"47d635b5-605e-4c77-ba36-e640d13113f8\") " pod="calico-system/csi-node-driver-5fd7c" Jul 15 05:14:51.491444 kubelet[2756]: E0715 05:14:51.491330 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.491444 kubelet[2756]: W0715 05:14:51.491336 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.491444 kubelet[2756]: E0715 05:14:51.491343 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.491444 kubelet[2756]: I0715 05:14:51.491352 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/47d635b5-605e-4c77-ba36-e640d13113f8-socket-dir\") pod \"csi-node-driver-5fd7c\" (UID: \"47d635b5-605e-4c77-ba36-e640d13113f8\") " pod="calico-system/csi-node-driver-5fd7c" Jul 15 05:14:51.493002 kubelet[2756]: E0715 05:14:51.492326 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.493002 kubelet[2756]: W0715 05:14:51.492340 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.493002 kubelet[2756]: E0715 05:14:51.492348 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.493002 kubelet[2756]: E0715 05:14:51.492538 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.493002 kubelet[2756]: W0715 05:14:51.492544 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.493002 kubelet[2756]: E0715 05:14:51.492551 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.493002 kubelet[2756]: E0715 05:14:51.492722 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.493002 kubelet[2756]: W0715 05:14:51.492727 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.493002 kubelet[2756]: E0715 05:14:51.492735 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.493242 kubelet[2756]: I0715 05:14:51.492766 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/47d635b5-605e-4c77-ba36-e640d13113f8-varrun\") pod \"csi-node-driver-5fd7c\" (UID: \"47d635b5-605e-4c77-ba36-e640d13113f8\") " pod="calico-system/csi-node-driver-5fd7c" Jul 15 05:14:51.493242 kubelet[2756]: E0715 05:14:51.492933 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.493242 kubelet[2756]: W0715 05:14:51.492939 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.493242 kubelet[2756]: E0715 05:14:51.492945 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.493242 kubelet[2756]: I0715 05:14:51.492964 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bwg\" (UniqueName: \"kubernetes.io/projected/47d635b5-605e-4c77-ba36-e640d13113f8-kube-api-access-j2bwg\") pod \"csi-node-driver-5fd7c\" (UID: \"47d635b5-605e-4c77-ba36-e640d13113f8\") " pod="calico-system/csi-node-driver-5fd7c" Jul 15 05:14:51.494596 kubelet[2756]: E0715 05:14:51.494100 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.494596 kubelet[2756]: W0715 05:14:51.494111 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.494596 kubelet[2756]: E0715 05:14:51.494119 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.494596 kubelet[2756]: I0715 05:14:51.494172 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d635b5-605e-4c77-ba36-e640d13113f8-kubelet-dir\") pod \"csi-node-driver-5fd7c\" (UID: \"47d635b5-605e-4c77-ba36-e640d13113f8\") " pod="calico-system/csi-node-driver-5fd7c" Jul 15 05:14:51.494596 kubelet[2756]: E0715 05:14:51.494302 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.494596 kubelet[2756]: W0715 05:14:51.494307 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.494596 kubelet[2756]: E0715 05:14:51.494313 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.494596 kubelet[2756]: E0715 05:14:51.494462 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.494596 kubelet[2756]: W0715 05:14:51.494467 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.494777 kubelet[2756]: E0715 05:14:51.494473 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.496217 kubelet[2756]: E0715 05:14:51.495589 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.496217 kubelet[2756]: W0715 05:14:51.495601 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.496217 kubelet[2756]: E0715 05:14:51.495609 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.496217 kubelet[2756]: E0715 05:14:51.495824 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.496217 kubelet[2756]: W0715 05:14:51.495832 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.496217 kubelet[2756]: E0715 05:14:51.495841 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.496217 kubelet[2756]: E0715 05:14:51.496028 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.496217 kubelet[2756]: W0715 05:14:51.496036 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.496217 kubelet[2756]: E0715 05:14:51.496044 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.496386 kubelet[2756]: E0715 05:14:51.496248 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.496386 kubelet[2756]: W0715 05:14:51.496255 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.496386 kubelet[2756]: E0715 05:14:51.496262 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.496433 kubelet[2756]: E0715 05:14:51.496419 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.496433 kubelet[2756]: W0715 05:14:51.496425 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.496433 kubelet[2756]: E0715 05:14:51.496431 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.496620 kubelet[2756]: E0715 05:14:51.496594 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.496620 kubelet[2756]: W0715 05:14:51.496607 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.496620 kubelet[2756]: E0715 05:14:51.496613 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.512622 containerd[1607]: time="2025-07-15T05:14:51.512533482Z" level=info msg="connecting to shim 976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1" address="unix:///run/containerd/s/ac1c60ff0341718428040350bcc2a4c870c7a6e419da37b6a00ec23bfd90629c" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:14:51.543056 systemd[1]: Started cri-containerd-976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1.scope - libcontainer container 976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1. 
Jul 15 05:14:51.595369 kubelet[2756]: E0715 05:14:51.595247 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.595369 kubelet[2756]: W0715 05:14:51.595265 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.595369 kubelet[2756]: E0715 05:14:51.595292 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.595794 kubelet[2756]: E0715 05:14:51.595693 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.595794 kubelet[2756]: W0715 05:14:51.595703 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.595794 kubelet[2756]: E0715 05:14:51.595712 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.596095 kubelet[2756]: E0715 05:14:51.596040 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.596095 kubelet[2756]: W0715 05:14:51.596049 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.596095 kubelet[2756]: E0715 05:14:51.596057 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.597508 kubelet[2756]: E0715 05:14:51.597462 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.597796 kubelet[2756]: W0715 05:14:51.597491 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.597796 kubelet[2756]: E0715 05:14:51.597569 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.597949 kubelet[2756]: E0715 05:14:51.597881 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.597949 kubelet[2756]: W0715 05:14:51.597895 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.597949 kubelet[2756]: E0715 05:14:51.597906 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.598193 kubelet[2756]: E0715 05:14:51.598170 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.598193 kubelet[2756]: W0715 05:14:51.598181 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.598193 kubelet[2756]: E0715 05:14:51.598189 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.598750 kubelet[2756]: E0715 05:14:51.598725 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.598936 kubelet[2756]: W0715 05:14:51.598758 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.598936 kubelet[2756]: E0715 05:14:51.598768 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.598936 kubelet[2756]: E0715 05:14:51.598912 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.598936 kubelet[2756]: W0715 05:14:51.598918 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.598936 kubelet[2756]: E0715 05:14:51.598924 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.599584 kubelet[2756]: E0715 05:14:51.599569 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.599584 kubelet[2756]: W0715 05:14:51.599580 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.599827 kubelet[2756]: E0715 05:14:51.599588 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.600057 kubelet[2756]: E0715 05:14:51.599996 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.600057 kubelet[2756]: W0715 05:14:51.600007 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.600314 kubelet[2756]: E0715 05:14:51.600016 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.600400 kubelet[2756]: E0715 05:14:51.600363 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.600400 kubelet[2756]: W0715 05:14:51.600380 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.600400 kubelet[2756]: E0715 05:14:51.600390 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.600599 kubelet[2756]: E0715 05:14:51.600581 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.600599 kubelet[2756]: W0715 05:14:51.600593 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.600599 kubelet[2756]: E0715 05:14:51.600599 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.600857 kubelet[2756]: E0715 05:14:51.600770 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.600857 kubelet[2756]: W0715 05:14:51.600783 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.601122 kubelet[2756]: E0715 05:14:51.600791 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.601788 kubelet[2756]: E0715 05:14:51.601728 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.601788 kubelet[2756]: W0715 05:14:51.601746 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.601788 kubelet[2756]: E0715 05:14:51.601756 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.603900 kubelet[2756]: E0715 05:14:51.603867 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.603900 kubelet[2756]: W0715 05:14:51.603881 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.603900 kubelet[2756]: E0715 05:14:51.603891 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.604206 kubelet[2756]: E0715 05:14:51.604185 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.604206 kubelet[2756]: W0715 05:14:51.604197 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.604206 kubelet[2756]: E0715 05:14:51.604205 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.604475 kubelet[2756]: E0715 05:14:51.604455 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.604475 kubelet[2756]: W0715 05:14:51.604467 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.604475 kubelet[2756]: E0715 05:14:51.604474 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.605098 kubelet[2756]: E0715 05:14:51.605077 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.605098 kubelet[2756]: W0715 05:14:51.605089 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.605098 kubelet[2756]: E0715 05:14:51.605099 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.605447 kubelet[2756]: E0715 05:14:51.605336 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.605447 kubelet[2756]: W0715 05:14:51.605347 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.605447 kubelet[2756]: E0715 05:14:51.605354 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.605793 kubelet[2756]: E0715 05:14:51.605721 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.605793 kubelet[2756]: W0715 05:14:51.605734 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.605793 kubelet[2756]: E0715 05:14:51.605741 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.607786 kubelet[2756]: E0715 05:14:51.607764 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.607786 kubelet[2756]: W0715 05:14:51.607780 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.607786 kubelet[2756]: E0715 05:14:51.607789 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.608132 kubelet[2756]: E0715 05:14:51.608093 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.608132 kubelet[2756]: W0715 05:14:51.608108 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.608132 kubelet[2756]: E0715 05:14:51.608118 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.608435 kubelet[2756]: E0715 05:14:51.608412 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.608435 kubelet[2756]: W0715 05:14:51.608427 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.608495 kubelet[2756]: E0715 05:14:51.608437 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.609096 kubelet[2756]: E0715 05:14:51.609074 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.609096 kubelet[2756]: W0715 05:14:51.609087 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.609096 kubelet[2756]: E0715 05:14:51.609094 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:51.609502 kubelet[2756]: E0715 05:14:51.609478 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.609502 kubelet[2756]: W0715 05:14:51.609489 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.609502 kubelet[2756]: E0715 05:14:51.609497 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:51.615997 containerd[1607]: time="2025-07-15T05:14:51.615951529Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9zns4,Uid:83694405-e571-497e-9551-df22cb488ca2,Namespace:calico-system,Attempt:0,} returns sandbox id \"976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1\"" Jul 15 05:14:51.621013 kubelet[2756]: E0715 05:14:51.620997 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:51.621092 kubelet[2756]: W0715 05:14:51.621081 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:51.621241 kubelet[2756]: E0715 05:14:51.621131 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:53.044593 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3312120383.mount: Deactivated successfully. Jul 15 05:14:53.363197 kubelet[2756]: E0715 05:14:53.363009 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5fd7c" podUID="47d635b5-605e-4c77-ba36-e640d13113f8" Jul 15 05:14:54.048386 containerd[1607]: time="2025-07-15T05:14:54.048210301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:54.049085 containerd[1607]: time="2025-07-15T05:14:54.049007443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 15 05:14:54.049905 containerd[1607]: time="2025-07-15T05:14:54.049886983Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:54.051750 containerd[1607]: time="2025-07-15T05:14:54.051721853Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:54.052090 containerd[1607]: time="2025-07-15T05:14:54.052072799Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.829296657s" Jul 15 05:14:54.052148 containerd[1607]: time="2025-07-15T05:14:54.052137879Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 15 05:14:54.053364 containerd[1607]: time="2025-07-15T05:14:54.053339515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 15 05:14:54.068974 containerd[1607]: time="2025-07-15T05:14:54.068916528Z" level=info msg="CreateContainer within sandbox \"28b4a593a1e6262336431805701735294ba9973392062cbaf1a64a3f06a8dca3\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 15 05:14:54.077516 containerd[1607]: time="2025-07-15T05:14:54.077479825Z" level=info msg="Container 935e6c6eb2605224238945cb52c88709d09a7b7157701a4a1d13574affbb7f97: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:54.082470 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1794254811.mount: Deactivated successfully. Jul 15 05:14:54.087885 containerd[1607]: time="2025-07-15T05:14:54.087825784Z" level=info msg="CreateContainer within sandbox \"28b4a593a1e6262336431805701735294ba9973392062cbaf1a64a3f06a8dca3\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"935e6c6eb2605224238945cb52c88709d09a7b7157701a4a1d13574affbb7f97\"" Jul 15 05:14:54.089060 containerd[1607]: time="2025-07-15T05:14:54.089014990Z" level=info msg="StartContainer for \"935e6c6eb2605224238945cb52c88709d09a7b7157701a4a1d13574affbb7f97\"" Jul 15 05:14:54.089954 containerd[1607]: time="2025-07-15T05:14:54.089932301Z" level=info msg="connecting to shim 935e6c6eb2605224238945cb52c88709d09a7b7157701a4a1d13574affbb7f97" address="unix:///run/containerd/s/46ae37bfbf741a27e9f9c62f3a9357eb79d3bbfae6d829ea34c494f2afba7b0f" protocol=ttrpc version=3 Jul 15 05:14:54.113842 systemd[1]: Started cri-containerd-935e6c6eb2605224238945cb52c88709d09a7b7157701a4a1d13574affbb7f97.scope - libcontainer container 935e6c6eb2605224238945cb52c88709d09a7b7157701a4a1d13574affbb7f97. Jul 15 05:14:54.168187 containerd[1607]: time="2025-07-15T05:14:54.168068988Z" level=info msg="StartContainer for \"935e6c6eb2605224238945cb52c88709d09a7b7157701a4a1d13574affbb7f97\" returns successfully" Jul 15 05:14:54.553781 kubelet[2756]: I0715 05:14:54.553715 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5b9b8c5c6-9l8jm" podStartSLOduration=1.72278786 podStartE2EDuration="4.553699178s" podCreationTimestamp="2025-07-15 05:14:50 +0000 UTC" firstStartedPulling="2025-07-15 05:14:51.222100271 +0000 UTC m=+18.959374472" lastFinishedPulling="2025-07-15 05:14:54.053011589 +0000 UTC m=+21.790285790" observedRunningTime="2025-07-15 05:14:54.552912026 +0000 UTC m=+22.290186257" watchObservedRunningTime="2025-07-15 05:14:54.553699178 +0000 UTC m=+22.290973389" Jul 15 05:14:54.601457 kubelet[2756]: E0715 05:14:54.601408 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.601457 kubelet[2756]: W0715 05:14:54.601444 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.601457 kubelet[2756]: E0715 05:14:54.601464 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:54.601685 kubelet[2756]: E0715 05:14:54.601604 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.601685 kubelet[2756]: W0715 05:14:54.601609 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.601685 kubelet[2756]: E0715 05:14:54.601616 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.601788 kubelet[2756]: E0715 05:14:54.601775 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.601788 kubelet[2756]: W0715 05:14:54.601780 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.601788 kubelet[2756]: E0715 05:14:54.601786 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.602041 kubelet[2756]: E0715 05:14:54.602026 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.602041 kubelet[2756]: W0715 05:14:54.602035 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.602103 kubelet[2756]: E0715 05:14:54.602042 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.602208 kubelet[2756]: E0715 05:14:54.602192 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.602378 kubelet[2756]: W0715 05:14:54.602212 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.602378 kubelet[2756]: E0715 05:14:54.602222 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.602378 kubelet[2756]: E0715 05:14:54.602371 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.602378 kubelet[2756]: W0715 05:14:54.602377 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.602447 kubelet[2756]: E0715 05:14:54.602383 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:54.602565 kubelet[2756]: E0715 05:14:54.602534 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.602565 kubelet[2756]: W0715 05:14:54.602543 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.602565 kubelet[2756]: E0715 05:14:54.602552 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.602724 kubelet[2756]: E0715 05:14:54.602706 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.602724 kubelet[2756]: W0715 05:14:54.602715 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.602724 kubelet[2756]: E0715 05:14:54.602721 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.602910 kubelet[2756]: E0715 05:14:54.602876 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.602910 kubelet[2756]: W0715 05:14:54.602885 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.602910 kubelet[2756]: E0715 05:14:54.602891 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.603804 kubelet[2756]: E0715 05:14:54.603015 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.603804 kubelet[2756]: W0715 05:14:54.603020 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.603804 kubelet[2756]: E0715 05:14:54.603026 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.603804 kubelet[2756]: E0715 05:14:54.603194 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.603804 kubelet[2756]: W0715 05:14:54.603202 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.603804 kubelet[2756]: E0715 05:14:54.603210 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:54.603804 kubelet[2756]: E0715 05:14:54.603387 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.603804 kubelet[2756]: W0715 05:14:54.603393 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.603804 kubelet[2756]: E0715 05:14:54.603400 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.603804 kubelet[2756]: E0715 05:14:54.603654 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.604104 kubelet[2756]: W0715 05:14:54.603662 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.604104 kubelet[2756]: E0715 05:14:54.603691 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.604104 kubelet[2756]: E0715 05:14:54.603848 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.604104 kubelet[2756]: W0715 05:14:54.603854 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.604104 kubelet[2756]: E0715 05:14:54.603861 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.604302 kubelet[2756]: E0715 05:14:54.604286 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.604302 kubelet[2756]: W0715 05:14:54.604295 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.604302 kubelet[2756]: E0715 05:14:54.604302 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.620559 kubelet[2756]: E0715 05:14:54.620529 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.620559 kubelet[2756]: W0715 05:14:54.620548 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.620716 kubelet[2756]: E0715 05:14:54.620564 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:54.621646 kubelet[2756]: E0715 05:14:54.621625 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.621646 kubelet[2756]: W0715 05:14:54.621637 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.621646 kubelet[2756]: E0715 05:14:54.621645 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.621828 kubelet[2756]: E0715 05:14:54.621812 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.621828 kubelet[2756]: W0715 05:14:54.621821 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.621828 kubelet[2756]: E0715 05:14:54.621827 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.622073 kubelet[2756]: E0715 05:14:54.622056 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.622073 kubelet[2756]: W0715 05:14:54.622068 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.622115 kubelet[2756]: E0715 05:14:54.622076 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.622250 kubelet[2756]: E0715 05:14:54.622237 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.622250 kubelet[2756]: W0715 05:14:54.622245 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.622300 kubelet[2756]: E0715 05:14:54.622251 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.622386 kubelet[2756]: E0715 05:14:54.622373 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.622386 kubelet[2756]: W0715 05:14:54.622381 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.622420 kubelet[2756]: E0715 05:14:54.622386 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:54.622575 kubelet[2756]: E0715 05:14:54.622562 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.622575 kubelet[2756]: W0715 05:14:54.622570 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.622616 kubelet[2756]: E0715 05:14:54.622576 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.622837 kubelet[2756]: E0715 05:14:54.622818 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.622837 kubelet[2756]: W0715 05:14:54.622828 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.622837 kubelet[2756]: E0715 05:14:54.622836 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.622982 kubelet[2756]: E0715 05:14:54.622967 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.622982 kubelet[2756]: W0715 05:14:54.622976 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.622982 kubelet[2756]: E0715 05:14:54.622983 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.623158 kubelet[2756]: E0715 05:14:54.623138 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.623158 kubelet[2756]: W0715 05:14:54.623146 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.623227 kubelet[2756]: E0715 05:14:54.623164 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.623333 kubelet[2756]: E0715 05:14:54.623316 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.623333 kubelet[2756]: W0715 05:14:54.623326 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.623333 kubelet[2756]: E0715 05:14:54.623333 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:54.623499 kubelet[2756]: E0715 05:14:54.623486 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.623499 kubelet[2756]: W0715 05:14:54.623494 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.623532 kubelet[2756]: E0715 05:14:54.623500 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.623707 kubelet[2756]: E0715 05:14:54.623696 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.623707 kubelet[2756]: W0715 05:14:54.623704 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.623761 kubelet[2756]: E0715 05:14:54.623710 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.624101 kubelet[2756]: E0715 05:14:54.624087 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.624101 kubelet[2756]: W0715 05:14:54.624096 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.624143 kubelet[2756]: E0715 05:14:54.624120 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.624295 kubelet[2756]: E0715 05:14:54.624282 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.624295 kubelet[2756]: W0715 05:14:54.624290 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.624333 kubelet[2756]: E0715 05:14:54.624296 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.624431 kubelet[2756]: E0715 05:14:54.624422 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.624431 kubelet[2756]: W0715 05:14:54.624429 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.624466 kubelet[2756]: E0715 05:14:54.624434 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:54.624585 kubelet[2756]: E0715 05:14:54.624571 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.624585 kubelet[2756]: W0715 05:14:54.624578 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.624585 kubelet[2756]: E0715 05:14:54.624584 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:54.624883 kubelet[2756]: E0715 05:14:54.624869 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:54.624883 kubelet[2756]: W0715 05:14:54.624877 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:54.624926 kubelet[2756]: E0715 05:14:54.624883 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.364184 kubelet[2756]: E0715 05:14:55.363795 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5fd7c" podUID="47d635b5-605e-4c77-ba36-e640d13113f8" Jul 15 05:14:55.542370 kubelet[2756]: I0715 05:14:55.542306 2756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:14:55.610410 kubelet[2756]: E0715 05:14:55.610362 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.610410 kubelet[2756]: W0715 05:14:55.610388 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.610410 kubelet[2756]: E0715 05:14:55.610407 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.611288 kubelet[2756]: E0715 05:14:55.610563 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.611288 kubelet[2756]: W0715 05:14:55.610569 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.611288 kubelet[2756]: E0715 05:14:55.610577 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:55.611288 kubelet[2756]: E0715 05:14:55.610741 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.611288 kubelet[2756]: W0715 05:14:55.610748 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.611288 kubelet[2756]: E0715 05:14:55.610769 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.611288 kubelet[2756]: E0715 05:14:55.610911 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.611288 kubelet[2756]: W0715 05:14:55.610916 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.611288 kubelet[2756]: E0715 05:14:55.610936 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.611288 kubelet[2756]: E0715 05:14:55.611068 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.611950 kubelet[2756]: W0715 05:14:55.611073 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.611950 kubelet[2756]: E0715 05:14:55.611079 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.611950 kubelet[2756]: E0715 05:14:55.611229 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.611950 kubelet[2756]: W0715 05:14:55.611234 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.611950 kubelet[2756]: E0715 05:14:55.611240 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.611950 kubelet[2756]: E0715 05:14:55.611382 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.611950 kubelet[2756]: W0715 05:14:55.611387 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.611950 kubelet[2756]: E0715 05:14:55.611392 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:55.611950 kubelet[2756]: E0715 05:14:55.611538 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.611950 kubelet[2756]: W0715 05:14:55.611543 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.612462 kubelet[2756]: E0715 05:14:55.611549 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.612462 kubelet[2756]: E0715 05:14:55.611716 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.612462 kubelet[2756]: W0715 05:14:55.611726 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.612462 kubelet[2756]: E0715 05:14:55.611755 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.612462 kubelet[2756]: E0715 05:14:55.611889 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.612462 kubelet[2756]: W0715 05:14:55.611894 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.612462 kubelet[2756]: E0715 05:14:55.611916 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.612462 kubelet[2756]: E0715 05:14:55.612035 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.612462 kubelet[2756]: W0715 05:14:55.612040 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.612462 kubelet[2756]: E0715 05:14:55.612045 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.612978 kubelet[2756]: E0715 05:14:55.612190 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.612978 kubelet[2756]: W0715 05:14:55.612195 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.612978 kubelet[2756]: E0715 05:14:55.612200 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:55.612978 kubelet[2756]: E0715 05:14:55.612342 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.612978 kubelet[2756]: W0715 05:14:55.612348 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.612978 kubelet[2756]: E0715 05:14:55.612354 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.612978 kubelet[2756]: E0715 05:14:55.612502 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.612978 kubelet[2756]: W0715 05:14:55.612508 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.612978 kubelet[2756]: E0715 05:14:55.612514 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.612978 kubelet[2756]: E0715 05:14:55.612637 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.613524 kubelet[2756]: W0715 05:14:55.612641 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.613524 kubelet[2756]: E0715 05:14:55.612661 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.630822 kubelet[2756]: E0715 05:14:55.629875 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.630822 kubelet[2756]: W0715 05:14:55.629896 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.630822 kubelet[2756]: E0715 05:14:55.629928 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.630822 kubelet[2756]: E0715 05:14:55.630306 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.630822 kubelet[2756]: W0715 05:14:55.630315 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.630822 kubelet[2756]: E0715 05:14:55.630325 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:55.630822 kubelet[2756]: E0715 05:14:55.630844 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.630822 kubelet[2756]: W0715 05:14:55.630853 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.630822 kubelet[2756]: E0715 05:14:55.630862 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.634058 kubelet[2756]: E0715 05:14:55.634030 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.634113 kubelet[2756]: W0715 05:14:55.634062 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.634113 kubelet[2756]: E0715 05:14:55.634088 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.634630 kubelet[2756]: E0715 05:14:55.634599 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.634679 kubelet[2756]: W0715 05:14:55.634632 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.634679 kubelet[2756]: E0715 05:14:55.634655 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.635612 kubelet[2756]: E0715 05:14:55.635582 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.635651 kubelet[2756]: W0715 05:14:55.635614 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.635651 kubelet[2756]: E0715 05:14:55.635635 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.636178 kubelet[2756]: E0715 05:14:55.636104 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.636178 kubelet[2756]: W0715 05:14:55.636130 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.636178 kubelet[2756]: E0715 05:14:55.636167 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:55.636609 kubelet[2756]: E0715 05:14:55.636579 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.636609 kubelet[2756]: W0715 05:14:55.636596 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.636664 kubelet[2756]: E0715 05:14:55.636610 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.637442 kubelet[2756]: E0715 05:14:55.637017 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.637442 kubelet[2756]: W0715 05:14:55.637037 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.637442 kubelet[2756]: E0715 05:14:55.637054 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.637518 kubelet[2756]: E0715 05:14:55.637494 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.637518 kubelet[2756]: W0715 05:14:55.637509 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.637564 kubelet[2756]: E0715 05:14:55.637527 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.637979 kubelet[2756]: E0715 05:14:55.637955 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.637979 kubelet[2756]: W0715 05:14:55.637978 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.638076 kubelet[2756]: E0715 05:14:55.637991 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.638418 kubelet[2756]: E0715 05:14:55.638393 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.638456 kubelet[2756]: W0715 05:14:55.638420 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.638456 kubelet[2756]: E0715 05:14:55.638437 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:55.639251 kubelet[2756]: E0715 05:14:55.639214 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.639251 kubelet[2756]: W0715 05:14:55.639229 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.639251 kubelet[2756]: E0715 05:14:55.639243 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.640056 kubelet[2756]: E0715 05:14:55.639964 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.640056 kubelet[2756]: W0715 05:14:55.639986 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.640056 kubelet[2756]: E0715 05:14:55.640005 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.640432 kubelet[2756]: E0715 05:14:55.640406 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.640432 kubelet[2756]: W0715 05:14:55.640422 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.640485 kubelet[2756]: E0715 05:14:55.640436 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.640854 kubelet[2756]: E0715 05:14:55.640790 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.640854 kubelet[2756]: W0715 05:14:55.640805 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.640854 kubelet[2756]: E0715 05:14:55.640818 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.641454 kubelet[2756]: E0715 05:14:55.641174 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.641454 kubelet[2756]: W0715 05:14:55.641188 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.641454 kubelet[2756]: E0715 05:14:55.641200 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:14:55.641867 kubelet[2756]: E0715 05:14:55.641825 2756 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:14:55.641867 kubelet[2756]: W0715 05:14:55.641847 2756 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:14:55.641937 kubelet[2756]: E0715 05:14:55.641865 2756 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:14:55.852777 containerd[1607]: time="2025-07-15T05:14:55.852738302Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:55.853827 containerd[1607]: time="2025-07-15T05:14:55.853655272Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 15 05:14:55.854575 containerd[1607]: time="2025-07-15T05:14:55.854547974Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:55.856071 containerd[1607]: time="2025-07-15T05:14:55.856048268Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:14:55.856500 containerd[1607]: time="2025-07-15T05:14:55.856480063Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.803111378s" Jul 15 05:14:55.856570 containerd[1607]: time="2025-07-15T05:14:55.856558562Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 15 05:14:55.860934 containerd[1607]: time="2025-07-15T05:14:55.860918219Z" level=info msg="CreateContainer within sandbox \"976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 15 05:14:55.870921 containerd[1607]: time="2025-07-15T05:14:55.870898738Z" level=info msg="Container d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:14:55.875452 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3770837058.mount: Deactivated successfully. 
Jul 15 05:14:55.881252 containerd[1607]: time="2025-07-15T05:14:55.881179965Z" level=info msg="CreateContainer within sandbox \"976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1\"" Jul 15 05:14:55.882318 containerd[1607]: time="2025-07-15T05:14:55.882288733Z" level=info msg="StartContainer for \"d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1\"" Jul 15 05:14:55.884684 containerd[1607]: time="2025-07-15T05:14:55.884625280Z" level=info msg="connecting to shim d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1" address="unix:///run/containerd/s/ac1c60ff0341718428040350bcc2a4c870c7a6e419da37b6a00ec23bfd90629c" protocol=ttrpc version=3 Jul 15 05:14:55.911775 systemd[1]: Started cri-containerd-d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1.scope - libcontainer container d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1. Jul 15 05:14:55.970328 containerd[1607]: time="2025-07-15T05:14:55.970277876Z" level=info msg="StartContainer for \"d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1\" returns successfully" Jul 15 05:14:55.985114 systemd[1]: cri-containerd-d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1.scope: Deactivated successfully. Jul 15 05:14:55.997302 containerd[1607]: time="2025-07-15T05:14:55.997100705Z" level=info msg="received exit event container_id:\"d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1\" id:\"d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1\" pid:3486 exited_at:{seconds:1752556495 nanos:988767980}" Jul 15 05:14:55.997591 containerd[1607]: time="2025-07-15T05:14:55.997573070Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1\" id:\"d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1\" pid:3486 exited_at:{seconds:1752556495 nanos:988767980}" Jul 15 05:14:56.025828 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d2d8076d0cf43039c9b17511093d59d0152cb958beb1c413d8d71e61de8adaf1-rootfs.mount: Deactivated successfully. 
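
The repeated driver-call failures earlier in this log come from the kubelet probing the FlexVolume directory nodeagent~uds before anything has installed the uds binary: as the messages themselves show, the kubelet executes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the single argument init and tries to parse its stdout as JSON, so a missing executable produces empty output and "unexpected end of JSON input". As a rough illustration only (a hypothetical stand-in, not the real Calico uds driver), a minimal Python sketch of the handshake the kubelet is waiting for, assuming the conventional FlexVolume reply shape, could look like:

#!/usr/bin/env python3
# Hypothetical stand-in for the missing driver at
# /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds.
# The kubelet invokes the driver as "<driver> init" and parses stdout as JSON,
# which is why an absent executable shows up above as empty output and
# "unexpected end of JSON input".
import json
import sys


def main() -> int:
    command = sys.argv[1] if len(sys.argv) > 1 else ""
    if command == "init":
        # Assumed reply shape, following the usual FlexVolume convention:
        # a "status" field plus an "attach" capability flag.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
    else:
        # Unimplemented calls still answer with well-formed JSON.
        print(json.dumps({"status": "Not supported",
                          "message": "command %r not implemented" % command}))
    return 0


if __name__ == "__main__":
    sys.exit(main())

The flexvol-driver container created above from the pod2daemon-flexvol image appears to be what installs the real driver binary, which is presumably why the probe errors do not recur later in this excerpt.
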
Jul 15 05:14:56.545858 containerd[1607]: time="2025-07-15T05:14:56.545819919Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 15 05:14:57.364060 kubelet[2756]: E0715 05:14:57.364006 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5fd7c" podUID="47d635b5-605e-4c77-ba36-e640d13113f8" Jul 15 05:14:59.364522 kubelet[2756]: E0715 05:14:59.364465 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5fd7c" podUID="47d635b5-605e-4c77-ba36-e640d13113f8" Jul 15 05:15:01.002285 containerd[1607]: time="2025-07-15T05:15:01.002217358Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:01.003147 containerd[1607]: time="2025-07-15T05:15:01.003093452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Jul 15 05:15:01.004721 containerd[1607]: time="2025-07-15T05:15:01.004139304Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:01.006048 containerd[1607]: time="2025-07-15T05:15:01.006012932Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:01.006498 containerd[1607]: time="2025-07-15T05:15:01.006470429Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 4.460579891s" Jul 15 05:15:01.006653 containerd[1607]: time="2025-07-15T05:15:01.006641288Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Jul 15 05:15:01.010809 containerd[1607]: time="2025-07-15T05:15:01.010754770Z" level=info msg="CreateContainer within sandbox \"976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 15 05:15:01.022899 containerd[1607]: time="2025-07-15T05:15:01.021935105Z" level=info msg="Container cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:01.027605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount859847486.mount: Deactivated successfully. 
Jul 15 05:15:01.035944 containerd[1607]: time="2025-07-15T05:15:01.035893232Z" level=info msg="CreateContainer within sandbox \"976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df\"" Jul 15 05:15:01.037206 containerd[1607]: time="2025-07-15T05:15:01.037040854Z" level=info msg="StartContainer for \"cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df\"" Jul 15 05:15:01.040928 containerd[1607]: time="2025-07-15T05:15:01.040836169Z" level=info msg="connecting to shim cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df" address="unix:///run/containerd/s/ac1c60ff0341718428040350bcc2a4c870c7a6e419da37b6a00ec23bfd90629c" protocol=ttrpc version=3 Jul 15 05:15:01.062709 systemd[1]: Started cri-containerd-cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df.scope - libcontainer container cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df. Jul 15 05:15:01.108393 containerd[1607]: time="2025-07-15T05:15:01.108334415Z" level=info msg="StartContainer for \"cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df\" returns successfully" Jul 15 05:15:01.365761 kubelet[2756]: E0715 05:15:01.363889 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5fd7c" podUID="47d635b5-605e-4c77-ba36-e640d13113f8" Jul 15 05:15:01.561043 systemd[1]: cri-containerd-cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df.scope: Deactivated successfully. Jul 15 05:15:01.561313 systemd[1]: cri-containerd-cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df.scope: Consumed 406ms CPU time, 163.3M memory peak, 12.1M read from disk, 171.2M written to disk. Jul 15 05:15:01.609717 kubelet[2756]: I0715 05:15:01.609156 2756 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jul 15 05:15:01.612494 containerd[1607]: time="2025-07-15T05:15:01.612470340Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df\" id:\"cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df\" pid:3544 exited_at:{seconds:1752556501 nanos:612238802}" Jul 15 05:15:01.612623 containerd[1607]: time="2025-07-15T05:15:01.612612319Z" level=info msg="received exit event container_id:\"cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df\" id:\"cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df\" pid:3544 exited_at:{seconds:1752556501 nanos:612238802}" Jul 15 05:15:01.662908 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cc3783164761a2c173d11306c6863c9fc335e3c2654fc8e4d11d8749418624df-rootfs.mount: Deactivated successfully. Jul 15 05:15:01.671829 systemd[1]: Created slice kubepods-burstable-pode9766883_0839_4e35_8d93_e80f3df5e08d.slice - libcontainer container kubepods-burstable-pode9766883_0839_4e35_8d93_e80f3df5e08d.slice. 
Jul 15 05:15:01.680624 kubelet[2756]: I0715 05:15:01.680370 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94qxt\" (UniqueName: \"kubernetes.io/projected/e9766883-0839-4e35-8d93-e80f3df5e08d-kube-api-access-94qxt\") pod \"coredns-674b8bbfcf-zf4mn\" (UID: \"e9766883-0839-4e35-8d93-e80f3df5e08d\") " pod="kube-system/coredns-674b8bbfcf-zf4mn" Jul 15 05:15:01.680624 kubelet[2756]: I0715 05:15:01.680414 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nxpx\" (UniqueName: \"kubernetes.io/projected/b0fee464-fb97-4a07-8e93-e75fe141d636-kube-api-access-5nxpx\") pod \"whisker-79fb9b8df8-s7fqr\" (UID: \"b0fee464-fb97-4a07-8e93-e75fe141d636\") " pod="calico-system/whisker-79fb9b8df8-s7fqr" Jul 15 05:15:01.680624 kubelet[2756]: I0715 05:15:01.680430 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e9766883-0839-4e35-8d93-e80f3df5e08d-config-volume\") pod \"coredns-674b8bbfcf-zf4mn\" (UID: \"e9766883-0839-4e35-8d93-e80f3df5e08d\") " pod="kube-system/coredns-674b8bbfcf-zf4mn" Jul 15 05:15:01.680624 kubelet[2756]: I0715 05:15:01.680447 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c19bd3db-f118-488d-8572-089dbfcb231c-config-volume\") pod \"coredns-674b8bbfcf-w4hdl\" (UID: \"c19bd3db-f118-488d-8572-089dbfcb231c\") " pod="kube-system/coredns-674b8bbfcf-w4hdl" Jul 15 05:15:01.680624 kubelet[2756]: I0715 05:15:01.680474 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-ca-bundle\") pod \"whisker-79fb9b8df8-s7fqr\" (UID: \"b0fee464-fb97-4a07-8e93-e75fe141d636\") " pod="calico-system/whisker-79fb9b8df8-s7fqr" Jul 15 05:15:01.680811 kubelet[2756]: I0715 05:15:01.680500 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-backend-key-pair\") pod \"whisker-79fb9b8df8-s7fqr\" (UID: \"b0fee464-fb97-4a07-8e93-e75fe141d636\") " pod="calico-system/whisker-79fb9b8df8-s7fqr" Jul 15 05:15:01.680811 kubelet[2756]: I0715 05:15:01.680520 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f-calico-apiserver-certs\") pod \"calico-apiserver-758794b6cf-lx5t5\" (UID: \"e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f\") " pod="calico-apiserver/calico-apiserver-758794b6cf-lx5t5" Jul 15 05:15:01.680811 kubelet[2756]: I0715 05:15:01.680536 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf9ms\" (UniqueName: \"kubernetes.io/projected/83844cf0-3ac1-4b4a-a615-1a97a5ba273a-kube-api-access-vf9ms\") pod \"calico-apiserver-758794b6cf-hzkhv\" (UID: \"83844cf0-3ac1-4b4a-a615-1a97a5ba273a\") " pod="calico-apiserver/calico-apiserver-758794b6cf-hzkhv" Jul 15 05:15:01.680811 kubelet[2756]: I0715 05:15:01.680555 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f66w\" (UniqueName: 
\"kubernetes.io/projected/c19bd3db-f118-488d-8572-089dbfcb231c-kube-api-access-4f66w\") pod \"coredns-674b8bbfcf-w4hdl\" (UID: \"c19bd3db-f118-488d-8572-089dbfcb231c\") " pod="kube-system/coredns-674b8bbfcf-w4hdl" Jul 15 05:15:01.680811 kubelet[2756]: I0715 05:15:01.680570 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/83844cf0-3ac1-4b4a-a615-1a97a5ba273a-calico-apiserver-certs\") pod \"calico-apiserver-758794b6cf-hzkhv\" (UID: \"83844cf0-3ac1-4b4a-a615-1a97a5ba273a\") " pod="calico-apiserver/calico-apiserver-758794b6cf-hzkhv" Jul 15 05:15:01.683514 kubelet[2756]: I0715 05:15:01.680583 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4gpc\" (UniqueName: \"kubernetes.io/projected/e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f-kube-api-access-j4gpc\") pod \"calico-apiserver-758794b6cf-lx5t5\" (UID: \"e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f\") " pod="calico-apiserver/calico-apiserver-758794b6cf-lx5t5" Jul 15 05:15:01.687350 systemd[1]: Created slice kubepods-besteffort-pod317d09d9_a2c7_4c4b_9c92_5e5751ee42b1.slice - libcontainer container kubepods-besteffort-pod317d09d9_a2c7_4c4b_9c92_5e5751ee42b1.slice. Jul 15 05:15:01.738769 systemd[1]: Created slice kubepods-burstable-podc19bd3db_f118_488d_8572_089dbfcb231c.slice - libcontainer container kubepods-burstable-podc19bd3db_f118_488d_8572_089dbfcb231c.slice. Jul 15 05:15:01.750332 systemd[1]: Created slice kubepods-besteffort-podb0fee464_fb97_4a07_8e93_e75fe141d636.slice - libcontainer container kubepods-besteffort-podb0fee464_fb97_4a07_8e93_e75fe141d636.slice. Jul 15 05:15:01.759907 systemd[1]: Created slice kubepods-besteffort-pode25c6bf2_5b68_4d6d_a4f4_3b091e386e1f.slice - libcontainer container kubepods-besteffort-pode25c6bf2_5b68_4d6d_a4f4_3b091e386e1f.slice. Jul 15 05:15:01.768988 systemd[1]: Created slice kubepods-besteffort-podef68457b_55d3_4298_9f51_b7d63fa11f06.slice - libcontainer container kubepods-besteffort-podef68457b_55d3_4298_9f51_b7d63fa11f06.slice. Jul 15 05:15:01.774536 systemd[1]: Created slice kubepods-besteffort-pod83844cf0_3ac1_4b4a_a615_1a97a5ba273a.slice - libcontainer container kubepods-besteffort-pod83844cf0_3ac1_4b4a_a615_1a97a5ba273a.slice. 
Jul 15 05:15:01.781372 kubelet[2756]: I0715 05:15:01.781338 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jmzl\" (UniqueName: \"kubernetes.io/projected/317d09d9-a2c7-4c4b-9c92-5e5751ee42b1-kube-api-access-2jmzl\") pod \"goldmane-768f4c5c69-pgtpk\" (UID: \"317d09d9-a2c7-4c4b-9c92-5e5751ee42b1\") " pod="calico-system/goldmane-768f4c5c69-pgtpk" Jul 15 05:15:01.781481 kubelet[2756]: I0715 05:15:01.781379 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/317d09d9-a2c7-4c4b-9c92-5e5751ee42b1-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-pgtpk\" (UID: \"317d09d9-a2c7-4c4b-9c92-5e5751ee42b1\") " pod="calico-system/goldmane-768f4c5c69-pgtpk" Jul 15 05:15:01.781481 kubelet[2756]: I0715 05:15:01.781401 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ef68457b-55d3-4298-9f51-b7d63fa11f06-tigera-ca-bundle\") pod \"calico-kube-controllers-7bd8cf7f64-cj4fk\" (UID: \"ef68457b-55d3-4298-9f51-b7d63fa11f06\") " pod="calico-system/calico-kube-controllers-7bd8cf7f64-cj4fk" Jul 15 05:15:01.781481 kubelet[2756]: I0715 05:15:01.781412 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-272tj\" (UniqueName: \"kubernetes.io/projected/ef68457b-55d3-4298-9f51-b7d63fa11f06-kube-api-access-272tj\") pod \"calico-kube-controllers-7bd8cf7f64-cj4fk\" (UID: \"ef68457b-55d3-4298-9f51-b7d63fa11f06\") " pod="calico-system/calico-kube-controllers-7bd8cf7f64-cj4fk" Jul 15 05:15:01.782126 kubelet[2756]: I0715 05:15:01.781482 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/317d09d9-a2c7-4c4b-9c92-5e5751ee42b1-config\") pod \"goldmane-768f4c5c69-pgtpk\" (UID: \"317d09d9-a2c7-4c4b-9c92-5e5751ee42b1\") " pod="calico-system/goldmane-768f4c5c69-pgtpk" Jul 15 05:15:01.782126 kubelet[2756]: I0715 05:15:01.781497 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/317d09d9-a2c7-4c4b-9c92-5e5751ee42b1-goldmane-key-pair\") pod \"goldmane-768f4c5c69-pgtpk\" (UID: \"317d09d9-a2c7-4c4b-9c92-5e5751ee42b1\") " pod="calico-system/goldmane-768f4c5c69-pgtpk" Jul 15 05:15:01.985134 containerd[1607]: time="2025-07-15T05:15:01.985017058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zf4mn,Uid:e9766883-0839-4e35-8d93-e80f3df5e08d,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:02.026785 containerd[1607]: time="2025-07-15T05:15:02.025854865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-pgtpk,Uid:317d09d9-a2c7-4c4b-9c92-5e5751ee42b1,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:02.048368 containerd[1607]: time="2025-07-15T05:15:02.048337164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w4hdl,Uid:c19bd3db-f118-488d-8572-089dbfcb231c,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:02.058091 containerd[1607]: time="2025-07-15T05:15:02.058051413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79fb9b8df8-s7fqr,Uid:b0fee464-fb97-4a07-8e93-e75fe141d636,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:02.064931 containerd[1607]: time="2025-07-15T05:15:02.064758081Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-lx5t5,Uid:e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:15:02.074188 containerd[1607]: time="2025-07-15T05:15:02.074105473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bd8cf7f64-cj4fk,Uid:ef68457b-55d3-4298-9f51-b7d63fa11f06,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:02.080991 containerd[1607]: time="2025-07-15T05:15:02.080824751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-hzkhv,Uid:83844cf0-3ac1-4b4a-a615-1a97a5ba273a,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:15:02.233928 containerd[1607]: time="2025-07-15T05:15:02.233855822Z" level=error msg="Failed to destroy network for sandbox \"6fd447ddcfc5a9723a16f0fa2389a04a5e58d569666ab1d7d640534fa46a48e3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.234368 containerd[1607]: time="2025-07-15T05:15:02.234202260Z" level=error msg="Failed to destroy network for sandbox \"3a75cc7f52b5ecd1146a9f011fbeef861abbef941e67c47f40243424e0dcf46e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.236274 containerd[1607]: time="2025-07-15T05:15:02.236049877Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-79fb9b8df8-s7fqr,Uid:b0fee464-fb97-4a07-8e93-e75fe141d636,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a75cc7f52b5ecd1146a9f011fbeef861abbef941e67c47f40243424e0dcf46e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.243868 containerd[1607]: time="2025-07-15T05:15:02.243824849Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bd8cf7f64-cj4fk,Uid:ef68457b-55d3-4298-9f51-b7d63fa11f06,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd447ddcfc5a9723a16f0fa2389a04a5e58d569666ab1d7d640534fa46a48e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.249508 kubelet[2756]: E0715 05:15:02.247432 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a75cc7f52b5ecd1146a9f011fbeef861abbef941e67c47f40243424e0dcf46e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.249508 kubelet[2756]: E0715 05:15:02.247520 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a75cc7f52b5ecd1146a9f011fbeef861abbef941e67c47f40243424e0dcf46e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/whisker-79fb9b8df8-s7fqr" Jul 15 05:15:02.249508 kubelet[2756]: E0715 05:15:02.247557 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a75cc7f52b5ecd1146a9f011fbeef861abbef941e67c47f40243424e0dcf46e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-79fb9b8df8-s7fqr" Jul 15 05:15:02.257818 kubelet[2756]: E0715 05:15:02.257773 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-79fb9b8df8-s7fqr_calico-system(b0fee464-fb97-4a07-8e93-e75fe141d636)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-79fb9b8df8-s7fqr_calico-system(b0fee464-fb97-4a07-8e93-e75fe141d636)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a75cc7f52b5ecd1146a9f011fbeef861abbef941e67c47f40243424e0dcf46e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-79fb9b8df8-s7fqr" podUID="b0fee464-fb97-4a07-8e93-e75fe141d636" Jul 15 05:15:02.259265 kubelet[2756]: E0715 05:15:02.259235 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd447ddcfc5a9723a16f0fa2389a04a5e58d569666ab1d7d640534fa46a48e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.259334 kubelet[2756]: E0715 05:15:02.259278 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd447ddcfc5a9723a16f0fa2389a04a5e58d569666ab1d7d640534fa46a48e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bd8cf7f64-cj4fk" Jul 15 05:15:02.259334 kubelet[2756]: E0715 05:15:02.259295 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd447ddcfc5a9723a16f0fa2389a04a5e58d569666ab1d7d640534fa46a48e3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7bd8cf7f64-cj4fk" Jul 15 05:15:02.259334 kubelet[2756]: E0715 05:15:02.259328 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7bd8cf7f64-cj4fk_calico-system(ef68457b-55d3-4298-9f51-b7d63fa11f06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7bd8cf7f64-cj4fk_calico-system(ef68457b-55d3-4298-9f51-b7d63fa11f06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6fd447ddcfc5a9723a16f0fa2389a04a5e58d569666ab1d7d640534fa46a48e3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-7bd8cf7f64-cj4fk" podUID="ef68457b-55d3-4298-9f51-b7d63fa11f06" Jul 15 05:15:02.267405 containerd[1607]: time="2025-07-15T05:15:02.267367051Z" level=error msg="Failed to destroy network for sandbox \"709ec5eae86a71f5e4a022025c3e2ff17c201fc1cc2925cd45d95c81f556d336\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.269283 containerd[1607]: time="2025-07-15T05:15:02.269249369Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w4hdl,Uid:c19bd3db-f118-488d-8572-089dbfcb231c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"709ec5eae86a71f5e4a022025c3e2ff17c201fc1cc2925cd45d95c81f556d336\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.271283 kubelet[2756]: E0715 05:15:02.271066 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"709ec5eae86a71f5e4a022025c3e2ff17c201fc1cc2925cd45d95c81f556d336\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.271283 kubelet[2756]: E0715 05:15:02.271149 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"709ec5eae86a71f5e4a022025c3e2ff17c201fc1cc2925cd45d95c81f556d336\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-w4hdl" Jul 15 05:15:02.271283 kubelet[2756]: E0715 05:15:02.271165 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"709ec5eae86a71f5e4a022025c3e2ff17c201fc1cc2925cd45d95c81f556d336\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-w4hdl" Jul 15 05:15:02.271417 kubelet[2756]: E0715 05:15:02.271241 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-w4hdl_kube-system(c19bd3db-f118-488d-8572-089dbfcb231c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-w4hdl_kube-system(c19bd3db-f118-488d-8572-089dbfcb231c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"709ec5eae86a71f5e4a022025c3e2ff17c201fc1cc2925cd45d95c81f556d336\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-w4hdl" podUID="c19bd3db-f118-488d-8572-089dbfcb231c" Jul 15 05:15:02.279311 containerd[1607]: time="2025-07-15T05:15:02.279258616Z" level=error msg="Failed to destroy network for sandbox \"b9ef50e6741e126a5ceddf655e90174ba09da426653dcc6d0e09978afb9c45fe\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.281498 containerd[1607]: time="2025-07-15T05:15:02.281447513Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-hzkhv,Uid:83844cf0-3ac1-4b4a-a615-1a97a5ba273a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9ef50e6741e126a5ceddf655e90174ba09da426653dcc6d0e09978afb9c45fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.282099 kubelet[2756]: E0715 05:15:02.281985 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9ef50e6741e126a5ceddf655e90174ba09da426653dcc6d0e09978afb9c45fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.282099 kubelet[2756]: E0715 05:15:02.282045 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9ef50e6741e126a5ceddf655e90174ba09da426653dcc6d0e09978afb9c45fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-758794b6cf-hzkhv" Jul 15 05:15:02.282099 kubelet[2756]: E0715 05:15:02.282064 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9ef50e6741e126a5ceddf655e90174ba09da426653dcc6d0e09978afb9c45fe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-758794b6cf-hzkhv" Jul 15 05:15:02.282331 kubelet[2756]: E0715 05:15:02.282295 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-758794b6cf-hzkhv_calico-apiserver(83844cf0-3ac1-4b4a-a615-1a97a5ba273a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-758794b6cf-hzkhv_calico-apiserver(83844cf0-3ac1-4b4a-a615-1a97a5ba273a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9ef50e6741e126a5ceddf655e90174ba09da426653dcc6d0e09978afb9c45fe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-758794b6cf-hzkhv" podUID="83844cf0-3ac1-4b4a-a615-1a97a5ba273a" Jul 15 05:15:02.287907 containerd[1607]: time="2025-07-15T05:15:02.287859343Z" level=error msg="Failed to destroy network for sandbox \"8935b4dee12d932afc7dab095153baf705662f87e00766a95843b44b013f2d48\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.290024 containerd[1607]: time="2025-07-15T05:15:02.289991259Z" level=error msg="Failed to destroy network for sandbox 
\"b9404e57edbe4f21192db18a35535daf1a5d384b7ef013826e5e9a1487cb2c52\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.290256 containerd[1607]: time="2025-07-15T05:15:02.290227148Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-pgtpk,Uid:317d09d9-a2c7-4c4b-9c92-5e5751ee42b1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8935b4dee12d932afc7dab095153baf705662f87e00766a95843b44b013f2d48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.290591 kubelet[2756]: E0715 05:15:02.290536 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8935b4dee12d932afc7dab095153baf705662f87e00766a95843b44b013f2d48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.290591 kubelet[2756]: E0715 05:15:02.290579 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8935b4dee12d932afc7dab095153baf705662f87e00766a95843b44b013f2d48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-pgtpk" Jul 15 05:15:02.290649 kubelet[2756]: E0715 05:15:02.290596 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8935b4dee12d932afc7dab095153baf705662f87e00766a95843b44b013f2d48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-pgtpk" Jul 15 05:15:02.290649 kubelet[2756]: E0715 05:15:02.290638 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-pgtpk_calico-system(317d09d9-a2c7-4c4b-9c92-5e5751ee42b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-pgtpk_calico-system(317d09d9-a2c7-4c4b-9c92-5e5751ee42b1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8935b4dee12d932afc7dab095153baf705662f87e00766a95843b44b013f2d48\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-pgtpk" podUID="317d09d9-a2c7-4c4b-9c92-5e5751ee42b1" Jul 15 05:15:02.291386 containerd[1607]: time="2025-07-15T05:15:02.291264981Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zf4mn,Uid:e9766883-0839-4e35-8d93-e80f3df5e08d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9404e57edbe4f21192db18a35535daf1a5d384b7ef013826e5e9a1487cb2c52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.291444 containerd[1607]: time="2025-07-15T05:15:02.291399410Z" level=error msg="Failed to destroy network for sandbox \"b3f70af25f2b64877f5e5f0738d338a1b5fb16f60a7317d9023adedc6f3d6cd1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.291705 kubelet[2756]: E0715 05:15:02.291628 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9404e57edbe4f21192db18a35535daf1a5d384b7ef013826e5e9a1487cb2c52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.291912 kubelet[2756]: E0715 05:15:02.291807 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9404e57edbe4f21192db18a35535daf1a5d384b7ef013826e5e9a1487cb2c52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zf4mn" Jul 15 05:15:02.291999 kubelet[2756]: E0715 05:15:02.291892 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9404e57edbe4f21192db18a35535daf1a5d384b7ef013826e5e9a1487cb2c52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zf4mn" Jul 15 05:15:02.292133 kubelet[2756]: E0715 05:15:02.292102 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-zf4mn_kube-system(e9766883-0839-4e35-8d93-e80f3df5e08d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-zf4mn_kube-system(e9766883-0839-4e35-8d93-e80f3df5e08d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9404e57edbe4f21192db18a35535daf1a5d384b7ef013826e5e9a1487cb2c52\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-zf4mn" podUID="e9766883-0839-4e35-8d93-e80f3df5e08d" Jul 15 05:15:02.292780 containerd[1607]: time="2025-07-15T05:15:02.292746252Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-lx5t5,Uid:e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3f70af25f2b64877f5e5f0738d338a1b5fb16f60a7317d9023adedc6f3d6cd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.293002 kubelet[2756]: E0715 05:15:02.292871 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b3f70af25f2b64877f5e5f0738d338a1b5fb16f60a7317d9023adedc6f3d6cd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:02.293002 kubelet[2756]: E0715 05:15:02.292893 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3f70af25f2b64877f5e5f0738d338a1b5fb16f60a7317d9023adedc6f3d6cd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-758794b6cf-lx5t5" Jul 15 05:15:02.293002 kubelet[2756]: E0715 05:15:02.292905 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b3f70af25f2b64877f5e5f0738d338a1b5fb16f60a7317d9023adedc6f3d6cd1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-758794b6cf-lx5t5" Jul 15 05:15:02.293094 kubelet[2756]: E0715 05:15:02.292932 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-758794b6cf-lx5t5_calico-apiserver(e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-758794b6cf-lx5t5_calico-apiserver(e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b3f70af25f2b64877f5e5f0738d338a1b5fb16f60a7317d9023adedc6f3d6cd1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-758794b6cf-lx5t5" podUID="e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f" Jul 15 05:15:02.585977 containerd[1607]: time="2025-07-15T05:15:02.585775685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 15 05:15:03.024022 systemd[1]: run-netns-cni\x2dd4981419\x2d685d\x2d7034\x2dfa49\x2d8ce0963019c2.mount: Deactivated successfully. Jul 15 05:15:03.024217 systemd[1]: run-netns-cni\x2dedc77a7d\x2d4852\x2df6e8\x2d0c23\x2d1f4be6e2dfc9.mount: Deactivated successfully. Jul 15 05:15:03.024341 systemd[1]: run-netns-cni\x2d5ec08d98\x2d24ef\x2d974e\x2d9c4d\x2d9467600dee39.mount: Deactivated successfully. Jul 15 05:15:03.024498 systemd[1]: run-netns-cni\x2df416c991\x2d78e9\x2d3d6c\x2d3d93\x2de3fbc58f0adf.mount: Deactivated successfully. Jul 15 05:15:03.024630 systemd[1]: run-netns-cni\x2d197f27f7\x2d88ae\x2dac12\x2d7642\x2d24ce5210d885.mount: Deactivated successfully. Jul 15 05:15:03.373991 systemd[1]: Created slice kubepods-besteffort-pod47d635b5_605e_4c77_ba36_e640d13113f8.slice - libcontainer container kubepods-besteffort-pod47d635b5_605e_4c77_ba36_e640d13113f8.slice. 
Jul 15 05:15:03.378432 containerd[1607]: time="2025-07-15T05:15:03.378359844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5fd7c,Uid:47d635b5-605e-4c77-ba36-e640d13113f8,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:03.459175 containerd[1607]: time="2025-07-15T05:15:03.459023952Z" level=error msg="Failed to destroy network for sandbox \"83f3f55ffed8827ddc966e4e59ec1e8f3e17b3475600ac91e205fd2c04ff5603\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:03.461194 systemd[1]: run-netns-cni\x2d47bd5e88\x2d1ff6\x2d17cd\x2d13e3\x2d0648fcd69f59.mount: Deactivated successfully. Jul 15 05:15:03.463109 containerd[1607]: time="2025-07-15T05:15:03.463053198Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5fd7c,Uid:47d635b5-605e-4c77-ba36-e640d13113f8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"83f3f55ffed8827ddc966e4e59ec1e8f3e17b3475600ac91e205fd2c04ff5603\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:03.463489 kubelet[2756]: E0715 05:15:03.463419 2756 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83f3f55ffed8827ddc966e4e59ec1e8f3e17b3475600ac91e205fd2c04ff5603\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:15:03.464332 kubelet[2756]: E0715 05:15:03.463760 2756 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83f3f55ffed8827ddc966e4e59ec1e8f3e17b3475600ac91e205fd2c04ff5603\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5fd7c" Jul 15 05:15:03.464332 kubelet[2756]: E0715 05:15:03.463785 2756 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"83f3f55ffed8827ddc966e4e59ec1e8f3e17b3475600ac91e205fd2c04ff5603\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5fd7c" Jul 15 05:15:03.464332 kubelet[2756]: E0715 05:15:03.463834 2756 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5fd7c_calico-system(47d635b5-605e-4c77-ba36-e640d13113f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5fd7c_calico-system(47d635b5-605e-4c77-ba36-e640d13113f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"83f3f55ffed8827ddc966e4e59ec1e8f3e17b3475600ac91e205fd2c04ff5603\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5fd7c" podUID="47d635b5-605e-4c77-ba36-e640d13113f8" 
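All of the CreatePodSandbox failures above (coredns, calico-apiserver, csi-node-driver) share one root cause: the Calico CNI plugin cannot stat /var/lib/calico/nodename, a file that calico/node only writes once it is running and has /var/lib/calico/ mounted. A minimal Go sketch of that check, using the path and hint text quoted in the log (the helper name is hypothetical, not Calico's actual code):

```go
// Sketch of the check behind the repeated CNI error above.
// The path and hint come from the log; the helper name and error
// wrapping are hypothetical, not Calico's implementation.
package main

import (
	"fmt"
	"os"
	"strings"
)

func readCalicoNodename() (string, error) {
	data, err := os.ReadFile("/var/lib/calico/nodename")
	if os.IsNotExist(err) {
		// Same hint the plugin prints: the file appears only after
		// calico/node has started and mounted /var/lib/calico/.
		return "", fmt.Errorf("stat /var/lib/calico/nodename: %w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := readCalicoNodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("calico node name:", name)
}
```

Consistent with this, the failures stop recurring once the calico-node container starts successfully at 05:15:10 below, after which the retried sandbox creations (whisker at 05:15:12, the others at 05:15:14) go through.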
Jul 15 05:15:06.433033 kubelet[2756]: I0715 05:15:06.432949 2756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:15:10.459047 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1381856855.mount: Deactivated successfully. Jul 15 05:15:10.532154 containerd[1607]: time="2025-07-15T05:15:10.523297520Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:10.537228 containerd[1607]: time="2025-07-15T05:15:10.537191350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 15 05:15:10.539830 containerd[1607]: time="2025-07-15T05:15:10.539780231Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:10.541707 containerd[1607]: time="2025-07-15T05:15:10.541658735Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:10.544124 containerd[1607]: time="2025-07-15T05:15:10.544088016Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 7.95604797s" Jul 15 05:15:10.544213 containerd[1607]: time="2025-07-15T05:15:10.544197846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 15 05:15:10.575271 containerd[1607]: time="2025-07-15T05:15:10.574766136Z" level=info msg="CreateContainer within sandbox \"976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 15 05:15:10.599983 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount123424380.mount: Deactivated successfully. Jul 15 05:15:10.600544 containerd[1607]: time="2025-07-15T05:15:10.600503944Z" level=info msg="Container fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:10.620211 containerd[1607]: time="2025-07-15T05:15:10.620164684Z" level=info msg="CreateContainer within sandbox \"976b1a34a262b4ddc5ec4e11af0e44903ae42bbf2190314d100271abacbbfbe1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\"" Jul 15 05:15:10.620868 containerd[1607]: time="2025-07-15T05:15:10.620837992Z" level=info msg="StartContainer for \"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\"" Jul 15 05:15:10.624206 containerd[1607]: time="2025-07-15T05:15:10.623695901Z" level=info msg="connecting to shim fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f" address="unix:///run/containerd/s/ac1c60ff0341718428040350bcc2a4c870c7a6e419da37b6a00ec23bfd90629c" protocol=ttrpc version=3 Jul 15 05:15:10.707133 systemd[1]: Started cri-containerd-fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f.scope - libcontainer container fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f. 
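The lines above show the other half of the story: containerd finishes pulling ghcr.io/flatcar/calico/node:v3.30.2 (about 158 MB in roughly 8 s), creates the calico-node container inside its existing sandbox, and connects to a per-container shim over ttrpc before systemd starts the cri-containerd scope. For orientation, the same pull → create → start lifecycle can be driven directly with containerd's Go client; this is a hedged standalone sketch, not the CRI/ttrpc path kubelet actually takes above, and the socket path and IDs are assumed defaults:

```go
// Standalone sketch of a pull -> create -> start lifecycle against the
// containerd Go client. NOT the CRI path kubelet uses in the log above;
// socket path and container IDs are assumed for illustration.
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/cio"
	"github.com/containerd/containerd/namespaces"
	"github.com/containerd/containerd/oci"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images and containers live in the k8s.io namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	image, err := client.Pull(ctx, "ghcr.io/flatcar/calico/node:v3.30.2", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	container, err := client.NewContainer(ctx, "calico-node-demo",
		containerd.WithNewSnapshot("calico-node-demo-snapshot", image),
		containerd.WithNewSpec(oci.WithImageConfig(image)),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer container.Delete(ctx, containerd.WithSnapshotCleanup)

	// The task is the running instance; containerd spawns a shim for it,
	// analogous to the "connecting to shim ..." lines in the log.
	task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
	if err != nil {
		log.Fatal(err)
	}
	defer task.Delete(ctx)

	if err := task.Start(ctx); err != nil {
		log.Fatal(err)
	}
	log.Printf("started container, pid %d", task.Pid())
}
```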
Jul 15 05:15:10.783423 containerd[1607]: time="2025-07-15T05:15:10.783364961Z" level=info msg="StartContainer for \"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\" returns successfully" Jul 15 05:15:10.936080 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 15 05:15:10.937265 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jul 15 05:15:11.254710 kubelet[2756]: I0715 05:15:11.254654 2756 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-backend-key-pair\") pod \"b0fee464-fb97-4a07-8e93-e75fe141d636\" (UID: \"b0fee464-fb97-4a07-8e93-e75fe141d636\") " Jul 15 05:15:11.255863 kubelet[2756]: I0715 05:15:11.255818 2756 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nxpx\" (UniqueName: \"kubernetes.io/projected/b0fee464-fb97-4a07-8e93-e75fe141d636-kube-api-access-5nxpx\") pod \"b0fee464-fb97-4a07-8e93-e75fe141d636\" (UID: \"b0fee464-fb97-4a07-8e93-e75fe141d636\") " Jul 15 05:15:11.255933 kubelet[2756]: I0715 05:15:11.255903 2756 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-ca-bundle\") pod \"b0fee464-fb97-4a07-8e93-e75fe141d636\" (UID: \"b0fee464-fb97-4a07-8e93-e75fe141d636\") " Jul 15 05:15:11.260361 kubelet[2756]: I0715 05:15:11.260305 2756 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "b0fee464-fb97-4a07-8e93-e75fe141d636" (UID: "b0fee464-fb97-4a07-8e93-e75fe141d636"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jul 15 05:15:11.271659 kubelet[2756]: I0715 05:15:11.271599 2756 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fee464-fb97-4a07-8e93-e75fe141d636-kube-api-access-5nxpx" (OuterVolumeSpecName: "kube-api-access-5nxpx") pod "b0fee464-fb97-4a07-8e93-e75fe141d636" (UID: "b0fee464-fb97-4a07-8e93-e75fe141d636"). InnerVolumeSpecName "kube-api-access-5nxpx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 15 05:15:11.272893 kubelet[2756]: I0715 05:15:11.272843 2756 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "b0fee464-fb97-4a07-8e93-e75fe141d636" (UID: "b0fee464-fb97-4a07-8e93-e75fe141d636"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 15 05:15:11.356738 kubelet[2756]: I0715 05:15:11.356149 2756 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-backend-key-pair\") on node \"ci-4396-0-0-n-85c8113064\" DevicePath \"\"" Jul 15 05:15:11.356738 kubelet[2756]: I0715 05:15:11.356704 2756 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5nxpx\" (UniqueName: \"kubernetes.io/projected/b0fee464-fb97-4a07-8e93-e75fe141d636-kube-api-access-5nxpx\") on node \"ci-4396-0-0-n-85c8113064\" DevicePath \"\"" Jul 15 05:15:11.356738 kubelet[2756]: I0715 05:15:11.356716 2756 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fee464-fb97-4a07-8e93-e75fe141d636-whisker-ca-bundle\") on node \"ci-4396-0-0-n-85c8113064\" DevicePath \"\"" Jul 15 05:15:11.461822 systemd[1]: var-lib-kubelet-pods-b0fee464\x2dfb97\x2d4a07\x2d8e93\x2de75fe141d636-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5nxpx.mount: Deactivated successfully. Jul 15 05:15:11.461973 systemd[1]: var-lib-kubelet-pods-b0fee464\x2dfb97\x2d4a07\x2d8e93\x2de75fe141d636-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 15 05:15:11.634234 systemd[1]: Removed slice kubepods-besteffort-podb0fee464_fb97_4a07_8e93_e75fe141d636.slice - libcontainer container kubepods-besteffort-podb0fee464_fb97_4a07_8e93_e75fe141d636.slice. Jul 15 05:15:11.654345 kubelet[2756]: I0715 05:15:11.652358 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9zns4" podStartSLOduration=1.726092168 podStartE2EDuration="20.652343318s" podCreationTimestamp="2025-07-15 05:14:51 +0000 UTC" firstStartedPulling="2025-07-15 05:14:51.618745592 +0000 UTC m=+19.356019793" lastFinishedPulling="2025-07-15 05:15:10.544996742 +0000 UTC m=+38.282270943" observedRunningTime="2025-07-15 05:15:11.651077293 +0000 UTC m=+39.388351504" watchObservedRunningTime="2025-07-15 05:15:11.652343318 +0000 UTC m=+39.389617529" Jul 15 05:15:11.750708 systemd[1]: Created slice kubepods-besteffort-pod681aab73_df3d_433d_a797_f4e2410ed78e.slice - libcontainer container kubepods-besteffort-pod681aab73_df3d_433d_a797_f4e2410ed78e.slice. 
Jul 15 05:15:11.822474 containerd[1607]: time="2025-07-15T05:15:11.822440633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\" id:\"4e1d759688a1025bf47353da6c0dc94433d612f8e532da55c8459aa88632e93e\" pid:3884 exit_status:1 exited_at:{seconds:1752556511 nanos:821864695}" Jul 15 05:15:11.860722 kubelet[2756]: I0715 05:15:11.860686 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/681aab73-df3d-433d-a797-f4e2410ed78e-whisker-ca-bundle\") pod \"whisker-85bd7f758b-vxkst\" (UID: \"681aab73-df3d-433d-a797-f4e2410ed78e\") " pod="calico-system/whisker-85bd7f758b-vxkst" Jul 15 05:15:11.860722 kubelet[2756]: I0715 05:15:11.860720 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/681aab73-df3d-433d-a797-f4e2410ed78e-whisker-backend-key-pair\") pod \"whisker-85bd7f758b-vxkst\" (UID: \"681aab73-df3d-433d-a797-f4e2410ed78e\") " pod="calico-system/whisker-85bd7f758b-vxkst" Jul 15 05:15:11.860931 kubelet[2756]: I0715 05:15:11.860741 2756 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szs99\" (UniqueName: \"kubernetes.io/projected/681aab73-df3d-433d-a797-f4e2410ed78e-kube-api-access-szs99\") pod \"whisker-85bd7f758b-vxkst\" (UID: \"681aab73-df3d-433d-a797-f4e2410ed78e\") " pod="calico-system/whisker-85bd7f758b-vxkst" Jul 15 05:15:12.056118 containerd[1607]: time="2025-07-15T05:15:12.056056339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85bd7f758b-vxkst,Uid:681aab73-df3d-433d-a797-f4e2410ed78e,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:12.369484 kubelet[2756]: I0715 05:15:12.369274 2756 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fee464-fb97-4a07-8e93-e75fe141d636" path="/var/lib/kubelet/pods/b0fee464-fb97-4a07-8e93-e75fe141d636/volumes" Jul 15 05:15:12.377408 systemd-networkd[1477]: caliec89929e7bf: Link UP Jul 15 05:15:12.377881 systemd-networkd[1477]: caliec89929e7bf: Gained carrier Jul 15 05:15:12.399795 containerd[1607]: 2025-07-15 05:15:12.109 [INFO][3899] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:15:12.399795 containerd[1607]: 2025-07-15 05:15:12.137 [INFO][3899] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0 whisker-85bd7f758b- calico-system 681aab73-df3d-433d-a797-f4e2410ed78e 921 0 2025-07-15 05:15:11 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:85bd7f758b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 whisker-85bd7f758b-vxkst eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliec89929e7bf [] [] }} ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" Pod="whisker-85bd7f758b-vxkst" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-" Jul 15 05:15:12.399795 containerd[1607]: 2025-07-15 05:15:12.137 [INFO][3899] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" 
Pod="whisker-85bd7f758b-vxkst" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" Jul 15 05:15:12.399795 containerd[1607]: 2025-07-15 05:15:12.300 [INFO][3911] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" HandleID="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Workload="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.302 [INFO][3911] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" HandleID="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Workload="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00036a540), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-85c8113064", "pod":"whisker-85bd7f758b-vxkst", "timestamp":"2025-07-15 05:15:12.300506974 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.302 [INFO][3911] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.303 [INFO][3911] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.303 [INFO][3911] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.324 [INFO][3911] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.338 [INFO][3911] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.344 [INFO][3911] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.346 [INFO][3911] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.399992 containerd[1607]: 2025-07-15 05:15:12.348 [INFO][3911] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.400157 containerd[1607]: 2025-07-15 05:15:12.348 [INFO][3911] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.400157 containerd[1607]: 2025-07-15 05:15:12.349 [INFO][3911] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984 Jul 15 05:15:12.400157 containerd[1607]: 2025-07-15 05:15:12.354 [INFO][3911] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.400157 
containerd[1607]: 2025-07-15 05:15:12.360 [INFO][3911] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.1/26] block=192.168.0.0/26 handle="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.400157 containerd[1607]: 2025-07-15 05:15:12.360 [INFO][3911] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.1/26] handle="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:12.400157 containerd[1607]: 2025-07-15 05:15:12.360 [INFO][3911] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:15:12.400157 containerd[1607]: 2025-07-15 05:15:12.360 [INFO][3911] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.1/26] IPv6=[] ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" HandleID="k8s-pod-network.fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Workload="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" Jul 15 05:15:12.400271 containerd[1607]: 2025-07-15 05:15:12.364 [INFO][3899] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" Pod="whisker-85bd7f758b-vxkst" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0", GenerateName:"whisker-85bd7f758b-", Namespace:"calico-system", SelfLink:"", UID:"681aab73-df3d-433d-a797-f4e2410ed78e", ResourceVersion:"921", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 15, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85bd7f758b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"whisker-85bd7f758b-vxkst", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.0.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliec89929e7bf", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:12.400271 containerd[1607]: 2025-07-15 05:15:12.364 [INFO][3899] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.1/32] ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" Pod="whisker-85bd7f758b-vxkst" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" Jul 15 05:15:12.400325 containerd[1607]: 2025-07-15 05:15:12.364 [INFO][3899] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliec89929e7bf ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" Pod="whisker-85bd7f758b-vxkst" 
WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" Jul 15 05:15:12.400325 containerd[1607]: 2025-07-15 05:15:12.378 [INFO][3899] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" Pod="whisker-85bd7f758b-vxkst" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" Jul 15 05:15:12.400354 containerd[1607]: 2025-07-15 05:15:12.378 [INFO][3899] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" Pod="whisker-85bd7f758b-vxkst" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0", GenerateName:"whisker-85bd7f758b-", Namespace:"calico-system", SelfLink:"", UID:"681aab73-df3d-433d-a797-f4e2410ed78e", ResourceVersion:"921", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 15, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85bd7f758b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984", Pod:"whisker-85bd7f758b-vxkst", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.0.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliec89929e7bf", MAC:"1a:e6:b0:5e:74:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:12.400393 containerd[1607]: 2025-07-15 05:15:12.388 [INFO][3899] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" Namespace="calico-system" Pod="whisker-85bd7f758b-vxkst" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-whisker--85bd7f758b--vxkst-eth0" Jul 15 05:15:12.564565 containerd[1607]: time="2025-07-15T05:15:12.564067440Z" level=info msg="connecting to shim fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984" address="unix:///run/containerd/s/b1b11c817460e5a32390e2fc4d71b2eb26a8d522aa63abf641b1e8fb66682b38" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:12.616489 systemd[1]: Started cri-containerd-fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984.scope - libcontainer container fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984. 
Jul 15 05:15:12.755568 containerd[1607]: time="2025-07-15T05:15:12.754714850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85bd7f758b-vxkst,Uid:681aab73-df3d-433d-a797-f4e2410ed78e,Namespace:calico-system,Attempt:0,} returns sandbox id \"fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984\"" Jul 15 05:15:12.762714 containerd[1607]: time="2025-07-15T05:15:12.762001888Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 15 05:15:12.903247 containerd[1607]: time="2025-07-15T05:15:12.902574334Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\" id:\"4a87977f68c280dd9663a3bc2f81585fca3fa9f2d5efc4a26c9520ff08011e6b\" pid:4063 exit_status:1 exited_at:{seconds:1752556512 nanos:901913196}" Jul 15 05:15:13.139160 systemd-networkd[1477]: vxlan.calico: Link UP Jul 15 05:15:13.140277 systemd-networkd[1477]: vxlan.calico: Gained carrier Jul 15 05:15:13.871655 systemd-networkd[1477]: caliec89929e7bf: Gained IPv6LL Jul 15 05:15:14.366010 containerd[1607]: time="2025-07-15T05:15:14.365604007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w4hdl,Uid:c19bd3db-f118-488d-8572-089dbfcb231c,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:14.367013 containerd[1607]: time="2025-07-15T05:15:14.366489335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-pgtpk,Uid:317d09d9-a2c7-4c4b-9c92-5e5751ee42b1,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:14.367013 containerd[1607]: time="2025-07-15T05:15:14.366612265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-lx5t5,Uid:e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:15:14.367013 containerd[1607]: time="2025-07-15T05:15:14.366761954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5fd7c,Uid:47d635b5-605e-4c77-ba36-e640d13113f8,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:14.367013 containerd[1607]: time="2025-07-15T05:15:14.366936934Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bd8cf7f64-cj4fk,Uid:ef68457b-55d3-4298-9f51-b7d63fa11f06,Namespace:calico-system,Attempt:0,}" Jul 15 05:15:14.574960 systemd-networkd[1477]: vxlan.calico: Gained IPv6LL Jul 15 05:15:14.618384 systemd-networkd[1477]: calied6975798c0: Link UP Jul 15 05:15:14.619570 systemd-networkd[1477]: calied6975798c0: Gained carrier Jul 15 05:15:14.651592 containerd[1607]: 2025-07-15 05:15:14.477 [INFO][4208] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0 calico-apiserver-758794b6cf- calico-apiserver e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f 842 0 2025-07-15 05:14:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:758794b6cf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 calico-apiserver-758794b6cf-lx5t5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calied6975798c0 [] [] }} ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" 
WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-" Jul 15 05:15:14.651592 containerd[1607]: 2025-07-15 05:15:14.477 [INFO][4208] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" Jul 15 05:15:14.651592 containerd[1607]: 2025-07-15 05:15:14.553 [INFO][4256] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" HandleID="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.553 [INFO][4256] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" HandleID="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032d4f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396-0-0-n-85c8113064", "pod":"calico-apiserver-758794b6cf-lx5t5", "timestamp":"2025-07-15 05:15:14.552811658 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.553 [INFO][4256] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.553 [INFO][4256] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.553 [INFO][4256] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.566 [INFO][4256] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.574 [INFO][4256] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.582 [INFO][4256] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.585 [INFO][4256] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651798 containerd[1607]: 2025-07-15 05:15:14.587 [INFO][4256] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651954 containerd[1607]: 2025-07-15 05:15:14.587 [INFO][4256] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651954 containerd[1607]: 2025-07-15 05:15:14.588 [INFO][4256] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc Jul 15 05:15:14.651954 containerd[1607]: 2025-07-15 05:15:14.594 [INFO][4256] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651954 containerd[1607]: 2025-07-15 05:15:14.604 [INFO][4256] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.2/26] block=192.168.0.0/26 handle="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651954 containerd[1607]: 2025-07-15 05:15:14.604 [INFO][4256] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.2/26] handle="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.651954 containerd[1607]: 2025-07-15 05:15:14.604 [INFO][4256] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:15:14.651954 containerd[1607]: 2025-07-15 05:15:14.604 [INFO][4256] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.2/26] IPv6=[] ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" HandleID="k8s-pod-network.d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" Jul 15 05:15:14.652056 containerd[1607]: 2025-07-15 05:15:14.610 [INFO][4208] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0", GenerateName:"calico-apiserver-758794b6cf-", Namespace:"calico-apiserver", SelfLink:"", UID:"e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"758794b6cf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"calico-apiserver-758794b6cf-lx5t5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.0.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calied6975798c0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:14.652444 containerd[1607]: 2025-07-15 05:15:14.610 [INFO][4208] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.2/32] ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" Jul 15 05:15:14.652444 containerd[1607]: 2025-07-15 05:15:14.610 [INFO][4208] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calied6975798c0 ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" Jul 15 05:15:14.652444 containerd[1607]: 2025-07-15 05:15:14.620 [INFO][4208] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" Jul 15 05:15:14.652538 containerd[1607]: 2025-07-15 05:15:14.621 
[INFO][4208] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0", GenerateName:"calico-apiserver-758794b6cf-", Namespace:"calico-apiserver", SelfLink:"", UID:"e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"758794b6cf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc", Pod:"calico-apiserver-758794b6cf-lx5t5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.0.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calied6975798c0", MAC:"8a:4b:7f:5c:30:94", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:14.652584 containerd[1607]: 2025-07-15 05:15:14.635 [INFO][4208] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-lx5t5" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--lx5t5-eth0" Jul 15 05:15:14.710902 containerd[1607]: time="2025-07-15T05:15:14.710661058Z" level=info msg="connecting to shim d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc" address="unix:///run/containerd/s/b4b06111e4aba1c61aecf0ebd4d5a6da4cd42128a49ef4e1f65a7082f2ae8cf0" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:14.727028 systemd-networkd[1477]: caliacb7bb820be: Link UP Jul 15 05:15:14.727612 systemd-networkd[1477]: caliacb7bb820be: Gained carrier Jul 15 05:15:14.759751 containerd[1607]: 2025-07-15 05:15:14.478 [INFO][4191] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0 coredns-674b8bbfcf- kube-system c19bd3db-f118-488d-8572-089dbfcb231c 843 0 2025-07-15 05:14:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 coredns-674b8bbfcf-w4hdl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliacb7bb820be [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-" Jul 15 05:15:14.759751 containerd[1607]: 2025-07-15 05:15:14.479 [INFO][4191] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" Jul 15 05:15:14.759751 containerd[1607]: 2025-07-15 05:15:14.603 [INFO][4266] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" HandleID="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Workload="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.603 [INFO][4266] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" HandleID="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Workload="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00032a400), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396-0-0-n-85c8113064", "pod":"coredns-674b8bbfcf-w4hdl", "timestamp":"2025-07-15 05:15:14.603391363 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.603 [INFO][4266] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.604 [INFO][4266] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.604 [INFO][4266] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.667 [INFO][4266] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.678 [INFO][4266] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.688 [INFO][4266] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.691 [INFO][4266] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.759967 containerd[1607]: 2025-07-15 05:15:14.694 [INFO][4266] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.761747 containerd[1607]: 2025-07-15 05:15:14.695 [INFO][4266] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.761747 containerd[1607]: 2025-07-15 05:15:14.699 [INFO][4266] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579 Jul 15 05:15:14.761747 containerd[1607]: 2025-07-15 05:15:14.707 [INFO][4266] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.761747 containerd[1607]: 2025-07-15 05:15:14.713 [INFO][4266] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.3/26] block=192.168.0.0/26 handle="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.761747 containerd[1607]: 2025-07-15 05:15:14.713 [INFO][4266] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.3/26] handle="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.761747 containerd[1607]: 2025-07-15 05:15:14.714 [INFO][4266] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:15:14.761747 containerd[1607]: 2025-07-15 05:15:14.714 [INFO][4266] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.3/26] IPv6=[] ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" HandleID="k8s-pod-network.bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Workload="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" Jul 15 05:15:14.762539 containerd[1607]: 2025-07-15 05:15:14.720 [INFO][4191] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c19bd3db-f118-488d-8572-089dbfcb231c", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"coredns-674b8bbfcf-w4hdl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.0.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliacb7bb820be", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:14.762539 containerd[1607]: 2025-07-15 05:15:14.720 [INFO][4191] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.3/32] ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" Jul 15 05:15:14.762539 containerd[1607]: 2025-07-15 05:15:14.720 [INFO][4191] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliacb7bb820be ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" Jul 15 05:15:14.762539 containerd[1607]: 2025-07-15 05:15:14.723 [INFO][4191] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" Jul 15 05:15:14.762539 containerd[1607]: 2025-07-15 05:15:14.731 [INFO][4191] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"c19bd3db-f118-488d-8572-089dbfcb231c", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579", Pod:"coredns-674b8bbfcf-w4hdl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.0.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliacb7bb820be", MAC:"c6:b6:74:d1:ec:e8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:14.762539 containerd[1607]: 2025-07-15 05:15:14.753 [INFO][4191] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" Namespace="kube-system" Pod="coredns-674b8bbfcf-w4hdl" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--w4hdl-eth0" Jul 15 05:15:14.772848 systemd[1]: Started cri-containerd-d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc.scope - libcontainer container d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc. 
Jul 15 05:15:14.800189 containerd[1607]: time="2025-07-15T05:15:14.800147709Z" level=info msg="connecting to shim bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579" address="unix:///run/containerd/s/19504ba18a07772abe7d0bbcfb7d77ddc1a4dae5f9b3d86667178b8c687891e7" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:14.811486 containerd[1607]: time="2025-07-15T05:15:14.811444659Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:14.812936 containerd[1607]: time="2025-07-15T05:15:14.812746596Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 15 05:15:14.813801 containerd[1607]: time="2025-07-15T05:15:14.813777333Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:14.821332 containerd[1607]: time="2025-07-15T05:15:14.821111433Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:14.821782 containerd[1607]: time="2025-07-15T05:15:14.821733341Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 2.059682653s" Jul 15 05:15:14.821812 containerd[1607]: time="2025-07-15T05:15:14.821786851Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 15 05:15:14.843077 systemd[1]: Started cri-containerd-bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579.scope - libcontainer container bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579. 
Jul 15 05:15:14.858894 containerd[1607]: time="2025-07-15T05:15:14.857581356Z" level=info msg="CreateContainer within sandbox \"fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 15 05:15:14.863645 systemd-networkd[1477]: cali0b6a17265a9: Link UP Jul 15 05:15:14.864569 systemd-networkd[1477]: cali0b6a17265a9: Gained carrier Jul 15 05:15:14.871026 containerd[1607]: time="2025-07-15T05:15:14.870923691Z" level=info msg="Container 20fac5090676aae2995c9e7557315797403a76c17b9d1e117a114859c5a316e3: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.482 [INFO][4202] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0 goldmane-768f4c5c69- calico-system 317d09d9-a2c7-4c4b-9c92-5e5751ee42b1 841 0 2025-07-15 05:14:50 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 goldmane-768f4c5c69-pgtpk eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali0b6a17265a9 [] [] }} ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.482 [INFO][4202] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.609 [INFO][4271] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" HandleID="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Workload="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.609 [INFO][4271] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" HandleID="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Workload="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033ccb0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-85c8113064", "pod":"goldmane-768f4c5c69-pgtpk", "timestamp":"2025-07-15 05:15:14.609798216 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.609 [INFO][4271] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.714 [INFO][4271] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.714 [INFO][4271] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.769 [INFO][4271] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.779 [INFO][4271] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.787 [INFO][4271] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.790 [INFO][4271] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.793 [INFO][4271] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.793 [INFO][4271] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.795 [INFO][4271] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.801 [INFO][4271] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.808 [INFO][4271] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.4/26] block=192.168.0.0/26 handle="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.808 [INFO][4271] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.4/26] handle="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.809 [INFO][4271] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
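The IPAM records above trace the standard per-node flow: take the host-wide lock, confirm the block affinity for 192.168.0.0/26, then hand the pod the next free address in that block (192.168.0.4 for goldmane-768f4c5c69-pgtpk). Below is a much-simplified model of that last step, not Calico's actual allocator, and it assumes the three lower host addresses of the block are already taken by earlier pods on this node:

    import ipaddress

    # Simplified block-based assignment: lowest free host address in the node's affine block.
    block = ipaddress.ip_network("192.168.0.0/26")
    allocated = {ipaddress.ip_address(a) for a in
                 ("192.168.0.1", "192.168.0.2", "192.168.0.3")}  # assumed earlier assignments

    def assign_next(block, allocated):
        for host in block.hosts():              # .1 .. .62; network/broadcast excluded
            if host not in allocated:
                allocated.add(host)
                return host
        raise RuntimeError("block exhausted; a real allocator would claim another block")

    print(assign_next(block, allocated))        # -> 192.168.0.4, matching the record above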
Jul 15 05:15:14.887865 containerd[1607]: 2025-07-15 05:15:14.810 [INFO][4271] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.4/26] IPv6=[] ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" HandleID="k8s-pod-network.7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Workload="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" Jul 15 05:15:14.889069 containerd[1607]: 2025-07-15 05:15:14.836 [INFO][4202] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"317d09d9-a2c7-4c4b-9c92-5e5751ee42b1", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"goldmane-768f4c5c69-pgtpk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.0.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0b6a17265a9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:14.889069 containerd[1607]: 2025-07-15 05:15:14.838 [INFO][4202] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.4/32] ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" Jul 15 05:15:14.889069 containerd[1607]: 2025-07-15 05:15:14.838 [INFO][4202] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0b6a17265a9 ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" Jul 15 05:15:14.889069 containerd[1607]: 2025-07-15 05:15:14.864 [INFO][4202] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" Jul 15 05:15:14.889069 containerd[1607]: 2025-07-15 05:15:14.865 [INFO][4202] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" 
Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"317d09d9-a2c7-4c4b-9c92-5e5751ee42b1", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c", Pod:"goldmane-768f4c5c69-pgtpk", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.0.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali0b6a17265a9", MAC:"9e:63:72:31:bd:bb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:14.889069 containerd[1607]: 2025-07-15 05:15:14.882 [INFO][4202] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" Namespace="calico-system" Pod="goldmane-768f4c5c69-pgtpk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-goldmane--768f4c5c69--pgtpk-eth0" Jul 15 05:15:14.896434 containerd[1607]: time="2025-07-15T05:15:14.895198416Z" level=info msg="CreateContainer within sandbox \"fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"20fac5090676aae2995c9e7557315797403a76c17b9d1e117a114859c5a316e3\"" Jul 15 05:15:14.901365 containerd[1607]: time="2025-07-15T05:15:14.901342459Z" level=info msg="StartContainer for \"20fac5090676aae2995c9e7557315797403a76c17b9d1e117a114859c5a316e3\"" Jul 15 05:15:14.910094 containerd[1607]: time="2025-07-15T05:15:14.909868537Z" level=info msg="connecting to shim 20fac5090676aae2995c9e7557315797403a76c17b9d1e117a114859c5a316e3" address="unix:///run/containerd/s/b1b11c817460e5a32390e2fc4d71b2eb26a8d522aa63abf641b1e8fb66682b38" protocol=ttrpc version=3 Jul 15 05:15:14.919062 containerd[1607]: time="2025-07-15T05:15:14.918909713Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-lx5t5,Uid:e25c6bf2-5b68-4d6d-a4f4-3b091e386e1f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc\"" Jul 15 05:15:14.924951 containerd[1607]: time="2025-07-15T05:15:14.924918766Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:15:14.950804 systemd[1]: Started cri-containerd-20fac5090676aae2995c9e7557315797403a76c17b9d1e117a114859c5a316e3.scope - libcontainer container 20fac5090676aae2995c9e7557315797403a76c17b9d1e117a114859c5a316e3. 
Jul 15 05:15:14.965449 containerd[1607]: time="2025-07-15T05:15:14.965118699Z" level=info msg="connecting to shim 7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c" address="unix:///run/containerd/s/b9e471ac59f3c3ffef2e9335d8f0b43fe5ac55b081e5944474ed5a9c605c1718" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:14.989175 containerd[1607]: time="2025-07-15T05:15:14.988976236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-w4hdl,Uid:c19bd3db-f118-488d-8572-089dbfcb231c,Namespace:kube-system,Attempt:0,} returns sandbox id \"bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579\"" Jul 15 05:15:15.012182 containerd[1607]: time="2025-07-15T05:15:15.012144407Z" level=info msg="CreateContainer within sandbox \"bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:15:15.013586 systemd-networkd[1477]: calid410363f694: Link UP Jul 15 05:15:15.015654 systemd-networkd[1477]: calid410363f694: Gained carrier Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.519 [INFO][4229] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0 calico-kube-controllers-7bd8cf7f64- calico-system ef68457b-55d3-4298-9f51-b7d63fa11f06 844 0 2025-07-15 05:14:51 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7bd8cf7f64 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 calico-kube-controllers-7bd8cf7f64-cj4fk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid410363f694 [] [] }} ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.519 [INFO][4229] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.635 [INFO][4278] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" HandleID="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.636 [INFO][4278] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" HandleID="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123710), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-85c8113064", 
"pod":"calico-kube-controllers-7bd8cf7f64-cj4fk", "timestamp":"2025-07-15 05:15:14.629063835 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.636 [INFO][4278] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.809 [INFO][4278] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.810 [INFO][4278] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.874 [INFO][4278] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.905 [INFO][4278] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.917 [INFO][4278] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.921 [INFO][4278] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.923 [INFO][4278] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.924 [INFO][4278] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.927 [INFO][4278] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8 Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.935 [INFO][4278] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.951 [INFO][4278] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.5/26] block=192.168.0.0/26 handle="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.953 [INFO][4278] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.5/26] handle="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.956 [INFO][4278] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:15:15.037155 containerd[1607]: 2025-07-15 05:15:14.956 [INFO][4278] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.5/26] IPv6=[] ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" HandleID="k8s-pod-network.8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" Jul 15 05:15:15.037749 containerd[1607]: 2025-07-15 05:15:14.977 [INFO][4229] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0", GenerateName:"calico-kube-controllers-7bd8cf7f64-", Namespace:"calico-system", SelfLink:"", UID:"ef68457b-55d3-4298-9f51-b7d63fa11f06", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7bd8cf7f64", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"calico-kube-controllers-7bd8cf7f64-cj4fk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.0.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid410363f694", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:15.037749 containerd[1607]: 2025-07-15 05:15:14.979 [INFO][4229] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.5/32] ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" Jul 15 05:15:15.037749 containerd[1607]: 2025-07-15 05:15:14.981 [INFO][4229] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid410363f694 ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" Jul 15 05:15:15.037749 containerd[1607]: 2025-07-15 05:15:15.016 [INFO][4229] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" 
WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" Jul 15 05:15:15.037749 containerd[1607]: 2025-07-15 05:15:15.016 [INFO][4229] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0", GenerateName:"calico-kube-controllers-7bd8cf7f64-", Namespace:"calico-system", SelfLink:"", UID:"ef68457b-55d3-4298-9f51-b7d63fa11f06", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7bd8cf7f64", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8", Pod:"calico-kube-controllers-7bd8cf7f64-cj4fk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.0.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid410363f694", MAC:"c2:21:4e:85:f7:1f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:15.037749 containerd[1607]: 2025-07-15 05:15:15.034 [INFO][4229] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" Namespace="calico-system" Pod="calico-kube-controllers-7bd8cf7f64-cj4fk" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--kube--controllers--7bd8cf7f64--cj4fk-eth0" Jul 15 05:15:15.037997 systemd[1]: Started cri-containerd-7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c.scope - libcontainer container 7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c. 
Jul 15 05:15:15.044642 containerd[1607]: time="2025-07-15T05:15:15.044319287Z" level=info msg="Container b6a0ec27c205a8f340d45234e2bb2a738ce7528c097a451c3f386e11de77358e: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:15.048854 containerd[1607]: time="2025-07-15T05:15:15.048816956Z" level=info msg="CreateContainer within sandbox \"bef8e3626f5df9b3a6cf8abd36a93cfc7e13492062bb746ce599e16e3c07e579\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b6a0ec27c205a8f340d45234e2bb2a738ce7528c097a451c3f386e11de77358e\"" Jul 15 05:15:15.050828 containerd[1607]: time="2025-07-15T05:15:15.050804761Z" level=info msg="StartContainer for \"b6a0ec27c205a8f340d45234e2bb2a738ce7528c097a451c3f386e11de77358e\"" Jul 15 05:15:15.051416 containerd[1607]: time="2025-07-15T05:15:15.051394449Z" level=info msg="connecting to shim b6a0ec27c205a8f340d45234e2bb2a738ce7528c097a451c3f386e11de77358e" address="unix:///run/containerd/s/19504ba18a07772abe7d0bbcfb7d77ddc1a4dae5f9b3d86667178b8c687891e7" protocol=ttrpc version=3 Jul 15 05:15:15.081213 systemd-networkd[1477]: caliac57cb45fa4: Link UP Jul 15 05:15:15.081505 systemd-networkd[1477]: caliac57cb45fa4: Gained carrier Jul 15 05:15:15.082000 systemd[1]: Started cri-containerd-b6a0ec27c205a8f340d45234e2bb2a738ce7528c097a451c3f386e11de77358e.scope - libcontainer container b6a0ec27c205a8f340d45234e2bb2a738ce7528c097a451c3f386e11de77358e. Jul 15 05:15:15.106401 containerd[1607]: time="2025-07-15T05:15:15.106359153Z" level=info msg="connecting to shim 8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8" address="unix:///run/containerd/s/4bdfb40059d89399d37f46fffc43a5b39b278ff872b3da5321d378e15067c819" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.519 [INFO][4224] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0 csi-node-driver- calico-system 47d635b5-605e-4c77-ba36-e640d13113f8 738 0 2025-07-15 05:14:51 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 csi-node-driver-5fd7c eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliac57cb45fa4 [] [] }} ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.519 [INFO][4224] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.647 [INFO][4273] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" HandleID="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Workload="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.649 
[INFO][4273] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" HandleID="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Workload="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fb40), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-85c8113064", "pod":"csi-node-driver-5fd7c", "timestamp":"2025-07-15 05:15:14.642649559 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.649 [INFO][4273] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.955 [INFO][4273] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.957 [INFO][4273] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:14.982 [INFO][4273] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.010 [INFO][4273] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.027 [INFO][4273] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.036 [INFO][4273] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.043 [INFO][4273] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.043 [INFO][4273] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.045 [INFO][4273] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.053 [INFO][4273] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.059 [INFO][4273] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.6/26] block=192.168.0.0/26 handle="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.059 [INFO][4273] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.6/26] handle="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 
05:15:15.059 [INFO][4273] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:15:15.122270 containerd[1607]: 2025-07-15 05:15:15.059 [INFO][4273] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.6/26] IPv6=[] ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" HandleID="k8s-pod-network.277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Workload="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" Jul 15 05:15:15.123874 containerd[1607]: 2025-07-15 05:15:15.069 [INFO][4224] cni-plugin/k8s.go 418: Populated endpoint ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"47d635b5-605e-4c77-ba36-e640d13113f8", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"csi-node-driver-5fd7c", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.0.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliac57cb45fa4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:15.123874 containerd[1607]: 2025-07-15 05:15:15.070 [INFO][4224] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.6/32] ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" Jul 15 05:15:15.123874 containerd[1607]: 2025-07-15 05:15:15.070 [INFO][4224] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliac57cb45fa4 ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" Jul 15 05:15:15.123874 containerd[1607]: 2025-07-15 05:15:15.086 [INFO][4224] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" Jul 15 05:15:15.123874 containerd[1607]: 2025-07-15 05:15:15.087 [INFO][4224] cni-plugin/k8s.go 446: Added Mac, interface 
name, and active container ID to endpoint ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"47d635b5-605e-4c77-ba36-e640d13113f8", ResourceVersion:"738", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc", Pod:"csi-node-driver-5fd7c", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.0.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliac57cb45fa4", MAC:"1e:53:b3:b8:91:61", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:15.123874 containerd[1607]: 2025-07-15 05:15:15.110 [INFO][4224] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" Namespace="calico-system" Pod="csi-node-driver-5fd7c" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-csi--node--driver--5fd7c-eth0" Jul 15 05:15:15.149570 containerd[1607]: time="2025-07-15T05:15:15.148922049Z" level=info msg="StartContainer for \"b6a0ec27c205a8f340d45234e2bb2a738ce7528c097a451c3f386e11de77358e\" returns successfully" Jul 15 05:15:15.179296 systemd[1]: Started cri-containerd-8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8.scope - libcontainer container 8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8. 
Jul 15 05:15:15.190093 containerd[1607]: time="2025-07-15T05:15:15.190032927Z" level=info msg="connecting to shim 277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc" address="unix:///run/containerd/s/b57a3c28c71c6de168c5060ffe3144d7712b8357569b321d0e578e86a9060095" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:15.221234 containerd[1607]: time="2025-07-15T05:15:15.221112450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-pgtpk,Uid:317d09d9-a2c7-4c4b-9c92-5e5751ee42b1,Namespace:calico-system,Attempt:0,} returns sandbox id \"7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c\"" Jul 15 05:15:15.226755 containerd[1607]: time="2025-07-15T05:15:15.226623846Z" level=info msg="StartContainer for \"20fac5090676aae2995c9e7557315797403a76c17b9d1e117a114859c5a316e3\" returns successfully" Jul 15 05:15:15.252871 systemd[1]: Started cri-containerd-277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc.scope - libcontainer container 277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc. Jul 15 05:15:15.288456 containerd[1607]: time="2025-07-15T05:15:15.288406134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7bd8cf7f64-cj4fk,Uid:ef68457b-55d3-4298-9f51-b7d63fa11f06,Namespace:calico-system,Attempt:0,} returns sandbox id \"8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8\"" Jul 15 05:15:15.323860 containerd[1607]: time="2025-07-15T05:15:15.323795615Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5fd7c,Uid:47d635b5-605e-4c77-ba36-e640d13113f8,Namespace:calico-system,Attempt:0,} returns sandbox id \"277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc\"" Jul 15 05:15:15.720703 kubelet[2756]: I0715 05:15:15.713464 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-w4hdl" podStartSLOduration=36.713449262 podStartE2EDuration="36.713449262s" podCreationTimestamp="2025-07-15 05:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:15:15.712578214 +0000 UTC m=+43.449852415" watchObservedRunningTime="2025-07-15 05:15:15.713449262 +0000 UTC m=+43.450723473" Jul 15 05:15:16.110895 systemd-networkd[1477]: calied6975798c0: Gained IPv6LL Jul 15 05:15:16.303200 systemd-networkd[1477]: caliac57cb45fa4: Gained IPv6LL Jul 15 05:15:16.365662 containerd[1607]: time="2025-07-15T05:15:16.365442906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zf4mn,Uid:e9766883-0839-4e35-8d93-e80f3df5e08d,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:16.367877 containerd[1607]: time="2025-07-15T05:15:16.367246372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-hzkhv,Uid:83844cf0-3ac1-4b4a-a615-1a97a5ba273a,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:15:16.499147 systemd-networkd[1477]: caliafdcc698b06: Link UP Jul 15 05:15:16.499350 systemd-networkd[1477]: caliafdcc698b06: Gained carrier Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.425 [INFO][4649] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0 coredns-674b8bbfcf- kube-system e9766883-0839-4e35-8d93-e80f3df5e08d 836 0 2025-07-15 05:14:39 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 coredns-674b8bbfcf-zf4mn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliafdcc698b06 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.425 [INFO][4649] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.454 [INFO][4673] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" HandleID="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Workload="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.454 [INFO][4673] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" HandleID="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Workload="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5930), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396-0-0-n-85c8113064", "pod":"coredns-674b8bbfcf-zf4mn", "timestamp":"2025-07-15 05:15:16.454181372 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.454 [INFO][4673] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.454 [INFO][4673] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.454 [INFO][4673] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.461 [INFO][4673] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.466 [INFO][4673] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.472 [INFO][4673] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.474 [INFO][4673] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.476 [INFO][4673] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.476 [INFO][4673] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.477 [INFO][4673] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.482 [INFO][4673] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.490 [INFO][4673] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.7/26] block=192.168.0.0/26 handle="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.490 [INFO][4673] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.7/26] handle="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.490 [INFO][4673] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:15:16.518255 containerd[1607]: 2025-07-15 05:15:16.490 [INFO][4673] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.7/26] IPv6=[] ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" HandleID="k8s-pod-network.807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Workload="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" Jul 15 05:15:16.519255 containerd[1607]: 2025-07-15 05:15:16.493 [INFO][4649] cni-plugin/k8s.go 418: Populated endpoint ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e9766883-0839-4e35-8d93-e80f3df5e08d", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"coredns-674b8bbfcf-zf4mn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.0.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliafdcc698b06", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:16.519255 containerd[1607]: 2025-07-15 05:15:16.494 [INFO][4649] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.7/32] ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" Jul 15 05:15:16.519255 containerd[1607]: 2025-07-15 05:15:16.494 [INFO][4649] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliafdcc698b06 ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" Jul 15 05:15:16.519255 containerd[1607]: 2025-07-15 05:15:16.501 [INFO][4649] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" Jul 15 05:15:16.519255 containerd[1607]: 2025-07-15 05:15:16.503 [INFO][4649] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e9766883-0839-4e35-8d93-e80f3df5e08d", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa", Pod:"coredns-674b8bbfcf-zf4mn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.0.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliafdcc698b06", MAC:"f6:0e:1f:e3:91:74", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:16.519255 containerd[1607]: 2025-07-15 05:15:16.511 [INFO][4649] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" Namespace="kube-system" Pod="coredns-674b8bbfcf-zf4mn" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-coredns--674b8bbfcf--zf4mn-eth0" Jul 15 05:15:16.549727 containerd[1607]: time="2025-07-15T05:15:16.549586744Z" level=info msg="connecting to shim 807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa" address="unix:///run/containerd/s/b420c5487e807e71a60502ffdd216ef958b1d418546f5fc2a978dc8305c8df9a" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:16.559240 systemd-networkd[1477]: cali0b6a17265a9: Gained IPv6LL Jul 15 05:15:16.601847 systemd[1]: Started cri-containerd-807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa.scope - libcontainer container 807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa. 
Jul 15 05:15:16.633274 systemd-networkd[1477]: cali627a6fe4f28: Link UP Jul 15 05:15:16.637166 systemd-networkd[1477]: cali627a6fe4f28: Gained carrier Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.443 [INFO][4660] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0 calico-apiserver-758794b6cf- calico-apiserver 83844cf0-3ac1-4b4a-a615-1a97a5ba273a 846 0 2025-07-15 05:14:48 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:758794b6cf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-85c8113064 calico-apiserver-758794b6cf-hzkhv eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali627a6fe4f28 [] [] }} ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.444 [INFO][4660] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.478 [INFO][4679] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" HandleID="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.479 [INFO][4679] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" HandleID="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5020), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396-0-0-n-85c8113064", "pod":"calico-apiserver-758794b6cf-hzkhv", "timestamp":"2025-07-15 05:15:16.475312334 +0000 UTC"}, Hostname:"ci-4396-0-0-n-85c8113064", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.479 [INFO][4679] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.490 [INFO][4679] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.490 [INFO][4679] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-85c8113064' Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.562 [INFO][4679] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.571 [INFO][4679] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.578 [INFO][4679] ipam/ipam.go 511: Trying affinity for 192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.581 [INFO][4679] ipam/ipam.go 158: Attempting to load block cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.587 [INFO][4679] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.0.0/26 host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.587 [INFO][4679] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.0.0/26 handle="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.598 [INFO][4679] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.606 [INFO][4679] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.0.0/26 handle="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.619 [INFO][4679] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.0.8/26] block=192.168.0.0/26 handle="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.620 [INFO][4679] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.0.8/26] handle="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" host="ci-4396-0-0-n-85c8113064" Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.620 [INFO][4679] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:15:16.663478 containerd[1607]: 2025-07-15 05:15:16.620 [INFO][4679] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.0.8/26] IPv6=[] ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" HandleID="k8s-pod-network.f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Workload="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" Jul 15 05:15:16.665306 containerd[1607]: 2025-07-15 05:15:16.627 [INFO][4660] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0", GenerateName:"calico-apiserver-758794b6cf-", Namespace:"calico-apiserver", SelfLink:"", UID:"83844cf0-3ac1-4b4a-a615-1a97a5ba273a", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"758794b6cf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"", Pod:"calico-apiserver-758794b6cf-hzkhv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.0.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali627a6fe4f28", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:16.665306 containerd[1607]: 2025-07-15 05:15:16.627 [INFO][4660] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.0.8/32] ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" Jul 15 05:15:16.665306 containerd[1607]: 2025-07-15 05:15:16.628 [INFO][4660] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali627a6fe4f28 ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" Jul 15 05:15:16.665306 containerd[1607]: 2025-07-15 05:15:16.638 [INFO][4660] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" Jul 15 05:15:16.665306 containerd[1607]: 2025-07-15 05:15:16.639 
[INFO][4660] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0", GenerateName:"calico-apiserver-758794b6cf-", Namespace:"calico-apiserver", SelfLink:"", UID:"83844cf0-3ac1-4b4a-a615-1a97a5ba273a", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 14, 48, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"758794b6cf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-85c8113064", ContainerID:"f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f", Pod:"calico-apiserver-758794b6cf-hzkhv", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.0.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali627a6fe4f28", MAC:"96:6c:5e:97:4d:2e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:15:16.665306 containerd[1607]: 2025-07-15 05:15:16.656 [INFO][4660] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" Namespace="calico-apiserver" Pod="calico-apiserver-758794b6cf-hzkhv" WorkloadEndpoint="ci--4396--0--0--n--85c8113064-k8s-calico--apiserver--758794b6cf--hzkhv-eth0" Jul 15 05:15:16.687840 systemd-networkd[1477]: caliacb7bb820be: Gained IPv6LL Jul 15 05:15:16.692707 containerd[1607]: time="2025-07-15T05:15:16.691727258Z" level=info msg="connecting to shim f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f" address="unix:///run/containerd/s/89d3f7fc39539c9a8688214449b3539cd847ed97fadec1d9f38c36d292316865" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:16.708297 containerd[1607]: time="2025-07-15T05:15:16.708230020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zf4mn,Uid:e9766883-0839-4e35-8d93-e80f3df5e08d,Namespace:kube-system,Attempt:0,} returns sandbox id \"807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa\"" Jul 15 05:15:16.714206 containerd[1607]: time="2025-07-15T05:15:16.714088146Z" level=info msg="CreateContainer within sandbox \"807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:15:16.723199 containerd[1607]: time="2025-07-15T05:15:16.723157196Z" level=info msg="Container 8a4de023e224be66bb924860051817d57aaba05d374ea5c44bf9f36c116549d6: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:16.726899 
systemd[1]: Started cri-containerd-f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f.scope - libcontainer container f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f. Jul 15 05:15:16.736264 containerd[1607]: time="2025-07-15T05:15:16.736223186Z" level=info msg="CreateContainer within sandbox \"807af26e5ca9d8590687bba4eeeb4f8cc6463ed6617f72635866ba5599f229aa\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8a4de023e224be66bb924860051817d57aaba05d374ea5c44bf9f36c116549d6\"" Jul 15 05:15:16.738209 containerd[1607]: time="2025-07-15T05:15:16.738180601Z" level=info msg="StartContainer for \"8a4de023e224be66bb924860051817d57aaba05d374ea5c44bf9f36c116549d6\"" Jul 15 05:15:16.740236 containerd[1607]: time="2025-07-15T05:15:16.740204667Z" level=info msg="connecting to shim 8a4de023e224be66bb924860051817d57aaba05d374ea5c44bf9f36c116549d6" address="unix:///run/containerd/s/b420c5487e807e71a60502ffdd216ef958b1d418546f5fc2a978dc8305c8df9a" protocol=ttrpc version=3 Jul 15 05:15:16.783854 systemd[1]: Started cri-containerd-8a4de023e224be66bb924860051817d57aaba05d374ea5c44bf9f36c116549d6.scope - libcontainer container 8a4de023e224be66bb924860051817d57aaba05d374ea5c44bf9f36c116549d6. Jul 15 05:15:16.824530 containerd[1607]: time="2025-07-15T05:15:16.824460293Z" level=info msg="StartContainer for \"8a4de023e224be66bb924860051817d57aaba05d374ea5c44bf9f36c116549d6\" returns successfully" Jul 15 05:15:16.826627 containerd[1607]: time="2025-07-15T05:15:16.826148140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-758794b6cf-hzkhv,Uid:83844cf0-3ac1-4b4a-a615-1a97a5ba273a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f\"" Jul 15 05:15:16.942913 systemd-networkd[1477]: calid410363f694: Gained IPv6LL Jul 15 05:15:17.734872 kubelet[2756]: I0715 05:15:17.733196 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-zf4mn" podStartSLOduration=38.733177495 podStartE2EDuration="38.733177495s" podCreationTimestamp="2025-07-15 05:14:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:15:17.717808977 +0000 UTC m=+45.455083188" watchObservedRunningTime="2025-07-15 05:15:17.733177495 +0000 UTC m=+45.470451696" Jul 15 05:15:17.902942 systemd-networkd[1477]: caliafdcc698b06: Gained IPv6LL Jul 15 05:15:18.388473 containerd[1607]: time="2025-07-15T05:15:18.388420605Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:18.389556 containerd[1607]: time="2025-07-15T05:15:18.389527243Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Jul 15 05:15:18.390322 containerd[1607]: time="2025-07-15T05:15:18.390275272Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:18.391626 containerd[1607]: time="2025-07-15T05:15:18.391608559Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:18.392057 containerd[1607]: time="2025-07-15T05:15:18.391950478Z" level=info 
msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 3.467002342s" Jul 15 05:15:18.392057 containerd[1607]: time="2025-07-15T05:15:18.391972238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:15:18.393247 containerd[1607]: time="2025-07-15T05:15:18.393132156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 15 05:15:18.395246 containerd[1607]: time="2025-07-15T05:15:18.395217722Z" level=info msg="CreateContainer within sandbox \"d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:15:18.403332 containerd[1607]: time="2025-07-15T05:15:18.402799137Z" level=info msg="Container 86b18e319e220a1454cce8be952f25c29427bc2aa971939f85302ec423307ec7: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:18.416659 containerd[1607]: time="2025-07-15T05:15:18.416609159Z" level=info msg="CreateContainer within sandbox \"d3047a04addd5136adc13a737afc786dd57cd03eebb929624555c8c0873bb9dc\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"86b18e319e220a1454cce8be952f25c29427bc2aa971939f85302ec423307ec7\"" Jul 15 05:15:18.417527 containerd[1607]: time="2025-07-15T05:15:18.417501388Z" level=info msg="StartContainer for \"86b18e319e220a1454cce8be952f25c29427bc2aa971939f85302ec423307ec7\"" Jul 15 05:15:18.418619 containerd[1607]: time="2025-07-15T05:15:18.418587086Z" level=info msg="connecting to shim 86b18e319e220a1454cce8be952f25c29427bc2aa971939f85302ec423307ec7" address="unix:///run/containerd/s/b4b06111e4aba1c61aecf0ebd4d5a6da4cd42128a49ef4e1f65a7082f2ae8cf0" protocol=ttrpc version=3 Jul 15 05:15:18.454917 systemd[1]: Started cri-containerd-86b18e319e220a1454cce8be952f25c29427bc2aa971939f85302ec423307ec7.scope - libcontainer container 86b18e319e220a1454cce8be952f25c29427bc2aa971939f85302ec423307ec7. Jul 15 05:15:18.514020 containerd[1607]: time="2025-07-15T05:15:18.513832959Z" level=info msg="StartContainer for \"86b18e319e220a1454cce8be952f25c29427bc2aa971939f85302ec423307ec7\" returns successfully" Jul 15 05:15:18.606881 systemd-networkd[1477]: cali627a6fe4f28: Gained IPv6LL Jul 15 05:15:18.737794 kubelet[2756]: I0715 05:15:18.737497 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-758794b6cf-lx5t5" podStartSLOduration=27.26907837 podStartE2EDuration="30.737478009s" podCreationTimestamp="2025-07-15 05:14:48 +0000 UTC" firstStartedPulling="2025-07-15 05:15:14.924411898 +0000 UTC m=+42.661686109" lastFinishedPulling="2025-07-15 05:15:18.392811547 +0000 UTC m=+46.130085748" observedRunningTime="2025-07-15 05:15:18.735854452 +0000 UTC m=+46.473128673" watchObservedRunningTime="2025-07-15 05:15:18.737478009 +0000 UTC m=+46.474752220" Jul 15 05:15:19.714534 kubelet[2756]: I0715 05:15:19.714481 2756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:15:21.661915 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1388647759.mount: Deactivated successfully. 
Jul 15 05:15:22.094912 containerd[1607]: time="2025-07-15T05:15:22.094867022Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:22.097388 containerd[1607]: time="2025-07-15T05:15:22.097361108Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Jul 15 05:15:22.107142 containerd[1607]: time="2025-07-15T05:15:22.107097086Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:22.109138 containerd[1607]: time="2025-07-15T05:15:22.109097512Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:22.109622 containerd[1607]: time="2025-07-15T05:15:22.109472132Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.716319466s" Jul 15 05:15:22.109622 containerd[1607]: time="2025-07-15T05:15:22.109496332Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Jul 15 05:15:22.110685 containerd[1607]: time="2025-07-15T05:15:22.110587130Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 15 05:15:22.114927 containerd[1607]: time="2025-07-15T05:15:22.114693805Z" level=info msg="CreateContainer within sandbox \"7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 15 05:15:22.144790 containerd[1607]: time="2025-07-15T05:15:22.144753771Z" level=info msg="Container 60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:22.196103 containerd[1607]: time="2025-07-15T05:15:22.195835938Z" level=info msg="CreateContainer within sandbox \"7a4545f5897b473f3d04164f3ac7a8613c289f0e5b07b91b4bb5cd7ff136207c\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\"" Jul 15 05:15:22.198247 containerd[1607]: time="2025-07-15T05:15:22.197838026Z" level=info msg="StartContainer for \"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\"" Jul 15 05:15:22.199240 containerd[1607]: time="2025-07-15T05:15:22.199202644Z" level=info msg="connecting to shim 60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61" address="unix:///run/containerd/s/b9e471ac59f3c3ffef2e9335d8f0b43fe5ac55b081e5944474ed5a9c605c1718" protocol=ttrpc version=3 Jul 15 05:15:22.257875 systemd[1]: Started cri-containerd-60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61.scope - libcontainer container 60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61. 
Jul 15 05:15:22.329636 containerd[1607]: time="2025-07-15T05:15:22.329588298Z" level=info msg="StartContainer for \"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" returns successfully" Jul 15 05:15:23.386589 containerd[1607]: time="2025-07-15T05:15:23.386505110Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"851fabe2a437761309f6f9846af00fb6aae7518da1051ba265411275455a5076\" pid:4949 exit_status:1 exited_at:{seconds:1752556523 nanos:370078282}" Jul 15 05:15:23.490347 containerd[1607]: time="2025-07-15T05:15:23.490291485Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"de93ef9376e4f566a93ea2322677e6524bbf7fcad1fc222811d996cd01f7bd9e\" pid:4971 exit_status:1 exited_at:{seconds:1752556523 nanos:490048934}" Jul 15 05:15:23.853659 containerd[1607]: time="2025-07-15T05:15:23.853608629Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"7069560eb1e8642a15200ed464aca9ebf6bbc0f7789c20aa55bc10e23f6b7b2e\" pid:5000 exit_status:1 exited_at:{seconds:1752556523 nanos:853053110}" Jul 15 05:15:24.930231 containerd[1607]: time="2025-07-15T05:15:24.930109401Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"61e9dd73dac27bf3b80304fd6ad67711d3449fab677ec781e2e7ba9a0e70357e\" pid:5028 exit_status:1 exited_at:{seconds:1752556524 nanos:929878752}" Jul 15 05:15:25.028419 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1529409305.mount: Deactivated successfully. Jul 15 05:15:25.045971 containerd[1607]: time="2025-07-15T05:15:25.045930428Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:25.046729 containerd[1607]: time="2025-07-15T05:15:25.046564867Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 15 05:15:25.047406 containerd[1607]: time="2025-07-15T05:15:25.047386925Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:25.049287 containerd[1607]: time="2025-07-15T05:15:25.049263503Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:25.049817 containerd[1607]: time="2025-07-15T05:15:25.049735363Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.939131613s" Jul 15 05:15:25.049817 containerd[1607]: time="2025-07-15T05:15:25.049754323Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 15 05:15:25.066177 containerd[1607]: time="2025-07-15T05:15:25.065812955Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 15 05:15:25.084119 containerd[1607]: time="2025-07-15T05:15:25.083618545Z" level=info msg="CreateContainer within sandbox \"fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 15 05:15:25.108279 containerd[1607]: time="2025-07-15T05:15:25.108243078Z" level=info msg="Container 0a7beedfa2fd940a2ebb18ddde79f67f152e760694085aafd57d2291e211d18e: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:25.121824 containerd[1607]: time="2025-07-15T05:15:25.121797994Z" level=info msg="CreateContainer within sandbox \"fd2ecc0eebf561a0c43b50083c7ca48fe736a9685b44d6b1d23897e98be8b984\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"0a7beedfa2fd940a2ebb18ddde79f67f152e760694085aafd57d2291e211d18e\"" Jul 15 05:15:25.122693 containerd[1607]: time="2025-07-15T05:15:25.122285653Z" level=info msg="StartContainer for \"0a7beedfa2fd940a2ebb18ddde79f67f152e760694085aafd57d2291e211d18e\"" Jul 15 05:15:25.123562 containerd[1607]: time="2025-07-15T05:15:25.123275842Z" level=info msg="connecting to shim 0a7beedfa2fd940a2ebb18ddde79f67f152e760694085aafd57d2291e211d18e" address="unix:///run/containerd/s/b1b11c817460e5a32390e2fc4d71b2eb26a8d522aa63abf641b1e8fb66682b38" protocol=ttrpc version=3 Jul 15 05:15:25.144796 systemd[1]: Started cri-containerd-0a7beedfa2fd940a2ebb18ddde79f67f152e760694085aafd57d2291e211d18e.scope - libcontainer container 0a7beedfa2fd940a2ebb18ddde79f67f152e760694085aafd57d2291e211d18e. Jul 15 05:15:25.203562 containerd[1607]: time="2025-07-15T05:15:25.203319894Z" level=info msg="StartContainer for \"0a7beedfa2fd940a2ebb18ddde79f67f152e760694085aafd57d2291e211d18e\" returns successfully" Jul 15 05:15:25.800912 kubelet[2756]: I0715 05:15:25.791348 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-pgtpk" podStartSLOduration=28.891444867 podStartE2EDuration="35.778400153s" podCreationTimestamp="2025-07-15 05:14:50 +0000 UTC" firstStartedPulling="2025-07-15 05:15:15.223455744 +0000 UTC m=+42.960729945" lastFinishedPulling="2025-07-15 05:15:22.11041102 +0000 UTC m=+49.847685231" observedRunningTime="2025-07-15 05:15:22.765904708 +0000 UTC m=+50.503178919" watchObservedRunningTime="2025-07-15 05:15:25.778400153 +0000 UTC m=+53.515674394" Jul 15 05:15:25.800912 kubelet[2756]: I0715 05:15:25.800558 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-85bd7f758b-vxkst" podStartSLOduration=2.496949856 podStartE2EDuration="14.800535918s" podCreationTimestamp="2025-07-15 05:15:11 +0000 UTC" firstStartedPulling="2025-07-15 05:15:12.759148847 +0000 UTC m=+40.496423058" lastFinishedPulling="2025-07-15 05:15:25.062734909 +0000 UTC m=+52.800009120" observedRunningTime="2025-07-15 05:15:25.777922222 +0000 UTC m=+53.515196493" watchObservedRunningTime="2025-07-15 05:15:25.800535918 +0000 UTC m=+53.537810149" Jul 15 05:15:26.142407 containerd[1607]: time="2025-07-15T05:15:26.142278626Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"37d2195456f19bdbcefb7d9f857bb7d15fe978fd96bbe639dae8c657c0b13673\" pid:5088 exited_at:{seconds:1752556526 nanos:141799587}" Jul 15 05:15:28.441932 containerd[1607]: time="2025-07-15T05:15:28.441872862Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:28.442871 containerd[1607]: time="2025-07-15T05:15:28.442713411Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Jul 15 05:15:28.443728 containerd[1607]: time="2025-07-15T05:15:28.443695340Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:28.445694 containerd[1607]: time="2025-07-15T05:15:28.445223358Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:28.445694 containerd[1607]: time="2025-07-15T05:15:28.445605718Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 3.379758013s" Jul 15 05:15:28.445694 containerd[1607]: time="2025-07-15T05:15:28.445624598Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Jul 15 05:15:28.446714 containerd[1607]: time="2025-07-15T05:15:28.446702388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 15 05:15:28.513313 containerd[1607]: time="2025-07-15T05:15:28.513253603Z" level=info msg="CreateContainer within sandbox \"8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 15 05:15:28.524033 containerd[1607]: time="2025-07-15T05:15:28.523989353Z" level=info msg="Container 813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:28.529389 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1490195691.mount: Deactivated successfully. Jul 15 05:15:28.532650 containerd[1607]: time="2025-07-15T05:15:28.532615796Z" level=info msg="CreateContainer within sandbox \"8349c955e0687e0ff94d12046db25b5ae5c7ba66df6c599a242f6b9c2d27e1a8\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\"" Jul 15 05:15:28.533392 containerd[1607]: time="2025-07-15T05:15:28.533338995Z" level=info msg="StartContainer for \"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\"" Jul 15 05:15:28.535178 containerd[1607]: time="2025-07-15T05:15:28.535148624Z" level=info msg="connecting to shim 813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b" address="unix:///run/containerd/s/4bdfb40059d89399d37f46fffc43a5b39b278ff872b3da5321d378e15067c819" protocol=ttrpc version=3 Jul 15 05:15:28.577850 systemd[1]: Started cri-containerd-813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b.scope - libcontainer container 813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b. 
Jul 15 05:15:28.633859 containerd[1607]: time="2025-07-15T05:15:28.633822202Z" level=info msg="StartContainer for \"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" returns successfully" Jul 15 05:15:28.792305 kubelet[2756]: I0715 05:15:28.792182 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7bd8cf7f64-cj4fk" podStartSLOduration=24.636883648 podStartE2EDuration="37.79206604s" podCreationTimestamp="2025-07-15 05:14:51 +0000 UTC" firstStartedPulling="2025-07-15 05:15:15.291291266 +0000 UTC m=+43.028565467" lastFinishedPulling="2025-07-15 05:15:28.446473648 +0000 UTC m=+56.183747859" observedRunningTime="2025-07-15 05:15:28.791767491 +0000 UTC m=+56.529041692" watchObservedRunningTime="2025-07-15 05:15:28.79206604 +0000 UTC m=+56.529340251" Jul 15 05:15:28.833307 containerd[1607]: time="2025-07-15T05:15:28.833270036Z" level=info msg="TaskExit event in podsandbox handler container_id:\"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" id:\"eecf47629409f6714958bdf15edbbd68df266b36424c6b2c31446ffbd9ec7b81\" pid:5158 exited_at:{seconds:1752556528 nanos:825080062}" Jul 15 05:15:30.767169 containerd[1607]: time="2025-07-15T05:15:30.767121421Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:30.768162 containerd[1607]: time="2025-07-15T05:15:30.767990091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 15 05:15:30.768958 containerd[1607]: time="2025-07-15T05:15:30.768931350Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:30.770412 containerd[1607]: time="2025-07-15T05:15:30.770395939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:30.770773 containerd[1607]: time="2025-07-15T05:15:30.770745539Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 2.323966421s" Jul 15 05:15:30.770810 containerd[1607]: time="2025-07-15T05:15:30.770771839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 15 05:15:30.771957 containerd[1607]: time="2025-07-15T05:15:30.771863248Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:15:30.778725 containerd[1607]: time="2025-07-15T05:15:30.778683653Z" level=info msg="CreateContainer within sandbox \"277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 15 05:15:30.816279 containerd[1607]: time="2025-07-15T05:15:30.815276308Z" level=info msg="Container a7f0e363a1be8719be2de2b0b4adeeb739c71c990279d324dc5d4d83bdf9a152: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:30.831080 containerd[1607]: time="2025-07-15T05:15:30.831040038Z" level=info msg="CreateContainer within sandbox 
\"277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a7f0e363a1be8719be2de2b0b4adeeb739c71c990279d324dc5d4d83bdf9a152\"" Jul 15 05:15:30.832712 containerd[1607]: time="2025-07-15T05:15:30.832688877Z" level=info msg="StartContainer for \"a7f0e363a1be8719be2de2b0b4adeeb739c71c990279d324dc5d4d83bdf9a152\"" Jul 15 05:15:30.833871 containerd[1607]: time="2025-07-15T05:15:30.833837006Z" level=info msg="connecting to shim a7f0e363a1be8719be2de2b0b4adeeb739c71c990279d324dc5d4d83bdf9a152" address="unix:///run/containerd/s/b57a3c28c71c6de168c5060ffe3144d7712b8357569b321d0e578e86a9060095" protocol=ttrpc version=3 Jul 15 05:15:30.856781 systemd[1]: Started cri-containerd-a7f0e363a1be8719be2de2b0b4adeeb739c71c990279d324dc5d4d83bdf9a152.scope - libcontainer container a7f0e363a1be8719be2de2b0b4adeeb739c71c990279d324dc5d4d83bdf9a152. Jul 15 05:15:30.901340 containerd[1607]: time="2025-07-15T05:15:30.901219790Z" level=info msg="StartContainer for \"a7f0e363a1be8719be2de2b0b4adeeb739c71c990279d324dc5d4d83bdf9a152\" returns successfully" Jul 15 05:15:31.298971 containerd[1607]: time="2025-07-15T05:15:31.298873330Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:31.300134 containerd[1607]: time="2025-07-15T05:15:31.300078349Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 05:15:31.303180 containerd[1607]: time="2025-07-15T05:15:31.303123948Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 531.2079ms" Jul 15 05:15:31.303180 containerd[1607]: time="2025-07-15T05:15:31.303166788Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:15:31.305102 containerd[1607]: time="2025-07-15T05:15:31.304967176Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 15 05:15:31.310741 containerd[1607]: time="2025-07-15T05:15:31.310619333Z" level=info msg="CreateContainer within sandbox \"f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:15:31.321729 containerd[1607]: time="2025-07-15T05:15:31.320801156Z" level=info msg="Container fb91887df1698010fe0b70edc404d79aa6bacbe9643faae5076e459cc9f5e759: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:31.324636 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4074514783.mount: Deactivated successfully. 
Jul 15 05:15:31.337347 containerd[1607]: time="2025-07-15T05:15:31.337272267Z" level=info msg="CreateContainer within sandbox \"f27e3034fd7c70a804dc7150dfaeb871258c4ca365a96182656008fc5b5dc23f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fb91887df1698010fe0b70edc404d79aa6bacbe9643faae5076e459cc9f5e759\"" Jul 15 05:15:31.338690 containerd[1607]: time="2025-07-15T05:15:31.338097706Z" level=info msg="StartContainer for \"fb91887df1698010fe0b70edc404d79aa6bacbe9643faae5076e459cc9f5e759\"" Jul 15 05:15:31.338998 containerd[1607]: time="2025-07-15T05:15:31.338982595Z" level=info msg="connecting to shim fb91887df1698010fe0b70edc404d79aa6bacbe9643faae5076e459cc9f5e759" address="unix:///run/containerd/s/89d3f7fc39539c9a8688214449b3539cd847ed97fadec1d9f38c36d292316865" protocol=ttrpc version=3 Jul 15 05:15:31.362355 systemd[1]: Started cri-containerd-fb91887df1698010fe0b70edc404d79aa6bacbe9643faae5076e459cc9f5e759.scope - libcontainer container fb91887df1698010fe0b70edc404d79aa6bacbe9643faae5076e459cc9f5e759. Jul 15 05:15:31.411964 containerd[1607]: time="2025-07-15T05:15:31.411890261Z" level=info msg="StartContainer for \"fb91887df1698010fe0b70edc404d79aa6bacbe9643faae5076e459cc9f5e759\" returns successfully" Jul 15 05:15:31.859549 kubelet[2756]: I0715 05:15:31.858739 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-758794b6cf-hzkhv" podStartSLOduration=29.382080027 podStartE2EDuration="43.858718537s" podCreationTimestamp="2025-07-15 05:14:48 +0000 UTC" firstStartedPulling="2025-07-15 05:15:16.827844436 +0000 UTC m=+44.565118637" lastFinishedPulling="2025-07-15 05:15:31.304482906 +0000 UTC m=+59.041757147" observedRunningTime="2025-07-15 05:15:31.842367617 +0000 UTC m=+59.579641828" watchObservedRunningTime="2025-07-15 05:15:31.858718537 +0000 UTC m=+59.595992738" Jul 15 05:15:32.821559 kubelet[2756]: I0715 05:15:32.821472 2756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:15:33.603900 containerd[1607]: time="2025-07-15T05:15:33.603847488Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:33.605134 containerd[1607]: time="2025-07-15T05:15:33.604958097Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 15 05:15:33.606120 containerd[1607]: time="2025-07-15T05:15:33.606088617Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:33.608048 containerd[1607]: time="2025-07-15T05:15:33.608017766Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:33.608528 containerd[1607]: time="2025-07-15T05:15:33.608498926Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.30346484s" Jul 15 05:15:33.608735 containerd[1607]: 
time="2025-07-15T05:15:33.608635896Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Jul 15 05:15:33.646844 containerd[1607]: time="2025-07-15T05:15:33.646791317Z" level=info msg="CreateContainer within sandbox \"277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 15 05:15:33.662939 containerd[1607]: time="2025-07-15T05:15:33.658124421Z" level=info msg="Container 18b90f7288d17c9ba5f1a97aa4289bc242c00f21d76684ad8b8f8c50c44c7251: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:33.663493 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1498850176.mount: Deactivated successfully. Jul 15 05:15:33.676435 containerd[1607]: time="2025-07-15T05:15:33.676401813Z" level=info msg="CreateContainer within sandbox \"277cd1b9595a42050a2d9c9ad0129bab020494490549eca557b0b12a88ee1ddc\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"18b90f7288d17c9ba5f1a97aa4289bc242c00f21d76684ad8b8f8c50c44c7251\"" Jul 15 05:15:33.684280 containerd[1607]: time="2025-07-15T05:15:33.684210939Z" level=info msg="StartContainer for \"18b90f7288d17c9ba5f1a97aa4289bc242c00f21d76684ad8b8f8c50c44c7251\"" Jul 15 05:15:33.690846 containerd[1607]: time="2025-07-15T05:15:33.690824705Z" level=info msg="connecting to shim 18b90f7288d17c9ba5f1a97aa4289bc242c00f21d76684ad8b8f8c50c44c7251" address="unix:///run/containerd/s/b57a3c28c71c6de168c5060ffe3144d7712b8357569b321d0e578e86a9060095" protocol=ttrpc version=3 Jul 15 05:15:33.714835 systemd[1]: Started cri-containerd-18b90f7288d17c9ba5f1a97aa4289bc242c00f21d76684ad8b8f8c50c44c7251.scope - libcontainer container 18b90f7288d17c9ba5f1a97aa4289bc242c00f21d76684ad8b8f8c50c44c7251. 
Jul 15 05:15:33.765270 containerd[1607]: time="2025-07-15T05:15:33.764489240Z" level=info msg="StartContainer for \"18b90f7288d17c9ba5f1a97aa4289bc242c00f21d76684ad8b8f8c50c44c7251\" returns successfully" Jul 15 05:15:33.852638 kubelet[2756]: I0715 05:15:33.852444 2756 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5fd7c" podStartSLOduration=24.566336204 podStartE2EDuration="42.850032308s" podCreationTimestamp="2025-07-15 05:14:51 +0000 UTC" firstStartedPulling="2025-07-15 05:15:15.325658881 +0000 UTC m=+43.062933092" lastFinishedPulling="2025-07-15 05:15:33.609354995 +0000 UTC m=+61.346629196" observedRunningTime="2025-07-15 05:15:33.844260391 +0000 UTC m=+61.581534592" watchObservedRunningTime="2025-07-15 05:15:33.850032308 +0000 UTC m=+61.587306509" Jul 15 05:15:34.626543 kubelet[2756]: I0715 05:15:34.626454 2756 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 15 05:15:34.626759 kubelet[2756]: I0715 05:15:34.626568 2756 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 15 05:15:37.602387 kubelet[2756]: I0715 05:15:37.601769 2756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:15:38.596224 containerd[1607]: time="2025-07-15T05:15:38.595977984Z" level=info msg="TaskExit event in podsandbox handler container_id:\"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" id:\"96f4a86255e2b3bbea031c63ea838687d9927ba787bf24d1145a3a97fd879b45\" pid:5294 exited_at:{seconds:1752556538 nanos:594629624}" Jul 15 05:15:42.265006 kubelet[2756]: I0715 05:15:42.264921 2756 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:15:42.757552 containerd[1607]: time="2025-07-15T05:15:42.757507779Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\" id:\"6efcf9c8d989db49cce21120091b2bd29f487f7a8d34c9c4031029fcc193d351\" pid:5321 exited_at:{seconds:1752556542 nanos:757293579}" Jul 15 05:15:54.992897 containerd[1607]: time="2025-07-15T05:15:54.992858528Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"17dfeaea3f928743ee1b3433a1e718e3e8d24f3fd0aa404c7f8930c6c2e139ae\" pid:5361 exited_at:{seconds:1752556554 nanos:992419280}" Jul 15 05:15:58.849039 containerd[1607]: time="2025-07-15T05:15:58.848990250Z" level=info msg="TaskExit event in podsandbox handler container_id:\"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" id:\"ca952e4e3c9d8c865c1792dd357b8118272e51e2e525329cb61ee59c54e42abb\" pid:5390 exited_at:{seconds:1752556558 nanos:848838324}" Jul 15 05:16:12.706853 containerd[1607]: time="2025-07-15T05:16:12.706779191Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\" id:\"28143f6a398db7535d81db261cacb4a9229e9fa9cdceb715d19ba4f67625be57\" pid:5414 exited_at:{seconds:1752556572 nanos:706515096}" Jul 15 05:16:24.980117 containerd[1607]: time="2025-07-15T05:16:24.980081327Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"4b17ba8ccdc84d5052f2ac948afdc2bd22fefcac26797f170cbec7f0b8182733\" pid:5438 
exited_at:{seconds:1752556584 nanos:979507303}" Jul 15 05:16:26.200431 containerd[1607]: time="2025-07-15T05:16:26.200267770Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"ce7855410b9f12c42a8c2f0ba6ac5cffba82e8cf3977a73a1fc0243af6bff2d7\" pid:5462 exited_at:{seconds:1752556586 nanos:199941364}" Jul 15 05:16:27.750331 systemd[1]: Started sshd@7-95.217.135.169:22-139.178.89.65:43928.service - OpenSSH per-connection server daemon (139.178.89.65:43928). Jul 15 05:16:28.796834 sshd[5479]: Accepted publickey for core from 139.178.89.65 port 43928 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:28.800022 sshd-session[5479]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:28.808462 systemd-logind[1575]: New session 8 of user core. Jul 15 05:16:28.812708 systemd[1]: Started session-8.scope - Session 8 of User core. Jul 15 05:16:28.854303 containerd[1607]: time="2025-07-15T05:16:28.840361469Z" level=info msg="TaskExit event in podsandbox handler container_id:\"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" id:\"4a4665c02dfdf4898a09bc97782fc4c37983a5dd41ece0d7995c5a090787636a\" pid:5493 exited_at:{seconds:1752556588 nanos:839878624}" Jul 15 05:16:29.892747 sshd[5499]: Connection closed by 139.178.89.65 port 43928 Jul 15 05:16:29.894119 sshd-session[5479]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:29.906256 systemd-logind[1575]: Session 8 logged out. Waiting for processes to exit. Jul 15 05:16:29.907515 systemd[1]: sshd@7-95.217.135.169:22-139.178.89.65:43928.service: Deactivated successfully. Jul 15 05:16:29.912449 systemd[1]: session-8.scope: Deactivated successfully. Jul 15 05:16:29.916448 systemd-logind[1575]: Removed session 8. Jul 15 05:16:35.067340 systemd[1]: Started sshd@8-95.217.135.169:22-139.178.89.65:33574.service - OpenSSH per-connection server daemon (139.178.89.65:33574). Jul 15 05:16:36.088575 sshd[5524]: Accepted publickey for core from 139.178.89.65 port 33574 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:36.091651 sshd-session[5524]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:36.101769 systemd-logind[1575]: New session 9 of user core. Jul 15 05:16:36.109945 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 15 05:16:36.947732 sshd[5527]: Connection closed by 139.178.89.65 port 33574 Jul 15 05:16:36.949304 sshd-session[5524]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:36.956630 systemd[1]: sshd@8-95.217.135.169:22-139.178.89.65:33574.service: Deactivated successfully. Jul 15 05:16:36.961085 systemd[1]: session-9.scope: Deactivated successfully. Jul 15 05:16:36.964504 systemd-logind[1575]: Session 9 logged out. Waiting for processes to exit. Jul 15 05:16:36.967796 systemd-logind[1575]: Removed session 9. Jul 15 05:16:38.603861 containerd[1607]: time="2025-07-15T05:16:38.603805231Z" level=info msg="TaskExit event in podsandbox handler container_id:\"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" id:\"fed3749634d56f6033ecaa29cb5d1e16eec120e861658ada8471c2951b4e0ffc\" pid:5553 exited_at:{seconds:1752556598 nanos:603473334}" Jul 15 05:16:42.122915 systemd[1]: Started sshd@9-95.217.135.169:22-139.178.89.65:33658.service - OpenSSH per-connection server daemon (139.178.89.65:33658). 
Jul 15 05:16:42.725386 containerd[1607]: time="2025-07-15T05:16:42.725284315Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\" id:\"2378e03aa2f4ab29d7b58d19344060cc5cd6ece1f9569385d1b3602afcffc939\" pid:5581 exited_at:{seconds:1752556602 nanos:724627880}" Jul 15 05:16:43.112922 sshd[5566]: Accepted publickey for core from 139.178.89.65 port 33658 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:43.114875 sshd-session[5566]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:43.119066 systemd-logind[1575]: New session 10 of user core. Jul 15 05:16:43.122787 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 15 05:16:43.857912 sshd[5597]: Connection closed by 139.178.89.65 port 33658 Jul 15 05:16:43.861208 sshd-session[5566]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:43.871229 systemd[1]: sshd@9-95.217.135.169:22-139.178.89.65:33658.service: Deactivated successfully. Jul 15 05:16:43.876166 systemd[1]: session-10.scope: Deactivated successfully. Jul 15 05:16:43.881345 systemd-logind[1575]: Session 10 logged out. Waiting for processes to exit. Jul 15 05:16:43.883912 systemd-logind[1575]: Removed session 10. Jul 15 05:16:44.035140 systemd[1]: Started sshd@10-95.217.135.169:22-139.178.89.65:33666.service - OpenSSH per-connection server daemon (139.178.89.65:33666). Jul 15 05:16:45.043479 sshd[5610]: Accepted publickey for core from 139.178.89.65 port 33666 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:45.046854 sshd-session[5610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:45.055153 systemd-logind[1575]: New session 11 of user core. Jul 15 05:16:45.063980 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 15 05:16:45.854756 sshd[5613]: Connection closed by 139.178.89.65 port 33666 Jul 15 05:16:45.856406 sshd-session[5610]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:45.867075 systemd[1]: sshd@10-95.217.135.169:22-139.178.89.65:33666.service: Deactivated successfully. Jul 15 05:16:45.867096 systemd-logind[1575]: Session 11 logged out. Waiting for processes to exit. Jul 15 05:16:45.869382 systemd[1]: session-11.scope: Deactivated successfully. Jul 15 05:16:45.871577 systemd-logind[1575]: Removed session 11. Jul 15 05:16:46.026762 systemd[1]: Started sshd@11-95.217.135.169:22-139.178.89.65:33680.service - OpenSSH per-connection server daemon (139.178.89.65:33680). Jul 15 05:16:47.038727 sshd[5623]: Accepted publickey for core from 139.178.89.65 port 33680 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:47.041653 sshd-session[5623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:47.050770 systemd-logind[1575]: New session 12 of user core. Jul 15 05:16:47.056869 systemd[1]: Started session-12.scope - Session 12 of User core. Jul 15 05:16:47.787832 sshd[5626]: Connection closed by 139.178.89.65 port 33680 Jul 15 05:16:47.789016 sshd-session[5623]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:47.794511 systemd[1]: sshd@11-95.217.135.169:22-139.178.89.65:33680.service: Deactivated successfully. Jul 15 05:16:47.797542 systemd[1]: session-12.scope: Deactivated successfully. Jul 15 05:16:47.799506 systemd-logind[1575]: Session 12 logged out. Waiting for processes to exit. 
Jul 15 05:16:47.801561 systemd-logind[1575]: Removed session 12. Jul 15 05:16:52.964069 systemd[1]: Started sshd@12-95.217.135.169:22-139.178.89.65:41640.service - OpenSSH per-connection server daemon (139.178.89.65:41640). Jul 15 05:16:53.948534 sshd[5659]: Accepted publickey for core from 139.178.89.65 port 41640 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:53.948939 sshd-session[5659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:53.953342 systemd-logind[1575]: New session 13 of user core. Jul 15 05:16:53.960837 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 15 05:16:54.678416 sshd[5662]: Connection closed by 139.178.89.65 port 41640 Jul 15 05:16:54.681513 sshd-session[5659]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:54.694072 systemd[1]: sshd@12-95.217.135.169:22-139.178.89.65:41640.service: Deactivated successfully. Jul 15 05:16:54.698819 systemd[1]: session-13.scope: Deactivated successfully. Jul 15 05:16:54.701975 systemd-logind[1575]: Session 13 logged out. Waiting for processes to exit. Jul 15 05:16:54.705749 systemd-logind[1575]: Removed session 13. Jul 15 05:16:54.846900 systemd[1]: Started sshd@13-95.217.135.169:22-139.178.89.65:41652.service - OpenSSH per-connection server daemon (139.178.89.65:41652). Jul 15 05:16:54.904321 containerd[1607]: time="2025-07-15T05:16:54.904287371Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"30b0fa2ac064a88a41f303ffb767579281ddc54c22718936d6f2fccab2c3100d\" pid:5684 exited_at:{seconds:1752556614 nanos:903880654}" Jul 15 05:16:55.868588 sshd[5694]: Accepted publickey for core from 139.178.89.65 port 41652 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:55.874373 sshd-session[5694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:55.880255 systemd-logind[1575]: New session 14 of user core. Jul 15 05:16:55.887917 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 15 05:16:56.838949 sshd[5699]: Connection closed by 139.178.89.65 port 41652 Jul 15 05:16:56.841350 sshd-session[5694]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:56.849228 systemd[1]: sshd@13-95.217.135.169:22-139.178.89.65:41652.service: Deactivated successfully. Jul 15 05:16:56.850770 systemd[1]: session-14.scope: Deactivated successfully. Jul 15 05:16:56.852160 systemd-logind[1575]: Session 14 logged out. Waiting for processes to exit. Jul 15 05:16:56.853878 systemd-logind[1575]: Removed session 14. Jul 15 05:16:57.005887 systemd[1]: Started sshd@14-95.217.135.169:22-139.178.89.65:41662.service - OpenSSH per-connection server daemon (139.178.89.65:41662). Jul 15 05:16:58.011918 sshd[5709]: Accepted publickey for core from 139.178.89.65 port 41662 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:16:58.013616 sshd-session[5709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:16:58.018771 systemd-logind[1575]: New session 15 of user core. Jul 15 05:16:58.027857 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jul 15 05:16:58.831169 containerd[1607]: time="2025-07-15T05:16:58.831126237Z" level=info msg="TaskExit event in podsandbox handler container_id:\"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" id:\"b8f03230765426726568205c5a65f6f0c6d2510b80a03d8932da59ec0bb3a533\" pid:5730 exited_at:{seconds:1752556618 nanos:830849358}" Jul 15 05:16:59.713570 sshd[5712]: Connection closed by 139.178.89.65 port 41662 Jul 15 05:16:59.717016 sshd-session[5709]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:59.724993 systemd[1]: sshd@14-95.217.135.169:22-139.178.89.65:41662.service: Deactivated successfully. Jul 15 05:16:59.729614 systemd[1]: session-15.scope: Deactivated successfully. Jul 15 05:16:59.734081 systemd-logind[1575]: Session 15 logged out. Waiting for processes to exit. Jul 15 05:16:59.737796 systemd-logind[1575]: Removed session 15. Jul 15 05:16:59.883059 systemd[1]: Started sshd@15-95.217.135.169:22-139.178.89.65:55870.service - OpenSSH per-connection server daemon (139.178.89.65:55870). Jul 15 05:17:00.892026 sshd[5752]: Accepted publickey for core from 139.178.89.65 port 55870 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:17:00.893808 sshd-session[5752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:17:00.899323 systemd-logind[1575]: New session 16 of user core. Jul 15 05:17:00.904933 systemd[1]: Started session-16.scope - Session 16 of User core. Jul 15 05:17:02.073730 sshd[5755]: Connection closed by 139.178.89.65 port 55870 Jul 15 05:17:02.079560 sshd-session[5752]: pam_unix(sshd:session): session closed for user core Jul 15 05:17:02.088099 systemd-logind[1575]: Session 16 logged out. Waiting for processes to exit. Jul 15 05:17:02.091152 systemd[1]: sshd@15-95.217.135.169:22-139.178.89.65:55870.service: Deactivated successfully. Jul 15 05:17:02.096523 systemd[1]: session-16.scope: Deactivated successfully. Jul 15 05:17:02.100293 systemd-logind[1575]: Removed session 16. Jul 15 05:17:02.247488 systemd[1]: Started sshd@16-95.217.135.169:22-139.178.89.65:55886.service - OpenSSH per-connection server daemon (139.178.89.65:55886). Jul 15 05:17:03.265594 sshd[5765]: Accepted publickey for core from 139.178.89.65 port 55886 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:17:03.269339 sshd-session[5765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:17:03.280719 systemd-logind[1575]: New session 17 of user core. Jul 15 05:17:03.283805 systemd[1]: Started session-17.scope - Session 17 of User core. Jul 15 05:17:04.067012 sshd[5768]: Connection closed by 139.178.89.65 port 55886 Jul 15 05:17:04.068071 sshd-session[5765]: pam_unix(sshd:session): session closed for user core Jul 15 05:17:04.078960 systemd[1]: sshd@16-95.217.135.169:22-139.178.89.65:55886.service: Deactivated successfully. Jul 15 05:17:04.083196 systemd[1]: session-17.scope: Deactivated successfully. Jul 15 05:17:04.085395 systemd-logind[1575]: Session 17 logged out. Waiting for processes to exit. Jul 15 05:17:04.088166 systemd-logind[1575]: Removed session 17. Jul 15 05:17:09.245808 systemd[1]: Started sshd@17-95.217.135.169:22-139.178.89.65:48236.service - OpenSSH per-connection server daemon (139.178.89.65:48236). 
Jul 15 05:17:10.264624 sshd[5783]: Accepted publickey for core from 139.178.89.65 port 48236 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:10.268626 sshd-session[5783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:10.276590 systemd-logind[1575]: New session 18 of user core.
Jul 15 05:17:10.285918 systemd[1]: Started session-18.scope - Session 18 of User core.
Jul 15 05:17:11.102704 sshd[5788]: Connection closed by 139.178.89.65 port 48236
Jul 15 05:17:11.103814 sshd-session[5783]: pam_unix(sshd:session): session closed for user core
Jul 15 05:17:11.110992 systemd[1]: sshd@17-95.217.135.169:22-139.178.89.65:48236.service: Deactivated successfully.
Jul 15 05:17:11.115832 systemd[1]: session-18.scope: Deactivated successfully.
Jul 15 05:17:11.117942 systemd-logind[1575]: Session 18 logged out. Waiting for processes to exit.
Jul 15 05:17:11.121458 systemd-logind[1575]: Removed session 18.
Jul 15 05:17:12.824181 containerd[1607]: time="2025-07-15T05:17:12.824129014Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fc1cdfe89949a1e839dee5fc12c0cb70222b4dd19173cc745a2161b0ff65398f\" id:\"c0e77bf717e3100298515f708c59f1c60692e4b4ed9bec1f6ba4745d89ea6347\" pid:5812 exited_at:{seconds:1752556632 nanos:796907069}"
Jul 15 05:17:24.935854 containerd[1607]: time="2025-07-15T05:17:24.935728036Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"eb7ef83b61382bd3cb291d56ea452b04552e0efa75c9270d1b8f607abba6f850\" pid:5836 exited_at:{seconds:1752556644 nanos:935397676}"
Jul 15 05:17:25.822610 systemd[1]: cri-containerd-86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d.scope: Deactivated successfully.
Jul 15 05:17:25.823916 systemd[1]: cri-containerd-86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d.scope: Consumed 3.210s CPU time, 89.2M memory peak, 101.4M read from disk.
Jul 15 05:17:25.929296 containerd[1607]: time="2025-07-15T05:17:25.929240300Z" level=info msg="received exit event container_id:\"86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d\" id:\"86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d\" pid:2580 exit_status:1 exited_at:{seconds:1752556645 nanos:897369656}"
Jul 15 05:17:25.934932 containerd[1607]: time="2025-07-15T05:17:25.934896747Z" level=info msg="TaskExit event in podsandbox handler container_id:\"86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d\" id:\"86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d\" pid:2580 exit_status:1 exited_at:{seconds:1752556645 nanos:897369656}"
Jul 15 05:17:25.997739 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d-rootfs.mount: Deactivated successfully.
Jul 15 05:17:26.166588 containerd[1607]: time="2025-07-15T05:17:26.166332787Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60383ec3aed3214b9c452816d60b0c17e162eb3c660c2ce781d4d55390e29a61\" id:\"059faebbe4ec21be62714cfd58663392477334942fcc96e70e723a6c908d8357\" pid:5872 exited_at:{seconds:1752556646 nanos:166084698}"
Jul 15 05:17:26.250532 systemd[1]: cri-containerd-a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50.scope: Deactivated successfully.
Jul 15 05:17:26.252184 systemd[1]: cri-containerd-a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50.scope: Consumed 11.853s CPU time, 117.5M memory peak, 64.8M read from disk.
Jul 15 05:17:26.254476 containerd[1607]: time="2025-07-15T05:17:26.254240947Z" level=info msg="received exit event container_id:\"a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50\" id:\"a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50\" pid:3098 exit_status:1 exited_at:{seconds:1752556646 nanos:252631073}"
Jul 15 05:17:26.256415 containerd[1607]: time="2025-07-15T05:17:26.256386468Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50\" id:\"a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50\" pid:3098 exit_status:1 exited_at:{seconds:1752556646 nanos:252631073}"
Jul 15 05:17:26.301917 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50-rootfs.mount: Deactivated successfully.
Jul 15 05:17:26.338361 kubelet[2756]: E0715 05:17:26.338321 2756 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:50264->10.0.0.2:2379: read: connection timed out"
Jul 15 05:17:26.344386 kubelet[2756]: I0715 05:17:26.340537 2756 scope.go:117] "RemoveContainer" containerID="86e51cc5464e16b0992a794aa133dd3f1dfd87aaa26a04e1c9102e248a52cf1d"
Jul 15 05:17:26.404768 containerd[1607]: time="2025-07-15T05:17:26.404720351Z" level=info msg="CreateContainer within sandbox \"066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jul 15 05:17:26.483250 containerd[1607]: time="2025-07-15T05:17:26.482870702Z" level=info msg="Container 8687fc91b4b1a87e280216b1f24fbe07d796bfd218c759f10e5407a4d37f758f: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:17:26.484020 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3854351893.mount: Deactivated successfully.
Jul 15 05:17:26.488515 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount418411211.mount: Deactivated successfully.
Jul 15 05:17:26.499718 containerd[1607]: time="2025-07-15T05:17:26.499657632Z" level=info msg="CreateContainer within sandbox \"066737f2ab977322f9e722af92daa44fba6ba8ed96b3ebca1053a3fa63d0561c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"8687fc91b4b1a87e280216b1f24fbe07d796bfd218c759f10e5407a4d37f758f\""
Jul 15 05:17:26.507343 containerd[1607]: time="2025-07-15T05:17:26.507308769Z" level=info msg="StartContainer for \"8687fc91b4b1a87e280216b1f24fbe07d796bfd218c759f10e5407a4d37f758f\""
Jul 15 05:17:26.509169 containerd[1607]: time="2025-07-15T05:17:26.509144052Z" level=info msg="connecting to shim 8687fc91b4b1a87e280216b1f24fbe07d796bfd218c759f10e5407a4d37f758f" address="unix:///run/containerd/s/d7b41dfbe27482c66950563c4c79fb9ae32d5497dd55d996d175714b602275fc" protocol=ttrpc version=3
Jul 15 05:17:26.559778 systemd[1]: Started cri-containerd-8687fc91b4b1a87e280216b1f24fbe07d796bfd218c759f10e5407a4d37f758f.scope - libcontainer container 8687fc91b4b1a87e280216b1f24fbe07d796bfd218c759f10e5407a4d37f758f.
Jul 15 05:17:26.615817 containerd[1607]: time="2025-07-15T05:17:26.615609212Z" level=info msg="StartContainer for \"8687fc91b4b1a87e280216b1f24fbe07d796bfd218c759f10e5407a4d37f758f\" returns successfully"
Jul 15 05:17:27.349022 kubelet[2756]: I0715 05:17:27.348984 2756 scope.go:117] "RemoveContainer" containerID="a7d7254876e2fbfd1b52863ff660b57bccd0f4e4918e1038bce5a66eb2ba6d50"
Jul 15 05:17:27.369931 containerd[1607]: time="2025-07-15T05:17:27.369886086Z" level=info msg="CreateContainer within sandbox \"0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jul 15 05:17:27.383371 containerd[1607]: time="2025-07-15T05:17:27.383333620Z" level=info msg="Container 01775db26aada34ad5e31951efd64257c441fbd5bbe75b6940d2612841302795: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:17:27.391856 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount960681441.mount: Deactivated successfully.
Jul 15 05:17:27.396266 containerd[1607]: time="2025-07-15T05:17:27.396219716Z" level=info msg="CreateContainer within sandbox \"0fa1fa9d9003d87c454cf681d348a10303092550a8f3802c1d480155082dac53\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"01775db26aada34ad5e31951efd64257c441fbd5bbe75b6940d2612841302795\""
Jul 15 05:17:27.397509 containerd[1607]: time="2025-07-15T05:17:27.397484021Z" level=info msg="StartContainer for \"01775db26aada34ad5e31951efd64257c441fbd5bbe75b6940d2612841302795\""
Jul 15 05:17:27.398685 containerd[1607]: time="2025-07-15T05:17:27.398305667Z" level=info msg="connecting to shim 01775db26aada34ad5e31951efd64257c441fbd5bbe75b6940d2612841302795" address="unix:///run/containerd/s/e4ccbfe6bc1d773351544be81badaf096360b6de9570aff28c73328799f7ce9c" protocol=ttrpc version=3
Jul 15 05:17:27.427818 systemd[1]: Started cri-containerd-01775db26aada34ad5e31951efd64257c441fbd5bbe75b6940d2612841302795.scope - libcontainer container 01775db26aada34ad5e31951efd64257c441fbd5bbe75b6940d2612841302795.
Jul 15 05:17:27.463546 containerd[1607]: time="2025-07-15T05:17:27.463511585Z" level=info msg="StartContainer for \"01775db26aada34ad5e31951efd64257c441fbd5bbe75b6940d2612841302795\" returns successfully"
Jul 15 05:17:28.823111 containerd[1607]: time="2025-07-15T05:17:28.823043139Z" level=info msg="TaskExit event in podsandbox handler container_id:\"813bfa48f38b587a7c0590c003d8d130aee8f0a17b088e38ed417cd2af52ab7b\" id:\"da18fa3ac274ba7b6b537d6376ce7d83bfe735d03762ce6cc5d3e6d7b3b4cd56\" pid:5973 exit_status:1 exited_at:{seconds:1752556648 nanos:822807989}"
Jul 15 05:17:30.860154 kubelet[2756]: E0715 05:17:30.840175 2756 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:50080->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4396-0-0-n-85c8113064.185254fa97090acb kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4396-0-0-n-85c8113064,UID:4efbf38801bb6c8a1de499bd6b9057eb,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4396-0-0-n-85c8113064,},FirstTimestamp:2025-07-15 05:17:20.253053643 +0000 UTC m=+167.990327924,LastTimestamp:2025-07-15 05:17:20.253053643 +0000 UTC m=+167.990327924,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396-0-0-n-85c8113064,}"
Jul 15 05:17:31.521103 systemd[1]: cri-containerd-c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2.scope: Deactivated successfully.
Jul 15 05:17:31.521387 systemd[1]: cri-containerd-c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2.scope: Consumed 2.162s CPU time, 41.6M memory peak, 61M read from disk.
Jul 15 05:17:31.524221 containerd[1607]: time="2025-07-15T05:17:31.524190504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2\" id:\"c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2\" pid:2608 exit_status:1 exited_at:{seconds:1752556651 nanos:523753255}"
Jul 15 05:17:31.524537 containerd[1607]: time="2025-07-15T05:17:31.524269383Z" level=info msg="received exit event container_id:\"c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2\" id:\"c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2\" pid:2608 exit_status:1 exited_at:{seconds:1752556651 nanos:523753255}"
Jul 15 05:17:31.549333 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2-rootfs.mount: Deactivated successfully.
Jul 15 05:17:32.365432 kubelet[2756]: I0715 05:17:32.365396 2756 scope.go:117] "RemoveContainer" containerID="c3e6fa61b5c3ccf5f1d5188cf19e7e9d52fa0d00a1778f31e5a617dc1acf63d2"
Jul 15 05:17:32.367238 containerd[1607]: time="2025-07-15T05:17:32.367188492Z" level=info msg="CreateContainer within sandbox \"5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jul 15 05:17:32.392313 containerd[1607]: time="2025-07-15T05:17:32.391883653Z" level=info msg="Container f41f3751046979686e6d3dd18571f496e7e8794013f227ffc9908df7a5872441: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:17:32.398414 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1075409568.mount: Deactivated successfully.
Jul 15 05:17:32.403586 containerd[1607]: time="2025-07-15T05:17:32.403494497Z" level=info msg="CreateContainer within sandbox \"5ee3287b61c4e2261ffd95f480d7587ee932d00998e422f62414315dbd5f096c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"f41f3751046979686e6d3dd18571f496e7e8794013f227ffc9908df7a5872441\""
Jul 15 05:17:32.404419 containerd[1607]: time="2025-07-15T05:17:32.404404493Z" level=info msg="StartContainer for \"f41f3751046979686e6d3dd18571f496e7e8794013f227ffc9908df7a5872441\""
Jul 15 05:17:32.405221 containerd[1607]: time="2025-07-15T05:17:32.405178431Z" level=info msg="connecting to shim f41f3751046979686e6d3dd18571f496e7e8794013f227ffc9908df7a5872441" address="unix:///run/containerd/s/02ee354128fa85f9cc8880d663c7ac090d23aeb08cc60f8491c98e0906779af6" protocol=ttrpc version=3
Jul 15 05:17:32.427806 systemd[1]: Started cri-containerd-f41f3751046979686e6d3dd18571f496e7e8794013f227ffc9908df7a5872441.scope - libcontainer container f41f3751046979686e6d3dd18571f496e7e8794013f227ffc9908df7a5872441.
Jul 15 05:17:32.472736 containerd[1607]: time="2025-07-15T05:17:32.472689691Z" level=info msg="StartContainer for \"f41f3751046979686e6d3dd18571f496e7e8794013f227ffc9908df7a5872441\" returns successfully"