Jul 7 00:36:02.802088 kernel: Linux version 6.12.35-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Sun Jul 6 21:58:13 -00 2025
Jul 7 00:36:02.802108 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e91aabf5a2d4674d97b8508f9502216224d5fb9433440e4c8f906b950e21abf8
Jul 7 00:36:02.802115 kernel: BIOS-provided physical RAM map:
Jul 7 00:36:02.802120 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jul 7 00:36:02.802125 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jul 7 00:36:02.802130 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jul 7 00:36:02.802137 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Jul 7 00:36:02.802141 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Jul 7 00:36:02.802146 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jul 7 00:36:02.802151 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jul 7 00:36:02.802155 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jul 7 00:36:02.802160 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jul 7 00:36:02.802165 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Jul 7 00:36:02.802170 kernel: NX (Execute Disable) protection: active
Jul 7 00:36:02.802177 kernel: APIC: Static calls initialized
Jul 7 00:36:02.802182 kernel: SMBIOS 3.0.0 present.
Jul 7 00:36:02.802187 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Jul 7 00:36:02.802192 kernel: DMI: Memory slots populated: 1/1
Jul 7 00:36:02.802197 kernel: Hypervisor detected: KVM
Jul 7 00:36:02.802202 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jul 7 00:36:02.802207 kernel: kvm-clock: using sched offset of 4013793513 cycles
Jul 7 00:36:02.802212 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jul 7 00:36:02.802219 kernel: tsc: Detected 2445.404 MHz processor
Jul 7 00:36:02.802225 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jul 7 00:36:02.802230 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jul 7 00:36:02.802235 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Jul 7 00:36:02.802241 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jul 7 00:36:02.802246 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jul 7 00:36:02.802251 kernel: Using GB pages for direct mapping
Jul 7 00:36:02.802257 kernel: ACPI: Early table checksum verification disabled
Jul 7 00:36:02.802262 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Jul 7 00:36:02.802268 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:36:02.802273 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:36:02.802278 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:36:02.802284 kernel: ACPI: FACS 0x000000007CFE0000 000040
Jul 7 00:36:02.802289 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:36:02.802294 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:36:02.802299 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:36:02.802318 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 7 00:36:02.802323 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Jul 7 00:36:02.802332 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Jul 7 00:36:02.802337 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Jul 7 00:36:02.802343 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Jul 7 00:36:02.802348 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Jul 7 00:36:02.802354 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Jul 7 00:36:02.802360 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Jul 7 00:36:02.802365 kernel: No NUMA configuration found
Jul 7 00:36:02.802371 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Jul 7 00:36:02.802376 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Jul 7 00:36:02.802382 kernel: Zone ranges:
Jul 7 00:36:02.802387 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jul 7 00:36:02.802393 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Jul 7 00:36:02.802398 kernel: Normal empty
Jul 7 00:36:02.802403 kernel: Device empty
Jul 7 00:36:02.802409 kernel: Movable zone start for each node
Jul 7 00:36:02.802415 kernel: Early memory node ranges
Jul 7 00:36:02.802420 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jul 7 00:36:02.802426 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Jul 7 00:36:02.802431 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Jul 7 00:36:02.802441 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jul 7 00:36:02.802451 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jul 7 00:36:02.802463 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Jul 7 00:36:02.802473 kernel: ACPI: PM-Timer IO Port: 0x608
Jul 7 00:36:02.802508 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jul 7 00:36:02.802520 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jul 7 00:36:02.802525 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jul 7 00:36:02.802531 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jul 7 00:36:02.802536 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jul 7 00:36:02.802542 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jul 7 00:36:02.802547 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jul 7 00:36:02.802553 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jul 7 00:36:02.802558 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Jul 7 00:36:02.802564 kernel: CPU topo: Max. logical packages: 1
Jul 7 00:36:02.802571 kernel: CPU topo: Max. logical dies: 1
Jul 7 00:36:02.802576 kernel: CPU topo: Max. dies per package: 1
Jul 7 00:36:02.802581 kernel: CPU topo: Max. threads per core: 1
Jul 7 00:36:02.802587 kernel: CPU topo: Num. cores per package: 2
Jul 7 00:36:02.802593 kernel: CPU topo: Num. threads per package: 2
Jul 7 00:36:02.802598 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jul 7 00:36:02.802603 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jul 7 00:36:02.802609 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jul 7 00:36:02.802614 kernel: Booting paravirtualized kernel on KVM
Jul 7 00:36:02.802620 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jul 7 00:36:02.802627 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jul 7 00:36:02.802632 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jul 7 00:36:02.802638 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jul 7 00:36:02.802643 kernel: pcpu-alloc: [0] 0 1
Jul 7 00:36:02.802648 kernel: kvm-guest: PV spinlocks disabled, no host support
Jul 7 00:36:02.802655 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e91aabf5a2d4674d97b8508f9502216224d5fb9433440e4c8f906b950e21abf8
Jul 7 00:36:02.802661 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jul 7 00:36:02.802667 kernel: random: crng init done
Jul 7 00:36:02.802673 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jul 7 00:36:02.802679 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jul 7 00:36:02.802684 kernel: Fallback order for Node 0: 0
Jul 7 00:36:02.802689 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Jul 7 00:36:02.802695 kernel: Policy zone: DMA32
Jul 7 00:36:02.802700 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jul 7 00:36:02.802706 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jul 7 00:36:02.802711 kernel: ftrace: allocating 40095 entries in 157 pages
Jul 7 00:36:02.802717 kernel: ftrace: allocated 157 pages with 5 groups
Jul 7 00:36:02.802723 kernel: Dynamic Preempt: voluntary
Jul 7 00:36:02.802729 kernel: rcu: Preemptible hierarchical RCU implementation.
Jul 7 00:36:02.802735 kernel: rcu: RCU event tracing is enabled.
Jul 7 00:36:02.802741 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jul 7 00:36:02.802746 kernel: Trampoline variant of Tasks RCU enabled.
Jul 7 00:36:02.802752 kernel: Rude variant of Tasks RCU enabled.
Jul 7 00:36:02.802757 kernel: Tracing variant of Tasks RCU enabled.
Jul 7 00:36:02.802763 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jul 7 00:36:02.802768 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jul 7 00:36:02.802774 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 7 00:36:02.802781 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 7 00:36:02.802786 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 7 00:36:02.802792 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jul 7 00:36:02.802797 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jul 7 00:36:02.802803 kernel: Console: colour VGA+ 80x25
Jul 7 00:36:02.802808 kernel: printk: legacy console [tty0] enabled
Jul 7 00:36:02.802814 kernel: printk: legacy console [ttyS0] enabled
Jul 7 00:36:02.802820 kernel: ACPI: Core revision 20240827
Jul 7 00:36:02.802830 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Jul 7 00:36:02.802836 kernel: APIC: Switch to symmetric I/O mode setup
Jul 7 00:36:02.802842 kernel: x2apic enabled
Jul 7 00:36:02.802848 kernel: APIC: Switched APIC routing to: physical x2apic
Jul 7 00:36:02.802854 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jul 7 00:36:02.802860 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns
Jul 7 00:36:02.802866 kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
Jul 7 00:36:02.802872 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jul 7 00:36:02.802878 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jul 7 00:36:02.802885 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jul 7 00:36:02.802890 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jul 7 00:36:02.802896 kernel: Spectre V2 : Mitigation: Retpolines
Jul 7 00:36:02.802902 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jul 7 00:36:02.802908 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jul 7 00:36:02.802913 kernel: RETBleed: Mitigation: untrained return thunk
Jul 7 00:36:02.802919 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jul 7 00:36:02.802925 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jul 7 00:36:02.802932 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jul 7 00:36:02.802938 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jul 7 00:36:02.802944 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jul 7 00:36:02.802949 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jul 7 00:36:02.802955 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jul 7 00:36:02.802961 kernel: Freeing SMP alternatives memory: 32K
Jul 7 00:36:02.802967 kernel: pid_max: default: 32768 minimum: 301
Jul 7 00:36:02.802972 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jul 7 00:36:02.802978 kernel: landlock: Up and running.
Jul 7 00:36:02.802985 kernel: SELinux: Initializing.
Jul 7 00:36:02.802990 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 7 00:36:02.802996 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 7 00:36:02.803002 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jul 7 00:36:02.803008 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jul 7 00:36:02.803013 kernel: ... version:                0
Jul 7 00:36:02.803019 kernel: ... bit width:              48
Jul 7 00:36:02.803025 kernel: ... generic registers:      6
Jul 7 00:36:02.803030 kernel: ... value mask:             0000ffffffffffff
Jul 7 00:36:02.803037 kernel: ... max period:             00007fffffffffff
Jul 7 00:36:02.803043 kernel: ... fixed-purpose events:   0
Jul 7 00:36:02.803048 kernel: ... event mask:             000000000000003f
Jul 7 00:36:02.803054 kernel: signal: max sigframe size: 1776
Jul 7 00:36:02.803060 kernel: rcu: Hierarchical SRCU implementation.
Jul 7 00:36:02.803066 kernel: rcu: Max phase no-delay instances is 400.
Jul 7 00:36:02.803071 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jul 7 00:36:02.803077 kernel: smp: Bringing up secondary CPUs ...
Jul 7 00:36:02.803083 kernel: smpboot: x86: Booting SMP configuration:
Jul 7 00:36:02.803090 kernel: .... node #0, CPUs: #1
Jul 7 00:36:02.803095 kernel: smp: Brought up 1 node, 2 CPUs
Jul 7 00:36:02.803101 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS)
Jul 7 00:36:02.803107 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54432K init, 2536K bss, 125140K reserved, 0K cma-reserved)
Jul 7 00:36:02.803113 kernel: devtmpfs: initialized
Jul 7 00:36:02.803119 kernel: x86/mm: Memory block size: 128MB
Jul 7 00:36:02.803125 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jul 7 00:36:02.803131 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jul 7 00:36:02.803136 kernel: pinctrl core: initialized pinctrl subsystem
Jul 7 00:36:02.803143 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jul 7 00:36:02.803149 kernel: audit: initializing netlink subsys (disabled)
Jul 7 00:36:02.803155 kernel: audit: type=2000 audit(1751848560.247:1): state=initialized audit_enabled=0 res=1
Jul 7 00:36:02.803161 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jul 7 00:36:02.803166 kernel: thermal_sys: Registered thermal governor 'user_space'
Jul 7 00:36:02.803172 kernel: cpuidle: using governor menu
Jul 7 00:36:02.803178 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jul 7 00:36:02.803184 kernel: dca service started, version 1.12.1
Jul 7 00:36:02.803190 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Jul 7 00:36:02.803197 kernel: PCI: Using configuration type 1 for base access
Jul 7 00:36:02.803203 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jul 7 00:36:02.803209 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jul 7 00:36:02.803215 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jul 7 00:36:02.803220 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jul 7 00:36:02.803226 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jul 7 00:36:02.803232 kernel: ACPI: Added _OSI(Module Device)
Jul 7 00:36:02.803238 kernel: ACPI: Added _OSI(Processor Device)
Jul 7 00:36:02.803243 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jul 7 00:36:02.803250 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jul 7 00:36:02.803256 kernel: ACPI: Interpreter enabled
Jul 7 00:36:02.803261 kernel: ACPI: PM: (supports S0 S5)
Jul 7 00:36:02.803267 kernel: ACPI: Using IOAPIC for interrupt routing
Jul 7 00:36:02.803273 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jul 7 00:36:02.803279 kernel: PCI: Using E820 reservations for host bridge windows
Jul 7 00:36:02.803284 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jul 7 00:36:02.803290 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jul 7 00:36:02.803409 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jul 7 00:36:02.805611 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jul 7 00:36:02.805684 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jul 7 00:36:02.805693 kernel: PCI host bridge to bus 0000:00
Jul 7 00:36:02.805758 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jul 7 00:36:02.805811 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jul 7 00:36:02.805862 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jul 7 00:36:02.805917 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Jul 7 00:36:02.805966 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jul 7 00:36:02.806016 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Jul 7 00:36:02.806065 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jul 7 00:36:02.806135 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jul 7 00:36:02.806208 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jul 7 00:36:02.806274 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Jul 7 00:36:02.806351 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Jul 7 00:36:02.806412 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Jul 7 00:36:02.806562 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Jul 7 00:36:02.806634 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jul 7 00:36:02.806702 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.806762 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Jul 7 00:36:02.806825 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 7 00:36:02.806883 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Jul 7 00:36:02.806939 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Jul 7 00:36:02.807003 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.807061 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Jul 7 00:36:02.807118 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 7 00:36:02.807203 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Jul 7 00:36:02.807284 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jul 7 00:36:02.807363 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.807423 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Jul 7 00:36:02.807533 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 7 00:36:02.807598 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Jul 7 00:36:02.807656 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jul 7 00:36:02.807719 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.807783 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Jul 7 00:36:02.808998 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 7 00:36:02.809062 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Jul 7 00:36:02.809122 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jul 7 00:36:02.809192 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.809251 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Jul 7 00:36:02.809325 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 7 00:36:02.809390 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Jul 7 00:36:02.809472 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jul 7 00:36:02.810594 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.810659 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Jul 7 00:36:02.810719 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 7 00:36:02.810776 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Jul 7 00:36:02.810833 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jul 7 00:36:02.810897 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.810962 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Jul 7 00:36:02.811019 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 7 00:36:02.811076 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Jul 7 00:36:02.811133 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jul 7 00:36:02.811195 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.811257 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Jul 7 00:36:02.811327 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 7 00:36:02.811389 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Jul 7 00:36:02.811447 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jul 7 00:36:02.811580 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 7 00:36:02.811644 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Jul 7 00:36:02.811703 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 7 00:36:02.811759 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jul 7 00:36:02.811821 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jul 7 00:36:02.811884 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jul 7 00:36:02.811942 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jul 7 00:36:02.812005 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jul 7 00:36:02.812062 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Jul 7 00:36:02.812118 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Jul 7 00:36:02.812184 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jul 7 00:36:02.812242 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Jul 7 00:36:02.812323 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 7 00:36:02.812388 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Jul 7 00:36:02.812469 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Jul 7 00:36:02.814895 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Jul 7 00:36:02.814964 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 7 00:36:02.815413 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jul 7 00:36:02.815559 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Jul 7 00:36:02.815627 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 7 00:36:02.815698 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Jul 7 00:36:02.815760 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Jul 7 00:36:02.815821 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Jul 7 00:36:02.815878 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 7 00:36:02.815949 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jul 7 00:36:02.816009 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Jul 7 00:36:02.816067 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 7 00:36:02.816135 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jul 7 00:36:02.816194 kernel: pci 0000:05:00.0: BAR 1 [mem 0xfe000000-0xfe000fff]
Jul 7 00:36:02.816253 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Jul 7 00:36:02.816330 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 7 00:36:02.816399 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Jul 7 00:36:02.818497 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Jul 7 00:36:02.818589 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Jul 7 00:36:02.818652 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 7 00:36:02.818662 kernel: acpiphp: Slot [0] registered
Jul 7 00:36:02.818729 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 7 00:36:02.818797 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Jul 7 00:36:02.818859 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Jul 7 00:36:02.818918 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Jul 7 00:36:02.818976 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 7 00:36:02.818985 kernel: acpiphp: Slot [0-2] registered
Jul 7 00:36:02.819040 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 7 00:36:02.819049 kernel: acpiphp: Slot [0-3] registered
Jul 7 00:36:02.819104 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 7 00:36:02.819116 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jul 7 00:36:02.819122 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jul 7 00:36:02.819128 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jul 7 00:36:02.819133 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jul 7 00:36:02.819139 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jul 7 00:36:02.819145 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jul 7 00:36:02.819151 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jul 7 00:36:02.819157 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jul 7 00:36:02.819164 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jul 7 00:36:02.819170 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jul 7 00:36:02.819176 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jul 7 00:36:02.819181 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jul 7 00:36:02.819187 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jul 7 00:36:02.819193 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jul 7 00:36:02.819199 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jul 7 00:36:02.819205 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jul 7 00:36:02.819210 kernel: iommu: Default domain type: Translated
Jul 7 00:36:02.819217 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jul 7 00:36:02.819223 kernel: PCI: Using ACPI for IRQ routing
Jul 7 00:36:02.819229 kernel: PCI: pci_cache_line_size set to 64 bytes
Jul 7 00:36:02.819235 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jul 7 00:36:02.819240 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Jul 7 00:36:02.819298 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jul 7 00:36:02.819371 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jul 7 00:36:02.819431 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jul 7 00:36:02.819446 kernel: vgaarb: loaded
Jul 7 00:36:02.819462 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Jul 7 00:36:02.819473 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Jul 7 00:36:02.819676 kernel: clocksource: Switched to clocksource kvm-clock
Jul 7 00:36:02.819686 kernel: VFS: Disk quotas dquot_6.6.0
Jul 7 00:36:02.819693 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jul 7 00:36:02.819699 kernel: pnp: PnP ACPI init
Jul 7 00:36:02.819780 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jul 7 00:36:02.819791 kernel: pnp: PnP ACPI: found 5 devices
Jul 7 00:36:02.819800 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jul 7 00:36:02.819806 kernel: NET: Registered PF_INET protocol family
Jul 7 00:36:02.819812 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jul 7 00:36:02.819818 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jul 7 00:36:02.819824 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jul 7 00:36:02.819830 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jul 7 00:36:02.819836 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jul 7 00:36:02.819842 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jul 7 00:36:02.819848 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jul 7 00:36:02.819855 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jul 7 00:36:02.819861 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jul 7 00:36:02.819867 kernel: NET: Registered PF_XDP protocol family
Jul 7 00:36:02.819929 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jul 7 00:36:02.819988 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jul 7 00:36:02.820047 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jul 7 00:36:02.820105 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Jul 7 00:36:02.820162 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Jul 7 00:36:02.820224 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Jul 7 00:36:02.820293 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 7 00:36:02.820375 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Jul 7 00:36:02.820446 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Jul 7 00:36:02.820572 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 7 00:36:02.820651 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Jul 7 00:36:02.820716 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jul 7 00:36:02.820779 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 7 00:36:02.820837 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Jul 7 00:36:02.821063 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jul 7 00:36:02.821137 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 7 00:36:02.821199 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Jul 7 00:36:02.821259 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jul 7 00:36:02.821336 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 7 00:36:02.821398 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Jul 7 00:36:02.821705 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jul 7 00:36:02.821784 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 7 00:36:02.821845 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Jul 7 00:36:02.821903 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jul 7 00:36:02.821961 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 7 00:36:02.822019 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Jul 7 00:36:02.822077 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Jul 7 00:36:02.822138 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jul 7 00:36:02.822196 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 7 00:36:02.822254 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Jul 7 00:36:02.822329 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Jul 7 00:36:02.822391 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jul 7 00:36:02.824499 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 7 00:36:02.824584 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Jul 7 00:36:02.824646 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jul 7 00:36:02.824706 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jul 7 00:36:02.824768 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jul 7 00:36:02.824820 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jul 7 00:36:02.824871 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jul 7 00:36:02.824921 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Jul 7 00:36:02.824971 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Jul 7 00:36:02.825020 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Jul 7 00:36:02.825080 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Jul 7 00:36:02.825139 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Jul 7 00:36:02.825198 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Jul 7 00:36:02.825252 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Jul 7 00:36:02.825332 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Jul 7 00:36:02.825390 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jul 7 00:36:02.825456 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Jul 7 00:36:02.825561 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Jul 7 00:36:02.825629 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Jul 7 00:36:02.825688 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Jul 7 00:36:02.825748 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Jul 7 00:36:02.825806 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Jul 7 00:36:02.825865 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Jul 7 00:36:02.825920 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Jul 7 00:36:02.825977 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Jul 7 00:36:02.826036 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Jul 7 00:36:02.826090 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Jul 7 00:36:02.826142 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Jul 7 00:36:02.826201 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Jul 7 00:36:02.826254 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Jul 7 00:36:02.826345 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Jul 7 00:36:02.826363 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jul 7 00:36:02.826370 kernel: PCI: CLS 0 bytes, default 64
Jul 7 00:36:02.826377 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns
Jul 7 00:36:02.826383 kernel: Initialise system trusted keyrings
Jul 7 00:36:02.826389 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Jul 7 00:36:02.826395 kernel: Key type asymmetric registered
Jul 7 00:36:02.826401 kernel: Asymmetric key parser 'x509' registered
Jul 7 00:36:02.826407 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Jul 7 00:36:02.826416 kernel: io scheduler mq-deadline registered
Jul 7 00:36:02.826422 kernel: io scheduler kyber registered
Jul 7 00:36:02.826428 kernel: io scheduler bfq registered
Jul 7 00:36:02.826542 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Jul 7 00:36:02.826608 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Jul 7 00:36:02.826667 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Jul 7 00:36:02.826725 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Jul 7 00:36:02.826783 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Jul 7 00:36:02.826841 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Jul 7 00:36:02.826904 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Jul 7 00:36:02.826962 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Jul 7 00:36:02.827020 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Jul 7 00:36:02.827076 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Jul 7 00:36:02.827133 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Jul 7 00:36:02.827190 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Jul 7 00:36:02.827248 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Jul 7 00:36:02.827320 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Jul 7 00:36:02.827385 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Jul 7 00:36:02.827457 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Jul 7 00:36:02.827476 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jul 7 00:36:02.828955 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Jul 7 00:36:02.829195 kernel: pcieport 0000:00:03.0:
AER: enabled with IRQ 32 Jul 7 00:36:02.829230 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jul 7 00:36:02.829268 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jul 7 00:36:02.829278 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 7 00:36:02.829284 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jul 7 00:36:02.829291 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jul 7 00:36:02.829297 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jul 7 00:36:02.829314 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jul 7 00:36:02.829381 kernel: rtc_cmos 00:03: RTC can wake from S4 Jul 7 00:36:02.830596 kernel: rtc_cmos 00:03: registered as rtc0 Jul 7 00:36:02.830613 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 Jul 7 00:36:02.830672 kernel: rtc_cmos 00:03: setting system clock to 2025-07-07T00:36:02 UTC (1751848562) Jul 7 00:36:02.830727 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jul 7 00:36:02.830736 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jul 7 00:36:02.830742 kernel: NET: Registered PF_INET6 protocol family Jul 7 00:36:02.830749 kernel: Segment Routing with IPv6 Jul 7 00:36:02.830755 kernel: In-situ OAM (IOAM) with IPv6 Jul 7 00:36:02.830761 kernel: NET: Registered PF_PACKET protocol family Jul 7 00:36:02.830769 kernel: Key type dns_resolver registered Jul 7 00:36:02.830776 kernel: IPI shorthand broadcast: enabled Jul 7 00:36:02.830783 kernel: sched_clock: Marking stable (2894011016, 143914125)->(3043287657, -5362516) Jul 7 00:36:02.830789 kernel: registered taskstats version 1 Jul 7 00:36:02.830796 kernel: Loading compiled-in X.509 certificates Jul 7 00:36:02.830802 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.35-flatcar: 025c05e23c9778f7a70ff09fb369dd949499fb06' Jul 7 00:36:02.830809 kernel: Demotion targets for Node 0: null Jul 7 
00:36:02.830815 kernel: Key type .fscrypt registered Jul 7 00:36:02.830820 kernel: Key type fscrypt-provisioning registered Jul 7 00:36:02.830828 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 7 00:36:02.830834 kernel: ima: Allocated hash algorithm: sha1 Jul 7 00:36:02.830840 kernel: ima: No architecture policies found Jul 7 00:36:02.830846 kernel: clk: Disabling unused clocks Jul 7 00:36:02.830852 kernel: Warning: unable to open an initial console. Jul 7 00:36:02.830859 kernel: Freeing unused kernel image (initmem) memory: 54432K Jul 7 00:36:02.830865 kernel: Write protecting the kernel read-only data: 24576k Jul 7 00:36:02.830871 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jul 7 00:36:02.830877 kernel: Run /init as init process Jul 7 00:36:02.830884 kernel: with arguments: Jul 7 00:36:02.830890 kernel: /init Jul 7 00:36:02.830896 kernel: with environment: Jul 7 00:36:02.830902 kernel: HOME=/ Jul 7 00:36:02.830908 kernel: TERM=linux Jul 7 00:36:02.830914 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 7 00:36:02.830922 systemd[1]: Successfully made /usr/ read-only. Jul 7 00:36:02.830931 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 7 00:36:02.830939 systemd[1]: Detected virtualization kvm. Jul 7 00:36:02.830946 systemd[1]: Detected architecture x86-64. Jul 7 00:36:02.830952 systemd[1]: Running in initrd. Jul 7 00:36:02.830958 systemd[1]: No hostname configured, using default hostname. Jul 7 00:36:02.830965 systemd[1]: Hostname set to . Jul 7 00:36:02.830972 systemd[1]: Initializing machine ID from VM UUID. Jul 7 00:36:02.830979 systemd[1]: Queued start job for default target initrd.target. 
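The `rtc_cmos` entry above logs the same instant twice: as an ISO timestamp (`2025-07-07T00:36:02 UTC`) and as a Unix epoch (`1751848562`). A minimal sketch confirming the two forms agree:

```python
from datetime import datetime, timezone

# rtc_cmos logged both forms: 2025-07-07T00:36:02 UTC and epoch 1751848562.
ts = datetime.fromtimestamp(1751848562, tz=timezone.utc)
print(ts.isoformat())  # 2025-07-07T00:36:02+00:00
```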
Jul 7 00:36:02.830985 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 7 00:36:02.830993 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 7 00:36:02.831000 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 7 00:36:02.831007 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 7 00:36:02.831014 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 7 00:36:02.831021 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 7 00:36:02.831028 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 7 00:36:02.831036 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 7 00:36:02.831043 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 7 00:36:02.831050 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 7 00:36:02.831056 systemd[1]: Reached target paths.target - Path Units. Jul 7 00:36:02.831063 systemd[1]: Reached target slices.target - Slice Units. Jul 7 00:36:02.831069 systemd[1]: Reached target swap.target - Swaps. Jul 7 00:36:02.831076 systemd[1]: Reached target timers.target - Timer Units. Jul 7 00:36:02.831083 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 7 00:36:02.831089 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 7 00:36:02.831097 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 7 00:36:02.831104 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Jul 7 00:36:02.831110 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 7 00:36:02.831117 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 7 00:36:02.831123 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 7 00:36:02.831130 systemd[1]: Reached target sockets.target - Socket Units. Jul 7 00:36:02.831137 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 7 00:36:02.831143 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 7 00:36:02.831151 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 7 00:36:02.831158 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 7 00:36:02.831165 systemd[1]: Starting systemd-fsck-usr.service... Jul 7 00:36:02.831171 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 7 00:36:02.831178 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 7 00:36:02.831185 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:36:02.831191 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 7 00:36:02.831199 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 7 00:36:02.831206 systemd[1]: Finished systemd-fsck-usr.service. Jul 7 00:36:02.831228 systemd-journald[215]: Collecting audit messages is disabled. Jul 7 00:36:02.831247 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 7 00:36:02.831255 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 7 00:36:02.831263 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. 
Update your scripts to load br_netfilter if you need this. Jul 7 00:36:02.831270 systemd-journald[215]: Journal started Jul 7 00:36:02.831286 systemd-journald[215]: Runtime Journal (/run/log/journal/87096b9c841947a2875ffb2bed6c9bb2) is 4.8M, max 38.6M, 33.7M free. Jul 7 00:36:02.807992 systemd-modules-load[217]: Inserted module 'overlay' Jul 7 00:36:02.868091 kernel: Bridge firewalling registered Jul 7 00:36:02.868108 systemd[1]: Started systemd-journald.service - Journal Service. Jul 7 00:36:02.832069 systemd-modules-load[217]: Inserted module 'br_netfilter' Jul 7 00:36:02.868617 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 7 00:36:02.869445 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:36:02.871403 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 7 00:36:02.872975 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 7 00:36:02.877136 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 7 00:36:02.883257 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 7 00:36:02.889332 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 7 00:36:02.890815 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 7 00:36:02.893735 systemd-tmpfiles[234]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 7 00:36:02.896415 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 7 00:36:02.897014 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 7 00:36:02.899565 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 7 00:36:02.901366 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jul 7 00:36:02.912569 dracut-cmdline[254]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e91aabf5a2d4674d97b8508f9502216224d5fb9433440e4c8f906b950e21abf8 Jul 7 00:36:02.930765 systemd-resolved[255]: Positive Trust Anchors: Jul 7 00:36:02.931310 systemd-resolved[255]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 7 00:36:02.931336 systemd-resolved[255]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 7 00:36:02.935860 systemd-resolved[255]: Defaulting to hostname 'linux'. Jul 7 00:36:02.936590 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 7 00:36:02.937233 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 7 00:36:02.967520 kernel: SCSI subsystem initialized Jul 7 00:36:02.974518 kernel: Loading iSCSI transport class v2.0-870. Jul 7 00:36:02.983512 kernel: iscsi: registered transport (tcp) Jul 7 00:36:03.002519 kernel: iscsi: registered transport (qla4xxx) Jul 7 00:36:03.002557 kernel: QLogic iSCSI HBA Driver Jul 7 00:36:03.016570 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... 
Jul 7 00:36:03.027181 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 7 00:36:03.029327 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 7 00:36:03.054218 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 7 00:36:03.055864 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jul 7 00:36:03.095517 kernel: raid6: avx2x4 gen() 31829 MB/s Jul 7 00:36:03.112508 kernel: raid6: avx2x2 gen() 30322 MB/s Jul 7 00:36:03.129714 kernel: raid6: avx2x1 gen() 20720 MB/s Jul 7 00:36:03.129752 kernel: raid6: using algorithm avx2x4 gen() 31829 MB/s Jul 7 00:36:03.148757 kernel: raid6: .... xor() 4686 MB/s, rmw enabled Jul 7 00:36:03.148802 kernel: raid6: using avx2x2 recovery algorithm Jul 7 00:36:03.166520 kernel: xor: automatically using best checksumming function avx Jul 7 00:36:03.273523 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 7 00:36:03.277450 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 7 00:36:03.279216 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 7 00:36:03.300068 systemd-udevd[464]: Using default interface naming scheme 'v255'. Jul 7 00:36:03.304037 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 7 00:36:03.306990 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 7 00:36:03.325925 dracut-pre-trigger[473]: rd.md=0: removing MD RAID activation Jul 7 00:36:03.342066 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 7 00:36:03.344148 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 7 00:36:03.384969 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 7 00:36:03.390369 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
Jul 7 00:36:03.457512 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jul 7 00:36:03.466692 kernel: scsi host0: Virtio SCSI HBA Jul 7 00:36:03.469518 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jul 7 00:36:03.472524 kernel: cryptd: max_cpu_qlen set to 1000 Jul 7 00:36:03.476510 kernel: ACPI: bus type USB registered Jul 7 00:36:03.481516 kernel: usbcore: registered new interface driver usbfs Jul 7 00:36:03.484504 kernel: usbcore: registered new interface driver hub Jul 7 00:36:03.488509 kernel: usbcore: registered new device driver usb Jul 7 00:36:03.492814 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 7 00:36:03.494689 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:36:03.496043 kernel: AES CTR mode by8 optimization enabled Jul 7 00:36:03.501062 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:36:03.502133 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:36:03.545713 kernel: sd 0:0:0:0: Power-on or device reset occurred Jul 7 00:36:03.547543 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jul 7 00:36:03.547666 kernel: sd 0:0:0:0: [sda] Write Protect is off Jul 7 00:36:03.547745 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Jul 7 00:36:03.548509 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jul 7 00:36:03.555535 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 7 00:36:03.555562 kernel: GPT:17805311 != 80003071 Jul 7 00:36:03.555571 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 7 00:36:03.555579 kernel: GPT:17805311 != 80003071 Jul 7 00:36:03.555586 kernel: GPT: Use GNU Parted to correct GPT errors. 
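The repeated `GPT:17805311 != 80003071` warnings above mean the backup GPT header still sits at the last LBA of the original (smaller) disk image rather than at the end of the provisioned disk, which is typical after a cloud provider grows the virtual disk. A sketch of that arithmetic, using the sector counts taken from the log (the kernel's suggested fix is to relocate the backup header, e.g. with GNU Parted or sgdisk):

```python
SECTOR = 512  # the log reports 512-byte logical blocks

# Values from the log: where the backup GPT header is vs. where it should be.
backup_lba = 17805311          # last LBA of the original image
last_lba = 80003072 - 1        # last LBA of the grown 41.0 GB disk

assert backup_lba != last_lba  # hence the kernel's warning
grown_by = (last_lba - backup_lba) * SECTOR
print(f"disk grew by {grown_by / 2**30:.1f} GiB")
```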
Jul 7 00:36:03.555593 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 7 00:36:03.555600 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jul 7 00:36:03.561522 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 7 00:36:03.561650 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jul 7 00:36:03.562511 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jul 7 00:36:03.562619 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 7 00:36:03.562698 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jul 7 00:36:03.562775 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jul 7 00:36:03.563539 kernel: hub 1-0:1.0: USB hub found Jul 7 00:36:03.563649 kernel: hub 1-0:1.0: 4 ports detected Jul 7 00:36:03.563729 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jul 7 00:36:03.563818 kernel: hub 2-0:1.0: USB hub found Jul 7 00:36:03.563895 kernel: hub 2-0:1.0: 4 ports detected Jul 7 00:36:03.567510 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Jul 7 00:36:03.574501 kernel: libata version 3.00 loaded. 
Jul 7 00:36:03.581520 kernel: ahci 0000:00:1f.2: version 3.0 Jul 7 00:36:03.581637 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jul 7 00:36:03.582529 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jul 7 00:36:03.582635 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jul 7 00:36:03.582715 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jul 7 00:36:03.593509 kernel: scsi host1: ahci Jul 7 00:36:03.594503 kernel: scsi host2: ahci Jul 7 00:36:03.594624 kernel: scsi host3: ahci Jul 7 00:36:03.594714 kernel: scsi host4: ahci Jul 7 00:36:03.594793 kernel: scsi host5: ahci Jul 7 00:36:03.595511 kernel: scsi host6: ahci Jul 7 00:36:03.595681 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 51 lpm-pol 0 Jul 7 00:36:03.595691 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 51 lpm-pol 0 Jul 7 00:36:03.595787 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 51 lpm-pol 0 Jul 7 00:36:03.595799 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 51 lpm-pol 0 Jul 7 00:36:03.595807 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 51 lpm-pol 0 Jul 7 00:36:03.595983 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 51 lpm-pol 0 Jul 7 00:36:03.627101 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jul 7 00:36:03.666127 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:36:03.673830 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jul 7 00:36:03.680997 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 7 00:36:03.686859 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. 
Jul 7 00:36:03.687362 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jul 7 00:36:03.690049 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 7 00:36:03.709604 disk-uuid[630]: Primary Header is updated. Jul 7 00:36:03.709604 disk-uuid[630]: Secondary Entries is updated. Jul 7 00:36:03.709604 disk-uuid[630]: Secondary Header is updated. Jul 7 00:36:03.716513 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 7 00:36:03.736507 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 7 00:36:03.803019 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jul 7 00:36:03.903502 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jul 7 00:36:03.903558 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jul 7 00:36:03.905572 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jul 7 00:36:03.909305 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jul 7 00:36:03.909323 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jul 7 00:36:03.909334 kernel: ata1.00: applying bridge limits Jul 7 00:36:03.909495 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jul 7 00:36:03.911508 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jul 7 00:36:03.912506 kernel: ata1.00: configured for UDMA/100 Jul 7 00:36:03.913783 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jul 7 00:36:03.941511 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 7 00:36:03.946065 kernel: usbcore: registered new interface driver usbhid Jul 7 00:36:03.946102 kernel: usbhid: USB HID core driver Jul 7 00:36:03.952762 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4 Jul 7 00:36:03.952787 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jul 7 00:36:03.956769 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw 
xa/form2 tray Jul 7 00:36:03.957656 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 7 00:36:03.979508 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 Jul 7 00:36:04.250219 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 7 00:36:04.251117 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 7 00:36:04.252064 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 7 00:36:04.253225 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 7 00:36:04.255346 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 7 00:36:04.277435 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jul 7 00:36:04.734612 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 7 00:36:04.735614 disk-uuid[631]: The operation has completed successfully. Jul 7 00:36:04.781833 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 7 00:36:04.781914 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 7 00:36:04.811931 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 7 00:36:04.827426 sh[665]: Success Jul 7 00:36:04.844880 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 7 00:36:04.844963 kernel: device-mapper: uevent: version 1.0.3 Jul 7 00:36:04.844980 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 7 00:36:04.854516 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Jul 7 00:36:04.892740 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jul 7 00:36:04.895548 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 7 00:36:04.913649 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Jul 7 00:36:04.923670 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 7 00:36:04.923702 kernel: BTRFS: device fsid 9d729180-1373-4e9f-840c-4db0e9220239 devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (677) Jul 7 00:36:04.927800 kernel: BTRFS info (device dm-0): first mount of filesystem 9d729180-1373-4e9f-840c-4db0e9220239 Jul 7 00:36:04.927840 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jul 7 00:36:04.929823 kernel: BTRFS info (device dm-0): using free-space-tree Jul 7 00:36:04.938129 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 7 00:36:04.939028 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 7 00:36:04.939831 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 7 00:36:04.940527 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 7 00:36:04.944561 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 7 00:36:04.970517 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (712) Jul 7 00:36:04.972509 kernel: BTRFS info (device sda6): first mount of filesystem a5b10ed8-ad12-45a6-8115-f8814df6901b Jul 7 00:36:04.975187 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 7 00:36:04.975209 kernel: BTRFS info (device sda6): using free-space-tree Jul 7 00:36:04.982523 kernel: BTRFS info (device sda6): last unmount of filesystem a5b10ed8-ad12-45a6-8115-f8814df6901b Jul 7 00:36:04.983569 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 7 00:36:04.985002 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jul 7 00:36:05.031945 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jul 7 00:36:05.034983 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 7 00:36:05.078034 systemd-networkd[846]: lo: Link UP Jul 7 00:36:05.078058 ignition[783]: Ignition 2.21.0 Jul 7 00:36:05.078065 ignition[783]: Stage: fetch-offline Jul 7 00:36:05.079043 systemd-networkd[846]: lo: Gained carrier Jul 7 00:36:05.078089 ignition[783]: no configs at "/usr/lib/ignition/base.d" Jul 7 00:36:05.079547 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jul 7 00:36:05.078095 ignition[783]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 7 00:36:05.082006 systemd-networkd[846]: Enumeration completed Jul 7 00:36:05.078159 ignition[783]: parsed url from cmdline: "" Jul 7 00:36:05.082171 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 7 00:36:05.078162 ignition[783]: no config URL provided Jul 7 00:36:05.082821 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:36:05.078165 ignition[783]: reading system config file "/usr/lib/ignition/user.ign" Jul 7 00:36:05.082824 systemd-networkd[846]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 7 00:36:05.078170 ignition[783]: no config at "/usr/lib/ignition/user.ign" Jul 7 00:36:05.083324 systemd-networkd[846]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:36:05.078174 ignition[783]: failed to fetch config: resource requires networking Jul 7 00:36:05.083327 systemd-networkd[846]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 7 00:36:05.078436 ignition[783]: Ignition finished successfully Jul 7 00:36:05.083836 systemd[1]: Reached target network.target - Network. 
Jul 7 00:36:05.084098 systemd-networkd[846]: eth0: Link UP
Jul 7 00:36:05.084101 systemd-networkd[846]: eth0: Gained carrier
Jul 7 00:36:05.084107 systemd-networkd[846]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 7 00:36:05.086608 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jul 7 00:36:05.088663 systemd-networkd[846]: eth1: Link UP
Jul 7 00:36:05.088666 systemd-networkd[846]: eth1: Gained carrier
Jul 7 00:36:05.088673 systemd-networkd[846]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 7 00:36:05.105021 ignition[855]: Ignition 2.21.0
Jul 7 00:36:05.105035 ignition[855]: Stage: fetch
Jul 7 00:36:05.105148 ignition[855]: no configs at "/usr/lib/ignition/base.d"
Jul 7 00:36:05.105156 ignition[855]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 7 00:36:05.105218 ignition[855]: parsed url from cmdline: ""
Jul 7 00:36:05.105220 ignition[855]: no config URL provided
Jul 7 00:36:05.105224 ignition[855]: reading system config file "/usr/lib/ignition/user.ign"
Jul 7 00:36:05.105229 ignition[855]: no config at "/usr/lib/ignition/user.ign"
Jul 7 00:36:05.105386 ignition[855]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Jul 7 00:36:05.105566 ignition[855]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Jul 7 00:36:05.126560 systemd-networkd[846]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Jul 7 00:36:05.153553 systemd-networkd[846]: eth0: DHCPv4 address 65.108.89.120/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jul 7 00:36:05.305890 ignition[855]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Jul 7 00:36:05.310250 ignition[855]: GET result: OK
Jul 7 00:36:05.310343 ignition[855]: parsing config with SHA512: b8bf073790b8bd94503e0484d851a7e50815fec91bd18151fba037d5a7a7312c47dfa8901d738ea000c199fbb4b282a5a7393dc89bad6092647a4adfc6f9b539
Jul 7 00:36:05.314642 unknown[855]: fetched base config from "system"
Jul 7 00:36:05.314658 unknown[855]: fetched base config from "system"
Jul 7 00:36:05.314667 unknown[855]: fetched user config from "hetzner"
Jul 7 00:36:05.315040 ignition[855]: fetch: fetch complete
Jul 7 00:36:05.315047 ignition[855]: fetch: fetch passed
Jul 7 00:36:05.315101 ignition[855]: Ignition finished successfully
Jul 7 00:36:05.317515 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jul 7 00:36:05.319375 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jul 7 00:36:05.356346 ignition[862]: Ignition 2.21.0
Jul 7 00:36:05.356357 ignition[862]: Stage: kargs
Jul 7 00:36:05.356468 ignition[862]: no configs at "/usr/lib/ignition/base.d"
Jul 7 00:36:05.356476 ignition[862]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 7 00:36:05.359137 ignition[862]: kargs: kargs passed
Jul 7 00:36:05.359181 ignition[862]: Ignition finished successfully
Jul 7 00:36:05.360121 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jul 7 00:36:05.361681 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jul 7 00:36:05.382659 ignition[869]: Ignition 2.21.0
Jul 7 00:36:05.382674 ignition[869]: Stage: disks
Jul 7 00:36:05.382831 ignition[869]: no configs at "/usr/lib/ignition/base.d"
Jul 7 00:36:05.382844 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 7 00:36:05.383816 ignition[869]: disks: disks passed
Jul 7 00:36:05.383864 ignition[869]: Ignition finished successfully
Jul 7 00:36:05.384953 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jul 7 00:36:05.386254 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jul 7 00:36:05.386973 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
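The fetch stage above retries the metadata GET (attempt #1 fails with "network is unreachable" before DHCP completes, attempt #2 succeeds) and then logs a SHA512 of the retrieved config before parsing it. A small Python sketch of that retry-then-digest pattern; the `get` callable is injected so the sketch stays self-contained, and the retry parameters are assumptions, not the agent's real backoff policy:

```python
import hashlib
import time

def fetch_with_retry(get, url, attempts=5, delay=0.2):
    """Retry a GET until it succeeds, mirroring the 'attempt #1 ...
    attempt #2' lines above. The real agent talks to
    http://169.254.169.254/hetzner/v1/userdata."""
    last_err = None
    for _ in range(attempts):
        try:
            return get(url)
        except OSError as err:      # e.g. "network is unreachable"
            last_err = err
            time.sleep(delay)
    raise last_err

def config_digest(raw: bytes) -> str:
    # The log prints the SHA512 of the fetched config before parsing it.
    return hashlib.sha512(raw).hexdigest()
```

A transiently failing `get` returns on the second call, just as the log's attempt #2 does once the interfaces have DHCPv4 leases.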
Jul 7 00:36:05.388108 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 7 00:36:05.389179 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 7 00:36:05.390448 systemd[1]: Reached target basic.target - Basic System.
Jul 7 00:36:05.392477 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jul 7 00:36:05.414559 systemd-fsck[878]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Jul 7 00:36:05.417939 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jul 7 00:36:05.419385 systemd[1]: Mounting sysroot.mount - /sysroot...
Jul 7 00:36:05.504505 kernel: EXT4-fs (sda9): mounted filesystem 98c55dfc-aac4-4fdd-8ec0-1f5587b3aa36 r/w with ordered data mode. Quota mode: none.
Jul 7 00:36:05.504913 systemd[1]: Mounted sysroot.mount - /sysroot.
Jul 7 00:36:05.505699 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jul 7 00:36:05.507251 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 7 00:36:05.509542 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jul 7 00:36:05.517570 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jul 7 00:36:05.519094 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jul 7 00:36:05.519124 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 7 00:36:05.523048 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jul 7 00:36:05.524878 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jul 7 00:36:05.532700 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (886)
Jul 7 00:36:05.538545 kernel: BTRFS info (device sda6): first mount of filesystem a5b10ed8-ad12-45a6-8115-f8814df6901b
Jul 7 00:36:05.538580 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 7 00:36:05.538597 kernel: BTRFS info (device sda6): using free-space-tree
Jul 7 00:36:05.546694 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 7 00:36:05.571996 coreos-metadata[888]: Jul 07 00:36:05.571 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Jul 7 00:36:05.573287 initrd-setup-root[913]: cut: /sysroot/etc/passwd: No such file or directory
Jul 7 00:36:05.574755 coreos-metadata[888]: Jul 07 00:36:05.573 INFO Fetch successful
Jul 7 00:36:05.574755 coreos-metadata[888]: Jul 07 00:36:05.574 INFO wrote hostname ci-4344-1-1-6-69f6cda1f4 to /sysroot/etc/hostname
Jul 7 00:36:05.576112 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 7 00:36:05.578521 initrd-setup-root[920]: cut: /sysroot/etc/group: No such file or directory
Jul 7 00:36:05.579939 initrd-setup-root[928]: cut: /sysroot/etc/shadow: No such file or directory
Jul 7 00:36:05.582375 initrd-setup-root[935]: cut: /sysroot/etc/gshadow: No such file or directory
Jul 7 00:36:05.641830 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jul 7 00:36:05.643311 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jul 7 00:36:05.644852 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jul 7 00:36:05.661563 kernel: BTRFS info (device sda6): last unmount of filesystem a5b10ed8-ad12-45a6-8115-f8814df6901b
Jul 7 00:36:05.672819 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
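The coreos-metadata lines above fetch the hostname from the Hetzner metadata endpoint and write it to /sysroot/etc/hostname. A hedged Python sketch of just the write step (the fetch agent is coreos-metadata; this helper is illustrative, not its code):

```python
from pathlib import Path

def write_hostname(sysroot: str, hostname: str) -> Path:
    """Write `hostname` to <sysroot>/etc/hostname with a trailing newline,
    as in the 'wrote hostname ... to /sysroot/etc/hostname' line above."""
    target = Path(sysroot) / "etc" / "hostname"
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(hostname.strip() + "\n")
    return target
```

Doing this inside the initramfs, before switch-root, is what lets the real root boot with the cloud-assigned hostname already in place.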
Jul 7 00:36:05.679240 ignition[1003]: INFO : Ignition 2.21.0
Jul 7 00:36:05.680354 ignition[1003]: INFO : Stage: mount
Jul 7 00:36:05.680354 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 7 00:36:05.680354 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 7 00:36:05.681630 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jul 7 00:36:05.682884 ignition[1003]: INFO : mount: mount passed
Jul 7 00:36:05.682884 ignition[1003]: INFO : Ignition finished successfully
Jul 7 00:36:05.683234 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jul 7 00:36:05.922993 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jul 7 00:36:05.924696 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 7 00:36:05.957520 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 (8:6) scanned by mount (1015)
Jul 7 00:36:05.957563 kernel: BTRFS info (device sda6): first mount of filesystem a5b10ed8-ad12-45a6-8115-f8814df6901b
Jul 7 00:36:05.959729 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 7 00:36:05.962245 kernel: BTRFS info (device sda6): using free-space-tree
Jul 7 00:36:05.967246 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 7 00:36:05.993765 ignition[1031]: INFO : Ignition 2.21.0
Jul 7 00:36:05.993765 ignition[1031]: INFO : Stage: files
Jul 7 00:36:05.994844 ignition[1031]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 7 00:36:05.994844 ignition[1031]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 7 00:36:05.994844 ignition[1031]: DEBUG : files: compiled without relabeling support, skipping
Jul 7 00:36:05.996811 ignition[1031]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jul 7 00:36:05.996811 ignition[1031]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jul 7 00:36:05.998282 unknown[1031]: wrote ssh authorized keys file for user: core
Jul 7 00:36:05.998528 ignition[1031]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jul 7 00:36:05.998528 ignition[1031]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jul 7 00:36:05.998528 ignition[1031]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jul 7 00:36:06.001635 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Jul 7 00:36:06.001635 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
Jul 7 00:36:06.143708 systemd-networkd[846]: eth1: Gained IPv6LL
Jul 7 00:36:06.238540 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jul 7 00:36:07.103600 systemd-networkd[846]: eth0: Gained IPv6LL
Jul 7 00:36:08.234039 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 7 00:36:08.235376 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 7 00:36:08.241633 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jul 7 00:36:08.241633 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jul 7 00:36:08.241633 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jul 7 00:36:08.241633 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jul 7 00:36:08.241633 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jul 7 00:36:08.241633 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1
Jul 7 00:36:09.051522 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jul 7 00:36:09.213316 ignition[1031]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw"
Jul 7 00:36:09.213316 ignition[1031]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jul 7 00:36:09.215131 ignition[1031]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 7 00:36:09.216193 ignition[1031]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 7 00:36:09.216193 ignition[1031]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jul 7 00:36:09.216193 ignition[1031]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Jul 7 00:36:09.218012 systemd[1]: Finished ignition-files.service - Ignition (files).
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jul 7 00:36:09.220900 ignition[1031]: INFO : files: files passed
Jul 7 00:36:09.220900 ignition[1031]: INFO : Ignition finished successfully
Jul 7 00:36:09.222595 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jul 7 00:36:09.225613 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jul 7 00:36:09.239266 systemd[1]: ignition-quench.service: Deactivated successfully.
Jul 7 00:36:09.239375 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jul 7 00:36:09.244831 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 7 00:36:09.245598 initrd-setup-root-after-ignition[1062]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jul 7 00:36:09.246714 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 7 00:36:09.248079 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 7 00:36:09.249026 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jul 7 00:36:09.250689 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jul 7 00:36:09.284733 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jul 7 00:36:09.284831 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jul 7 00:36:09.286121 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jul 7 00:36:09.287058 systemd[1]: Reached target initrd.target - Initrd Default Target.
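The "setting preset to enabled" lines in the files stage amount, for a unit installed with WantedBy=multi-user.target, to creating a symlink in the target's .wants directory. A rough Python illustration under that assumption (the real work is done by systemd's preset machinery, not this helper; the /sysroot/etc/systemd/system layout is taken from the log):

```python
from pathlib import Path

def enable_unit(sysroot: str, unit: str, target: str = "multi-user.target") -> Path:
    """Create <sysroot>/etc/systemd/system/<target>.wants/<unit> pointing at
    the unit file, the on-disk effect of enabling a WantedBy= unit.
    Simplified sketch; real presets consult *.preset files and the unit's
    [Install] section."""
    unit_file = Path(sysroot) / "etc/systemd/system" / unit
    wants_dir = Path(sysroot) / "etc/systemd/system" / (target + ".wants")
    wants_dir.mkdir(parents=True, exist_ok=True)
    link = wants_dir / unit
    if not link.is_symlink():
        link.symlink_to(unit_file)
    return link
```

Because only the symlink matters, enabling works against the offline /sysroot tree before the first switch-root, which is exactly when Ignition runs.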
Jul 7 00:36:09.288236 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jul 7 00:36:09.288996 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jul 7 00:36:09.319174 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 7 00:36:09.321769 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jul 7 00:36:09.341341 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jul 7 00:36:09.342880 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 7 00:36:09.343631 systemd[1]: Stopped target timers.target - Timer Units.
Jul 7 00:36:09.344835 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jul 7 00:36:09.344956 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 7 00:36:09.346544 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jul 7 00:36:09.347392 systemd[1]: Stopped target basic.target - Basic System.
Jul 7 00:36:09.348538 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jul 7 00:36:09.349560 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 7 00:36:09.350616 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jul 7 00:36:09.351683 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jul 7 00:36:09.352741 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jul 7 00:36:09.353921 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 7 00:36:09.354992 systemd[1]: Stopped target sysinit.target - System Initialization.
Jul 7 00:36:09.356075 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jul 7 00:36:09.357388 systemd[1]: Stopped target swap.target - Swaps.
Jul 7 00:36:09.358607 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jul 7 00:36:09.358783 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jul 7 00:36:09.360058 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jul 7 00:36:09.360915 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 7 00:36:09.361888 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jul 7 00:36:09.362178 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 7 00:36:09.363175 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jul 7 00:36:09.363359 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jul 7 00:36:09.364711 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jul 7 00:36:09.364901 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 7 00:36:09.366100 systemd[1]: ignition-files.service: Deactivated successfully.
Jul 7 00:36:09.366279 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jul 7 00:36:09.367273 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jul 7 00:36:09.367457 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 7 00:36:09.370586 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jul 7 00:36:09.374179 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jul 7 00:36:09.376095 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jul 7 00:36:09.376304 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 7 00:36:09.379922 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jul 7 00:36:09.380048 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 7 00:36:09.385939 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jul 7 00:36:09.386027 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jul 7 00:36:09.397273 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jul 7 00:36:09.399344 ignition[1086]: INFO : Ignition 2.21.0
Jul 7 00:36:09.399344 ignition[1086]: INFO : Stage: umount
Jul 7 00:36:09.399344 ignition[1086]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 7 00:36:09.399344 ignition[1086]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 7 00:36:09.399344 ignition[1086]: INFO : umount: umount passed
Jul 7 00:36:09.399344 ignition[1086]: INFO : Ignition finished successfully
Jul 7 00:36:09.400745 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jul 7 00:36:09.400869 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jul 7 00:36:09.402150 systemd[1]: ignition-mount.service: Deactivated successfully.
Jul 7 00:36:09.402260 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jul 7 00:36:09.403697 systemd[1]: ignition-disks.service: Deactivated successfully.
Jul 7 00:36:09.403764 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jul 7 00:36:09.404692 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jul 7 00:36:09.404734 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jul 7 00:36:09.405786 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jul 7 00:36:09.405834 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jul 7 00:36:09.406842 systemd[1]: Stopped target network.target - Network.
Jul 7 00:36:09.407882 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jul 7 00:36:09.407936 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 7 00:36:09.409006 systemd[1]: Stopped target paths.target - Path Units.
Jul 7 00:36:09.410028 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jul 7 00:36:09.411573 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 7 00:36:09.412351 systemd[1]: Stopped target slices.target - Slice Units.
Jul 7 00:36:09.413425 systemd[1]: Stopped target sockets.target - Socket Units.
Jul 7 00:36:09.414605 systemd[1]: iscsid.socket: Deactivated successfully.
Jul 7 00:36:09.414647 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jul 7 00:36:09.415613 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jul 7 00:36:09.415652 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 7 00:36:09.416804 systemd[1]: ignition-setup.service: Deactivated successfully.
Jul 7 00:36:09.416848 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jul 7 00:36:09.418045 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jul 7 00:36:09.418094 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jul 7 00:36:09.419113 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jul 7 00:36:09.419164 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jul 7 00:36:09.420580 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jul 7 00:36:09.421767 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jul 7 00:36:09.424938 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jul 7 00:36:09.425061 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jul 7 00:36:09.428362 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Jul 7 00:36:09.428746 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jul 7 00:36:09.428794 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 7 00:36:09.430557 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jul 7 00:36:09.435173 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jul 7 00:36:09.435318 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jul 7 00:36:09.437669 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Jul 7 00:36:09.437775 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jul 7 00:36:09.438567 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jul 7 00:36:09.438595 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jul 7 00:36:09.440357 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jul 7 00:36:09.442061 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jul 7 00:36:09.442108 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 7 00:36:09.443319 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jul 7 00:36:09.443360 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jul 7 00:36:09.445851 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jul 7 00:36:09.445896 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jul 7 00:36:09.447273 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 7 00:36:09.452262 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jul 7 00:36:09.458366 systemd[1]: network-cleanup.service: Deactivated successfully.
Jul 7 00:36:09.458470 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jul 7 00:36:09.460174 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jul 7 00:36:09.460363 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 7 00:36:09.461623 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jul 7 00:36:09.461673 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jul 7 00:36:09.462608 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jul 7 00:36:09.462638 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 7 00:36:09.463850 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jul 7 00:36:09.463901 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jul 7 00:36:09.465360 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jul 7 00:36:09.465409 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jul 7 00:36:09.466523 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jul 7 00:36:09.466567 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 7 00:36:09.469580 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jul 7 00:36:09.470422 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jul 7 00:36:09.470475 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jul 7 00:36:09.473232 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jul 7 00:36:09.473286 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 7 00:36:09.474407 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jul 7 00:36:09.474455 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 7 00:36:09.475572 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jul 7 00:36:09.475612 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 7 00:36:09.476470 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 7 00:36:09.476536 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 7 00:36:09.482947 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jul 7 00:36:09.483046 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jul 7 00:36:09.484184 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jul 7 00:36:09.485750 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jul 7 00:36:09.503639 systemd[1]: Switching root.
Jul 7 00:36:09.530032 systemd-journald[215]: Journal stopped
Jul 7 00:36:10.423959 systemd-journald[215]: Received SIGTERM from PID 1 (systemd).
Jul 7 00:36:10.424003 kernel: SELinux: policy capability network_peer_controls=1
Jul 7 00:36:10.424014 kernel: SELinux: policy capability open_perms=1
Jul 7 00:36:10.424021 kernel: SELinux: policy capability extended_socket_class=1
Jul 7 00:36:10.424031 kernel: SELinux: policy capability always_check_network=0
Jul 7 00:36:10.424041 kernel: SELinux: policy capability cgroup_seclabel=1
Jul 7 00:36:10.424049 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jul 7 00:36:10.424057 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jul 7 00:36:10.424064 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jul 7 00:36:10.424072 kernel: SELinux: policy capability userspace_initial_context=0
Jul 7 00:36:10.424079 kernel: audit: type=1403 audit(1751848569.660:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jul 7 00:36:10.424088 systemd[1]: Successfully loaded SELinux policy in 41.722ms.
Jul 7 00:36:10.424101 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 13.742ms.
Jul 7 00:36:10.424114 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 7 00:36:10.424122 systemd[1]: Detected virtualization kvm.
Jul 7 00:36:10.424130 systemd[1]: Detected architecture x86-64.
Jul 7 00:36:10.424138 systemd[1]: Detected first boot.
Jul 7 00:36:10.424146 systemd[1]: Hostname set to <ci-4344-1-1-6-69f6cda1f4>.
Jul 7 00:36:10.424156 systemd[1]: Initializing machine ID from VM UUID.
Jul 7 00:36:10.424164 zram_generator::config[1130]: No configuration found.
Jul 7 00:36:10.424174 kernel: Guest personality initialized and is inactive
Jul 7 00:36:10.424182 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jul 7 00:36:10.424202 kernel: Initialized host personality
Jul 7 00:36:10.424211 kernel: NET: Registered PF_VSOCK protocol family
Jul 7 00:36:10.424218 systemd[1]: Populated /etc with preset unit settings.
Jul 7 00:36:10.424227 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Jul 7 00:36:10.424236 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jul 7 00:36:10.424244 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jul 7 00:36:10.424252 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jul 7 00:36:10.424261 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jul 7 00:36:10.424270 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jul 7 00:36:10.424278 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jul 7 00:36:10.424286 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jul 7 00:36:10.424296 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jul 7 00:36:10.424312 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jul 7 00:36:10.424330 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jul 7 00:36:10.424346 systemd[1]: Created slice user.slice - User and Session Slice.
Jul 7 00:36:10.424357 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 7 00:36:10.424366 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 7 00:36:10.424379 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jul 7 00:36:10.424390 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jul 7 00:36:10.424399 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jul 7 00:36:10.424408 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 7 00:36:10.424419 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jul 7 00:36:10.424427 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 7 00:36:10.424435 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 7 00:36:10.424444 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jul 7 00:36:10.424452 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jul 7 00:36:10.424460 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jul 7 00:36:10.424469 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jul 7 00:36:10.424477 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 7 00:36:10.425529 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 7 00:36:10.425547 systemd[1]: Reached target slices.target - Slice Units.
Jul 7 00:36:10.425556 systemd[1]: Reached target swap.target - Swaps.
Jul 7 00:36:10.425565 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jul 7 00:36:10.425573 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jul 7 00:36:10.425582 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jul 7 00:36:10.425590 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 7 00:36:10.425598 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 7 00:36:10.425607 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 7 00:36:10.425618 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jul 7 00:36:10.425627 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jul 7 00:36:10.425635 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jul 7 00:36:10.425644 systemd[1]: Mounting media.mount - External Media Directory...
Jul 7 00:36:10.425652 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:10.425660 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jul 7 00:36:10.425668 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jul 7 00:36:10.425676 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jul 7 00:36:10.425685 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jul 7 00:36:10.425695 systemd[1]: Reached target machines.target - Containers.
Jul 7 00:36:10.425703 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jul 7 00:36:10.425711 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 7 00:36:10.425722 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 7 00:36:10.425730 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jul 7 00:36:10.425738 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 7 00:36:10.425747 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 7 00:36:10.425755 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 7 00:36:10.425764 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jul 7 00:36:10.425773 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 7 00:36:10.425783 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jul 7 00:36:10.425791 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jul 7 00:36:10.425799 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jul 7 00:36:10.425808 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jul 7 00:36:10.425817 systemd[1]: Stopped systemd-fsck-usr.service.
Jul 7 00:36:10.425825 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 7 00:36:10.425835 kernel: loop: module loaded
Jul 7 00:36:10.425843 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 7 00:36:10.425852 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 7 00:36:10.425860 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 7 00:36:10.425868 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jul 7 00:36:10.425877 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jul 7 00:36:10.425886 kernel: fuse: init (API version 7.41)
Jul 7 00:36:10.425894 kernel: ACPI: bus type drm_connector registered
Jul 7 00:36:10.425901 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 7 00:36:10.425911 systemd[1]: verity-setup.service: Deactivated successfully.
Jul 7 00:36:10.425919 systemd[1]: Stopped verity-setup.service.
Jul 7 00:36:10.425929 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:10.425937 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jul 7 00:36:10.425945 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jul 7 00:36:10.425954 systemd[1]: Mounted media.mount - External Media Directory.
Jul 7 00:36:10.425962 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jul 7 00:36:10.425987 systemd-journald[1214]: Collecting audit messages is disabled.
Jul 7 00:36:10.426008 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jul 7 00:36:10.426018 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jul 7 00:36:10.426027 systemd-journald[1214]: Journal started
Jul 7 00:36:10.426045 systemd-journald[1214]: Runtime Journal (/run/log/journal/87096b9c841947a2875ffb2bed6c9bb2) is 4.8M, max 38.6M, 33.7M free.
Jul 7 00:36:10.146671 systemd[1]: Queued start job for default target multi-user.target.
Jul 7 00:36:10.159177 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jul 7 00:36:10.159653 systemd[1]: systemd-journald.service: Deactivated successfully.
Jul 7 00:36:10.429512 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 7 00:36:10.430438 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jul 7 00:36:10.431382 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 7 00:36:10.432309 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jul 7 00:36:10.432591 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jul 7 00:36:10.433582 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 7 00:36:10.433844 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 7 00:36:10.434741 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 7 00:36:10.434977 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 7 00:36:10.435918 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 7 00:36:10.436070 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 7 00:36:10.437016 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jul 7 00:36:10.437286 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jul 7 00:36:10.438211 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 7 00:36:10.438595 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 7 00:36:10.439526 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 7 00:36:10.440375 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 7 00:36:10.441411 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jul 7 00:36:10.442381 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jul 7 00:36:10.452060 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 7 00:36:10.455561 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jul 7 00:36:10.457558 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jul 7 00:36:10.458581 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jul 7 00:36:10.458685 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 7 00:36:10.461953 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jul 7 00:36:10.470587 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jul 7 00:36:10.471234 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 7 00:36:10.474457 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jul 7 00:36:10.477637 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jul 7 00:36:10.479162 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 7 00:36:10.480954 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jul 7 00:36:10.482262 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 7 00:36:10.484161 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 7 00:36:10.486041 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jul 7 00:36:10.493571 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jul 7 00:36:10.497517 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 7 00:36:10.499028 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jul 7 00:36:10.501622 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jul 7 00:36:10.518085 kernel: loop0: detected capacity change from 0 to 8
Jul 7 00:36:10.518180 systemd-journald[1214]: Time spent on flushing to /var/log/journal/87096b9c841947a2875ffb2bed6c9bb2 is 25.365ms for 1162 entries.
Jul 7 00:36:10.518180 systemd-journald[1214]: System Journal (/var/log/journal/87096b9c841947a2875ffb2bed6c9bb2) is 8M, max 584.8M, 576.8M free.
Jul 7 00:36:10.552962 systemd-journald[1214]: Received client request to flush runtime journal.
Jul 7 00:36:10.553074 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jul 7 00:36:10.518743 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jul 7 00:36:10.521920 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jul 7 00:36:10.524365 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jul 7 00:36:10.542221 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 7 00:36:10.554528 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jul 7 00:36:10.557961 systemd-tmpfiles[1256]: ACLs are not supported, ignoring.
Jul 7 00:36:10.557975 systemd-tmpfiles[1256]: ACLs are not supported, ignoring.
Jul 7 00:36:10.560516 kernel: loop1: detected capacity change from 0 to 113872
Jul 7 00:36:10.563585 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 7 00:36:10.569255 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jul 7 00:36:10.572623 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jul 7 00:36:10.594526 kernel: loop2: detected capacity change from 0 to 146240
Jul 7 00:36:10.605296 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jul 7 00:36:10.607920 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 7 00:36:10.623614 kernel: loop3: detected capacity change from 0 to 224512
Jul 7 00:36:10.627276 systemd-tmpfiles[1278]: ACLs are not supported, ignoring.
Jul 7 00:36:10.627562 systemd-tmpfiles[1278]: ACLs are not supported, ignoring.
Jul 7 00:36:10.632118 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 7 00:36:10.667529 kernel: loop4: detected capacity change from 0 to 8
Jul 7 00:36:10.671685 kernel: loop5: detected capacity change from 0 to 113872
Jul 7 00:36:10.686537 kernel: loop6: detected capacity change from 0 to 146240
Jul 7 00:36:10.705540 kernel: loop7: detected capacity change from 0 to 224512
Jul 7 00:36:10.723547 (sd-merge)[1283]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Jul 7 00:36:10.724581 (sd-merge)[1283]: Merged extensions into '/usr'.
Jul 7 00:36:10.732149 systemd[1]: Reload requested from client PID 1255 ('systemd-sysext') (unit systemd-sysext.service)...
Jul 7 00:36:10.732276 systemd[1]: Reloading...
Jul 7 00:36:10.798531 zram_generator::config[1305]: No configuration found.
Jul 7 00:36:10.937515 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 7 00:36:10.986968 ldconfig[1250]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jul 7 00:36:11.025204 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jul 7 00:36:11.025440 systemd[1]: Reloading finished in 292 ms.
Jul 7 00:36:11.037315 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jul 7 00:36:11.038531 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jul 7 00:36:11.049580 systemd[1]: Starting ensure-sysext.service...
Jul 7 00:36:11.052638 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 7 00:36:11.076569 systemd[1]: Reload requested from client PID 1352 ('systemctl') (unit ensure-sysext.service)...
Jul 7 00:36:11.076582 systemd[1]: Reloading...
Jul 7 00:36:11.083171 systemd-tmpfiles[1353]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jul 7 00:36:11.083411 systemd-tmpfiles[1353]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jul 7 00:36:11.083664 systemd-tmpfiles[1353]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jul 7 00:36:11.083888 systemd-tmpfiles[1353]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jul 7 00:36:11.084467 systemd-tmpfiles[1353]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jul 7 00:36:11.085548 systemd-tmpfiles[1353]: ACLs are not supported, ignoring.
Jul 7 00:36:11.085639 systemd-tmpfiles[1353]: ACLs are not supported, ignoring.
Jul 7 00:36:11.087917 systemd-tmpfiles[1353]: Detected autofs mount point /boot during canonicalization of boot.
Jul 7 00:36:11.088263 systemd-tmpfiles[1353]: Skipping /boot
Jul 7 00:36:11.096641 systemd-tmpfiles[1353]: Detected autofs mount point /boot during canonicalization of boot.
Jul 7 00:36:11.096649 systemd-tmpfiles[1353]: Skipping /boot
Jul 7 00:36:11.134529 zram_generator::config[1380]: No configuration found.
Jul 7 00:36:11.214254 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 7 00:36:11.293268 systemd[1]: Reloading finished in 216 ms.
Jul 7 00:36:11.303531 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jul 7 00:36:11.307820 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 7 00:36:11.315605 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jul 7 00:36:11.318539 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jul 7 00:36:11.323501 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jul 7 00:36:11.328701 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 7 00:36:11.331744 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 7 00:36:11.334412 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jul 7 00:36:11.347732 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jul 7 00:36:11.351405 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:11.352623 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 7 00:36:11.358582 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 7 00:36:11.369979 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 7 00:36:11.375613 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 7 00:36:11.376314 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 7 00:36:11.376401 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 7 00:36:11.376475 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:11.383438 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:11.384236 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 7 00:36:11.384600 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 7 00:36:11.385008 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 7 00:36:11.385117 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:11.388590 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jul 7 00:36:11.392473 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jul 7 00:36:11.393471 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 7 00:36:11.399573 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 7 00:36:11.400442 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 7 00:36:11.400574 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 7 00:36:11.408878 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 7 00:36:11.409012 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 7 00:36:11.410055 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:11.410278 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 7 00:36:11.412162 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 7 00:36:11.415891 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 7 00:36:11.417317 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 7 00:36:11.418571 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 7 00:36:11.418660 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 7 00:36:11.420638 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jul 7 00:36:11.421471 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 7 00:36:11.421847 systemd-udevd[1429]: Using default interface naming scheme 'v255'.
Jul 7 00:36:11.422558 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jul 7 00:36:11.425804 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 7 00:36:11.429531 systemd[1]: Finished ensure-sysext.service.
Jul 7 00:36:11.435693 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jul 7 00:36:11.439723 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 7 00:36:11.441907 augenrules[1468]: No rules
Jul 7 00:36:11.444818 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 7 00:36:11.445636 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jul 7 00:36:11.447261 systemd[1]: audit-rules.service: Deactivated successfully.
Jul 7 00:36:11.447462 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jul 7 00:36:11.449752 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 7 00:36:11.450000 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 7 00:36:11.451157 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 7 00:36:11.451425 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 7 00:36:11.455555 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 7 00:36:11.455613 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 7 00:36:11.461016 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jul 7 00:36:11.472273 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 7 00:36:11.475039 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 7 00:36:11.562964 systemd-resolved[1428]: Positive Trust Anchors:
Jul 7 00:36:11.562984 systemd-resolved[1428]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 7 00:36:11.563020 systemd-resolved[1428]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 7 00:36:11.568142 systemd-resolved[1428]: Using system hostname 'ci-4344-1-1-6-69f6cda1f4'.
Jul 7 00:36:11.571216 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 7 00:36:11.571973 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 7 00:36:11.586131 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jul 7 00:36:11.587438 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jul 7 00:36:11.588051 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 7 00:36:11.589000 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jul 7 00:36:11.590079 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jul 7 00:36:11.591548 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Jul 7 00:36:11.596742 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jul 7 00:36:11.597320 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jul 7 00:36:11.597357 systemd[1]: Reached target paths.target - Path Units.
Jul 7 00:36:11.597776 systemd[1]: Reached target time-set.target - System Time Set.
Jul 7 00:36:11.598324 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jul 7 00:36:11.598872 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jul 7 00:36:11.599533 systemd[1]: Reached target timers.target - Timer Units.
Jul 7 00:36:11.601783 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jul 7 00:36:11.604974 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jul 7 00:36:11.607973 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jul 7 00:36:11.608618 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jul 7 00:36:11.609552 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jul 7 00:36:11.611231 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jul 7 00:36:11.611961 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jul 7 00:36:11.613131 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jul 7 00:36:11.614002 systemd[1]: Reached target sockets.target - Socket Units.
Jul 7 00:36:11.614587 systemd[1]: Reached target basic.target - Basic System.
Jul 7 00:36:11.615013 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jul 7 00:36:11.615032 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jul 7 00:36:11.616867 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jul 7 00:36:11.623588 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jul 7 00:36:11.625623 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jul 7 00:36:11.627597 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jul 7 00:36:11.635029 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jul 7 00:36:11.635668 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jul 7 00:36:11.636806 systemd-networkd[1487]: lo: Link UP
Jul 7 00:36:11.636984 systemd-networkd[1487]: lo: Gained carrier
Jul 7 00:36:11.638624 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Jul 7 00:36:11.639692 systemd-networkd[1487]: Enumeration completed
Jul 7 00:36:11.643806 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jul 7 00:36:11.645889 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jul 7 00:36:11.650873 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jul 7 00:36:11.653561 jq[1521]: false
Jul 7 00:36:11.653918 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jul 7 00:36:11.657026 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Jul 7 00:36:11.657205 oslogin_cache_refresh[1523]: Refreshing passwd entry cache
Jul 7 00:36:11.660983 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting users, quitting
Jul 7 00:36:11.661028 oslogin_cache_refresh[1523]: Failure getting users, quitting
Jul 7 00:36:11.661083 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Jul 7 00:36:11.661114 oslogin_cache_refresh[1523]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Jul 7 00:36:11.661193 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Refreshing group entry cache
Jul 7 00:36:11.661507 oslogin_cache_refresh[1523]: Refreshing group entry cache
Jul 7 00:36:11.661904 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Failure getting groups, quitting
Jul 7 00:36:11.662510 oslogin_cache_refresh[1523]: Failure getting groups, quitting
Jul 7 00:36:11.662578 google_oslogin_nss_cache[1523]: oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Jul 7 00:36:11.662606 oslogin_cache_refresh[1523]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Jul 7 00:36:11.667933 systemd[1]: Starting systemd-logind.service - User Login Management...
Jul 7 00:36:11.669048 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jul 7 00:36:11.669435 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jul 7 00:36:11.671009 systemd[1]: Starting update-engine.service - Update Engine...
Jul 7 00:36:11.675952 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jul 7 00:36:11.678312 extend-filesystems[1522]: Found /dev/sda6
Jul 7 00:36:11.677884 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 7 00:36:11.680575 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jul 7 00:36:11.681223 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jul 7 00:36:11.681356 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jul 7 00:36:11.681580 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Jul 7 00:36:11.681698 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Jul 7 00:36:11.685467 extend-filesystems[1522]: Found /dev/sda9
Jul 7 00:36:11.687546 extend-filesystems[1522]: Checking size of /dev/sda9
Jul 7 00:36:11.686564 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jul 7 00:36:11.686702 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jul 7 00:36:11.690904 systemd[1]: Reached target network.target - Network.
Jul 7 00:36:11.697068 coreos-metadata[1517]: Jul 07 00:36:11.692 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Jul 7 00:36:11.697068 coreos-metadata[1517]: Jul 07 00:36:11.693 INFO Failed to fetch: error sending request for url (http://169.254.169.254/hetzner/v1/metadata)
Jul 7 00:36:11.694235 systemd[1]: Starting containerd.service - containerd container runtime...
Jul 7 00:36:11.698839 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Jul 7 00:36:11.703831 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jul 7 00:36:11.709903 extend-filesystems[1522]: Resized partition /dev/sda9
Jul 7 00:36:11.715797 extend-filesystems[1563]: resize2fs 1.47.2 (1-Jan-2025)
Jul 7 00:36:11.719932 jq[1537]: true
Jul 7 00:36:11.726546 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Jul 7 00:36:11.734455 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jul 7 00:36:11.734268 dbus-daemon[1518]: [system] SELinux support is enabled
Jul 7 00:36:11.740234 update_engine[1536]: I20250707 00:36:11.738737 1536 main.cc:92] Flatcar Update Engine starting
Jul 7 00:36:11.739073 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jul 7 00:36:11.739093 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jul 7 00:36:11.739635 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jul 7 00:36:11.739648 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jul 7 00:36:11.744956 tar[1539]: linux-amd64/LICENSE
Jul 7 00:36:11.746229 tar[1539]: linux-amd64/helm
Jul 7 00:36:11.750106 systemd[1]: motdgen.service: Deactivated successfully.
Jul 7 00:36:11.750325 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jul 7 00:36:11.758214 update_engine[1536]: I20250707 00:36:11.758180 1536 update_check_scheduler.cc:74] Next update check in 3m36s
Jul 7 00:36:11.758460 systemd[1]: Started update-engine.service - Update Engine.
Jul 7 00:36:11.759572 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Jul 7 00:36:11.764056 jq[1565]: true
Jul 7 00:36:11.763734 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jul 7 00:36:11.770722 (ntainerd)[1574]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 7 00:36:11.835568 systemd-networkd[1487]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:36:11.835576 systemd-networkd[1487]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 7 00:36:11.846460 systemd-networkd[1487]: eth1: Link UP Jul 7 00:36:11.848189 systemd-networkd[1487]: eth1: Gained carrier Jul 7 00:36:11.848210 systemd-networkd[1487]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:36:11.859064 systemd-networkd[1487]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:36:11.859067 systemd-networkd[1487]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 7 00:36:11.867266 systemd-networkd[1487]: eth0: Link UP Jul 7 00:36:11.869142 systemd-networkd[1487]: eth0: Gained carrier Jul 7 00:36:11.869161 systemd-networkd[1487]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 7 00:36:11.889900 bash[1595]: Updated "/home/core/.ssh/authorized_keys" Jul 7 00:36:11.890855 systemd-networkd[1487]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 7 00:36:11.899403 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jul 7 00:36:11.927663 kernel: mousedev: PS/2 mouse device common for all mice Jul 7 00:36:11.893964 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 7 00:36:11.898250 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Jul 7 00:36:11.901302 systemd[1]: Starting sshkeys.service... 
Jul 7 00:36:11.934648 extend-filesystems[1563]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jul 7 00:36:11.934648 extend-filesystems[1563]: old_desc_blocks = 1, new_desc_blocks = 5 Jul 7 00:36:11.934648 extend-filesystems[1563]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jul 7 00:36:11.944412 extend-filesystems[1522]: Resized filesystem in /dev/sda9 Jul 7 00:36:11.935266 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 7 00:36:11.935540 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 7 00:36:11.936555 systemd-networkd[1487]: eth0: DHCPv4 address 65.108.89.120/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 7 00:36:11.937426 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Jul 7 00:36:11.950284 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 7 00:36:11.952832 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jul 7 00:36:11.965431 systemd-logind[1531]: New seat seat0. Jul 7 00:36:11.965888 systemd[1]: Started systemd-logind.service - User Login Management. Jul 7 00:36:12.003786 coreos-metadata[1606]: Jul 07 00:36:12.002 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jul 7 00:36:12.003994 coreos-metadata[1606]: Jul 07 00:36:12.003 INFO Fetch successful Jul 7 00:36:12.005824 unknown[1606]: wrote ssh authorized keys file for user: core Jul 7 00:36:12.027292 locksmithd[1578]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 7 00:36:12.044363 update-ssh-keys[1613]: Updated "/home/core/.ssh/authorized_keys" Jul 7 00:36:12.044602 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 7 00:36:12.050753 systemd[1]: Finished sshkeys.service. 
Jul 7 00:36:12.084621 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jul 7 00:36:12.086617 containerd[1574]: time="2025-07-07T00:36:12Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 7 00:36:12.086617 containerd[1574]: time="2025-07-07T00:36:12.086378410Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Jul 7 00:36:12.088480 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102095327Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.089µs" Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102122037Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102137876Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102270295Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102285263Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102303277Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102355585Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" 
id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 7 00:36:12.102418 containerd[1574]: time="2025-07-07T00:36:12.102367667Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103233 containerd[1574]: time="2025-07-07T00:36:12.102909664Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103233 containerd[1574]: time="2025-07-07T00:36:12.102927147Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103233 containerd[1574]: time="2025-07-07T00:36:12.102938238Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103233 containerd[1574]: time="2025-07-07T00:36:12.102944700Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103233 containerd[1574]: time="2025-07-07T00:36:12.103007778Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103233 containerd[1574]: time="2025-07-07T00:36:12.103195971Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103233 containerd[1574]: time="2025-07-07T00:36:12.103227630Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 7 00:36:12.103375 containerd[1574]: time="2025-07-07T00:36:12.103235876Z" level=info msg="loading plugin" 
id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 7 00:36:12.103560 containerd[1574]: time="2025-07-07T00:36:12.103540367Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 7 00:36:12.104705 containerd[1574]: time="2025-07-07T00:36:12.104684673Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 7 00:36:12.104759 containerd[1574]: time="2025-07-07T00:36:12.104743774Z" level=info msg="metadata content store policy set" policy=shared Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109474627Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109555559Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109575997Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109587929Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109634176Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109652551Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109672949Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109692155Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service 
type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109706242Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109719526Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109731900Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109747589Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109844471Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 7 00:36:12.110080 containerd[1574]: time="2025-07-07T00:36:12.109870209Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109890798Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109900515Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109908581Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109922066Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109936554Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: 
time="2025-07-07T00:36:12.109949758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109963193Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109976308Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.109988421Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.110286129Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.110315424Z" level=info msg="Start snapshots syncer" Jul 7 00:36:12.110345 containerd[1574]: time="2025-07-07T00:36:12.110340481Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 7 00:36:12.115960 containerd[1574]: time="2025-07-07T00:36:12.115667141Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 7 00:36:12.115960 containerd[1574]: time="2025-07-07T00:36:12.115720842Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115791995Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115871454Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115894207Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115907031Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115917811Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115927229Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115939472Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115950893Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115972894Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115984506Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 7 00:36:12.116073 containerd[1574]: time="2025-07-07T00:36:12.115995246Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118794687Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118819754Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118831536Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118841815Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118850922Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118864678Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118876250Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118889534Z" level=info msg="runtime interface created" Jul 7 00:36:12.118896 containerd[1574]: time="2025-07-07T00:36:12.118893893Z" level=info msg="created NRI interface" Jul 7 00:36:12.119039 containerd[1574]: time="2025-07-07T00:36:12.118903400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 7 00:36:12.119039 containerd[1574]: time="2025-07-07T00:36:12.118913159Z" level=info msg="Connect containerd service" Jul 7 00:36:12.119039 containerd[1574]: time="2025-07-07T00:36:12.118942374Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 7 00:36:12.122505 kernel: input: Power 
Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 Jul 7 00:36:12.124540 containerd[1574]: time="2025-07-07T00:36:12.119479822Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 7 00:36:12.183702 sshd_keygen[1566]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 7 00:36:12.213773 kernel: ACPI: button: Power Button [PWRF] Jul 7 00:36:12.224536 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jul 7 00:36:12.224729 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jul 7 00:36:12.232047 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 7 00:36:12.236681 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 7 00:36:12.241548 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 Jul 7 00:36:12.241585 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console Jul 7 00:36:12.249841 kernel: Console: switching to colour dummy device 80x25 Jul 7 00:36:12.251252 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jul 7 00:36:12.251285 kernel: [drm] features: -context_init Jul 7 00:36:12.253828 kernel: [drm] number of scanouts: 1 Jul 7 00:36:12.253856 kernel: [drm] number of cap sets: 0 Jul 7 00:36:12.258923 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jul 7 00:36:12.280011 systemd[1]: issuegen.service: Deactivated successfully. Jul 7 00:36:12.280265 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 7 00:36:12.282074 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 7 00:36:12.304059 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
Jul 7 00:36:12.313809 containerd[1574]: time="2025-07-07T00:36:12.312690280Z" level=info msg="Start subscribing containerd event" Jul 7 00:36:12.313809 containerd[1574]: time="2025-07-07T00:36:12.312741366Z" level=info msg="Start recovering state" Jul 7 00:36:12.313809 containerd[1574]: time="2025-07-07T00:36:12.313104607Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 7 00:36:12.313809 containerd[1574]: time="2025-07-07T00:36:12.313157657Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 7 00:36:12.315479 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318544369Z" level=info msg="Start event monitor" Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318565399Z" level=info msg="Start cni network conf syncer for default" Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318576169Z" level=info msg="Start streaming server" Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318593862Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318603380Z" level=info msg="runtime interface starting up..." Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318610924Z" level=info msg="starting plugins..." Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318625661Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 7 00:36:12.318988 containerd[1574]: time="2025-07-07T00:36:12.318743392Z" level=info msg="containerd successfully booted in 0.233819s" Jul 7 00:36:12.320616 systemd[1]: Started containerd.service - containerd container runtime. 
Jul 7 00:36:12.335697 systemd-logind[1531]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jul 7 00:36:12.345852 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:36:12.365901 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 7 00:36:12.368596 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 7 00:36:12.374763 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jul 7 00:36:12.374977 systemd[1]: Reached target getty.target - Login Prompts. Jul 7 00:36:12.401531 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 7 00:36:12.412853 systemd-logind[1531]: Watching system buttons on /dev/input/event3 (Power Button) Jul 7 00:36:12.427569 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:36:12.443050 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 7 00:36:12.443272 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 7 00:36:12.443842 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:36:12.447541 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 7 00:36:12.449191 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 7 00:36:12.460516 kernel: EDAC MC: Ver: 3.0.0 Jul 7 00:36:12.488475 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jul 7 00:36:12.685338 tar[1539]: linux-amd64/README.md Jul 7 00:36:12.693257 coreos-metadata[1517]: Jul 07 00:36:12.693 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #2 Jul 7 00:36:12.693927 coreos-metadata[1517]: Jul 07 00:36:12.693 INFO Fetch successful Jul 7 00:36:12.694175 coreos-metadata[1517]: Jul 07 00:36:12.694 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jul 7 00:36:12.694509 coreos-metadata[1517]: Jul 07 00:36:12.694 INFO Fetch successful Jul 7 00:36:12.697803 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 7 00:36:12.730064 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jul 7 00:36:12.730600 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 7 00:36:13.695692 systemd-networkd[1487]: eth0: Gained IPv6LL Jul 7 00:36:13.696224 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Jul 7 00:36:13.697597 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 7 00:36:13.698637 systemd[1]: Reached target network-online.target - Network is Online. Jul 7 00:36:13.700761 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:36:13.702645 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 7 00:36:13.722385 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 7 00:36:13.759733 systemd-networkd[1487]: eth1: Gained IPv6LL Jul 7 00:36:13.760345 systemd-timesyncd[1464]: Network configuration changed, trying to establish connection. Jul 7 00:36:14.499409 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:36:14.499844 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 7 00:36:14.500983 systemd[1]: Startup finished in 2.978s (kernel) + 7.020s (initrd) + 4.881s (userspace) = 14.879s. 
Jul 7 00:36:14.502939 (kubelet)[1714]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:36:15.010505 kubelet[1714]: E0707 00:36:15.010402 1714 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:36:15.013507 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:36:15.013769 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:36:15.014464 systemd[1]: kubelet.service: Consumed 834ms CPU time, 263.1M memory peak. Jul 7 00:36:25.264269 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 7 00:36:25.266379 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:36:25.371312 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:36:25.383741 (kubelet)[1733]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:36:25.419785 kubelet[1733]: E0707 00:36:25.419728 1733 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:36:25.422770 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:36:25.422943 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:36:25.423284 systemd[1]: kubelet.service: Consumed 117ms CPU time, 109M memory peak. 
Jul 7 00:36:35.673610 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 7 00:36:35.675714 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:36:35.779401 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:36:35.791742 (kubelet)[1748]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:36:35.827696 kubelet[1748]: E0707 00:36:35.827634 1748 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:36:35.830139 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:36:35.830316 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:36:35.830693 systemd[1]: kubelet.service: Consumed 122ms CPU time, 110.8M memory peak. Jul 7 00:36:44.647809 systemd-timesyncd[1464]: Contacted time server 217.160.19.219:123 (2.flatcar.pool.ntp.org). Jul 7 00:36:44.647888 systemd-timesyncd[1464]: Initial clock synchronization to Mon 2025-07-07 00:36:44.647570 UTC. Jul 7 00:36:44.648461 systemd-resolved[1428]: Clock change detected. Flushing caches. Jul 7 00:36:46.591320 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jul 7 00:36:46.592760 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:36:46.703389 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:36:46.705760 (kubelet)[1763]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:36:46.735585 kubelet[1763]: E0707 00:36:46.735539 1763 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:36:46.737622 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:36:46.737758 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:36:46.737997 systemd[1]: kubelet.service: Consumed 108ms CPU time, 110.4M memory peak. Jul 7 00:36:56.841384 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jul 7 00:36:56.842827 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:36:56.942504 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:36:56.952947 (kubelet)[1778]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:36:56.990060 kubelet[1778]: E0707 00:36:56.990006 1778 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:36:56.991945 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:36:56.992073 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:36:56.992338 systemd[1]: kubelet.service: Consumed 113ms CPU time, 109.4M memory peak. 
Jul 7 00:36:57.229221 update_engine[1536]: I20250707 00:36:57.229089 1536 update_attempter.cc:509] Updating boot flags... Jul 7 00:37:07.091289 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jul 7 00:37:07.093034 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:37:07.189467 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:37:07.200903 (kubelet)[1817]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:37:07.232430 kubelet[1817]: E0707 00:37:07.232377 1817 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:37:07.234408 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:37:07.234522 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:37:07.234802 systemd[1]: kubelet.service: Consumed 107ms CPU time, 110M memory peak. Jul 7 00:37:17.341533 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Jul 7 00:37:17.343749 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:37:17.448367 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:37:17.457944 (kubelet)[1832]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:37:17.489466 kubelet[1832]: E0707 00:37:17.489404 1832 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:37:17.490763 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:37:17.490930 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:37:17.491234 systemd[1]: kubelet.service: Consumed 111ms CPU time, 110.5M memory peak. Jul 7 00:37:27.591391 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Jul 7 00:37:27.593024 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:37:27.709030 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:37:27.712996 (kubelet)[1846]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:37:27.748563 kubelet[1846]: E0707 00:37:27.748508 1846 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:37:27.750363 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:37:27.750544 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:37:27.750926 systemd[1]: kubelet.service: Consumed 129ms CPU time, 108.2M memory peak. 
Jul 7 00:37:37.841291 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Jul 7 00:37:37.842678 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:37:37.950581 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:37:37.962892 (kubelet)[1861]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:37:37.991650 kubelet[1861]: E0707 00:37:37.991601 1861 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:37:37.993596 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:37:37.993753 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:37:37.993999 systemd[1]: kubelet.service: Consumed 108ms CPU time, 108.7M memory peak. Jul 7 00:37:48.091455 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. Jul 7 00:37:48.094207 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:37:48.227331 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:37:48.236898 (kubelet)[1876]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:37:48.286437 kubelet[1876]: E0707 00:37:48.286324 1876 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:37:48.287871 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:37:48.288059 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:37:48.288605 systemd[1]: kubelet.service: Consumed 134ms CPU time, 111.5M memory peak. Jul 7 00:37:58.341920 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. Jul 7 00:37:58.344306 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:37:58.458365 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:37:58.480180 (kubelet)[1891]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:37:58.535153 kubelet[1891]: E0707 00:37:58.535073 1891 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:37:58.536681 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:37:58.536829 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:37:58.537250 systemd[1]: kubelet.service: Consumed 144ms CPU time, 110M memory peak. 
Jul 7 00:38:06.419566 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 7 00:38:06.420838 systemd[1]: Started sshd@0-65.108.89.120:22-152.32.189.21:47838.service - OpenSSH per-connection server daemon (152.32.189.21:47838). Jul 7 00:38:07.175795 systemd[1]: Started sshd@1-65.108.89.120:22-147.75.109.163:45604.service - OpenSSH per-connection server daemon (147.75.109.163:45604). Jul 7 00:38:07.504226 sshd[1899]: Received disconnect from 152.32.189.21 port 47838:11: Bye Bye [preauth] Jul 7 00:38:07.504226 sshd[1899]: Disconnected from authenticating user root 152.32.189.21 port 47838 [preauth] Jul 7 00:38:07.506071 systemd[1]: sshd@0-65.108.89.120:22-152.32.189.21:47838.service: Deactivated successfully. Jul 7 00:38:08.205000 sshd[1902]: Accepted publickey for core from 147.75.109.163 port 45604 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:38:08.207237 sshd-session[1902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:38:08.217568 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 7 00:38:08.219378 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 7 00:38:08.232304 systemd-logind[1531]: New session 1 of user core. Jul 7 00:38:08.249511 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 7 00:38:08.253644 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 7 00:38:08.271947 (systemd)[1908]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 7 00:38:08.274449 systemd-logind[1531]: New session c1 of user core. Jul 7 00:38:08.404748 systemd[1908]: Queued start job for default target default.target. Jul 7 00:38:08.415519 systemd[1908]: Created slice app.slice - User Application Slice. Jul 7 00:38:08.415547 systemd[1908]: Reached target paths.target - Paths. 
Jul 7 00:38:08.415581 systemd[1908]: Reached target timers.target - Timers. Jul 7 00:38:08.416898 systemd[1908]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 7 00:38:08.449723 systemd[1908]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 7 00:38:08.449978 systemd[1908]: Reached target sockets.target - Sockets. Jul 7 00:38:08.450091 systemd[1908]: Reached target basic.target - Basic System. Jul 7 00:38:08.450184 systemd[1908]: Reached target default.target - Main User Target. Jul 7 00:38:08.450268 systemd[1908]: Startup finished in 169ms. Jul 7 00:38:08.450359 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 7 00:38:08.458937 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 7 00:38:08.591877 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. Jul 7 00:38:08.594527 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:08.748350 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:38:08.757961 (kubelet)[1925]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:38:08.789775 kubelet[1925]: E0707 00:38:08.789720 1925 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:38:08.791617 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:38:08.791760 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:38:08.791978 systemd[1]: kubelet.service: Consumed 132ms CPU time, 108.2M memory peak. 
Jul 7 00:38:09.165951 systemd[1]: Started sshd@2-65.108.89.120:22-147.75.109.163:45612.service - OpenSSH per-connection server daemon (147.75.109.163:45612). Jul 7 00:38:10.189217 sshd[1934]: Accepted publickey for core from 147.75.109.163 port 45612 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:38:10.190350 sshd-session[1934]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:38:10.194759 systemd-logind[1531]: New session 2 of user core. Jul 7 00:38:10.200843 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 7 00:38:10.882593 sshd[1936]: Connection closed by 147.75.109.163 port 45612 Jul 7 00:38:10.883140 sshd-session[1934]: pam_unix(sshd:session): session closed for user core Jul 7 00:38:10.885972 systemd[1]: sshd@2-65.108.89.120:22-147.75.109.163:45612.service: Deactivated successfully. Jul 7 00:38:10.887598 systemd[1]: session-2.scope: Deactivated successfully. Jul 7 00:38:10.889042 systemd-logind[1531]: Session 2 logged out. Waiting for processes to exit. Jul 7 00:38:10.890032 systemd-logind[1531]: Removed session 2. Jul 7 00:38:11.059367 systemd[1]: Started sshd@3-65.108.89.120:22-147.75.109.163:45616.service - OpenSSH per-connection server daemon (147.75.109.163:45616). Jul 7 00:38:12.072508 sshd[1942]: Accepted publickey for core from 147.75.109.163 port 45616 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:38:12.073730 sshd-session[1942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:38:12.078434 systemd-logind[1531]: New session 3 of user core. Jul 7 00:38:12.083831 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 7 00:38:12.765851 sshd[1944]: Connection closed by 147.75.109.163 port 45616 Jul 7 00:38:12.766406 sshd-session[1942]: pam_unix(sshd:session): session closed for user core Jul 7 00:38:12.770047 systemd[1]: sshd@3-65.108.89.120:22-147.75.109.163:45616.service: Deactivated successfully. 
Jul 7 00:38:12.772060 systemd[1]: session-3.scope: Deactivated successfully. Jul 7 00:38:12.773176 systemd-logind[1531]: Session 3 logged out. Waiting for processes to exit. Jul 7 00:38:12.774601 systemd-logind[1531]: Removed session 3. Jul 7 00:38:12.940958 systemd[1]: Started sshd@4-65.108.89.120:22-147.75.109.163:45630.service - OpenSSH per-connection server daemon (147.75.109.163:45630). Jul 7 00:38:13.960942 sshd[1950]: Accepted publickey for core from 147.75.109.163 port 45630 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:38:13.962406 sshd-session[1950]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:38:13.966539 systemd-logind[1531]: New session 4 of user core. Jul 7 00:38:13.973840 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 7 00:38:14.665089 sshd[1952]: Connection closed by 147.75.109.163 port 45630 Jul 7 00:38:14.665630 sshd-session[1950]: pam_unix(sshd:session): session closed for user core Jul 7 00:38:14.669542 systemd-logind[1531]: Session 4 logged out. Waiting for processes to exit. Jul 7 00:38:14.669795 systemd[1]: sshd@4-65.108.89.120:22-147.75.109.163:45630.service: Deactivated successfully. Jul 7 00:38:14.671403 systemd[1]: session-4.scope: Deactivated successfully. Jul 7 00:38:14.672818 systemd-logind[1531]: Removed session 4. Jul 7 00:38:14.838089 systemd[1]: Started sshd@5-65.108.89.120:22-147.75.109.163:45646.service - OpenSSH per-connection server daemon (147.75.109.163:45646). Jul 7 00:38:15.881106 sshd[1958]: Accepted publickey for core from 147.75.109.163 port 45646 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:38:15.882363 sshd-session[1958]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:38:15.887347 systemd-logind[1531]: New session 5 of user core. Jul 7 00:38:15.896877 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jul 7 00:38:16.420920 sudo[1961]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 7 00:38:16.421134 sudo[1961]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:38:16.434014 sudo[1961]: pam_unix(sudo:session): session closed for user root Jul 7 00:38:16.596731 sshd[1960]: Connection closed by 147.75.109.163 port 45646 Jul 7 00:38:16.597459 sshd-session[1958]: pam_unix(sshd:session): session closed for user core Jul 7 00:38:16.600915 systemd[1]: sshd@5-65.108.89.120:22-147.75.109.163:45646.service: Deactivated successfully. Jul 7 00:38:16.602623 systemd[1]: session-5.scope: Deactivated successfully. Jul 7 00:38:16.603679 systemd-logind[1531]: Session 5 logged out. Waiting for processes to exit. Jul 7 00:38:16.605216 systemd-logind[1531]: Removed session 5. Jul 7 00:38:16.772257 systemd[1]: Started sshd@6-65.108.89.120:22-147.75.109.163:58920.service - OpenSSH per-connection server daemon (147.75.109.163:58920). Jul 7 00:38:17.789550 sshd[1967]: Accepted publickey for core from 147.75.109.163 port 58920 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:38:17.791007 sshd-session[1967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:38:17.795949 systemd-logind[1531]: New session 6 of user core. Jul 7 00:38:17.805864 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jul 7 00:38:18.328098 sudo[1971]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 7 00:38:18.328503 sudo[1971]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:38:18.334878 sudo[1971]: pam_unix(sudo:session): session closed for user root Jul 7 00:38:18.342587 sudo[1970]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 7 00:38:18.343157 sudo[1970]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:38:18.357783 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 7 00:38:18.408638 augenrules[1993]: No rules Jul 7 00:38:18.409483 systemd[1]: audit-rules.service: Deactivated successfully. Jul 7 00:38:18.409836 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 7 00:38:18.411370 sudo[1970]: pam_unix(sudo:session): session closed for user root Jul 7 00:38:18.575569 sshd[1969]: Connection closed by 147.75.109.163 port 58920 Jul 7 00:38:18.576473 sshd-session[1967]: pam_unix(sshd:session): session closed for user core Jul 7 00:38:18.581437 systemd[1]: sshd@6-65.108.89.120:22-147.75.109.163:58920.service: Deactivated successfully. Jul 7 00:38:18.583937 systemd[1]: session-6.scope: Deactivated successfully. Jul 7 00:38:18.585087 systemd-logind[1531]: Session 6 logged out. Waiting for processes to exit. Jul 7 00:38:18.587198 systemd-logind[1531]: Removed session 6. Jul 7 00:38:18.755206 systemd[1]: Started sshd@7-65.108.89.120:22-147.75.109.163:58932.service - OpenSSH per-connection server daemon (147.75.109.163:58932). Jul 7 00:38:18.841764 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. Jul 7 00:38:18.845376 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:18.984478 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 7 00:38:18.987421 (kubelet)[2012]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:38:19.023668 kubelet[2012]: E0707 00:38:19.023620 2012 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:38:19.025345 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:38:19.025458 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:38:19.025754 systemd[1]: kubelet.service: Consumed 134ms CPU time, 110.2M memory peak. Jul 7 00:38:19.808588 sshd[2002]: Accepted publickey for core from 147.75.109.163 port 58932 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:38:19.810636 sshd-session[2002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:38:19.820078 systemd-logind[1531]: New session 7 of user core. Jul 7 00:38:19.826936 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 7 00:38:20.351370 sudo[2020]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 7 00:38:20.351878 sudo[2020]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 7 00:38:20.646319 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jul 7 00:38:20.655961 (dockerd)[2037]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 7 00:38:20.860547 dockerd[2037]: time="2025-07-07T00:38:20.860455358Z" level=info msg="Starting up" Jul 7 00:38:20.862153 dockerd[2037]: time="2025-07-07T00:38:20.862104512Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 7 00:38:20.928378 dockerd[2037]: time="2025-07-07T00:38:20.928263393Z" level=info msg="Loading containers: start." Jul 7 00:38:20.937731 kernel: Initializing XFRM netlink socket Jul 7 00:38:21.132840 systemd-networkd[1487]: docker0: Link UP Jul 7 00:38:21.136326 dockerd[2037]: time="2025-07-07T00:38:21.136281602Z" level=info msg="Loading containers: done." Jul 7 00:38:21.147808 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4107956753-merged.mount: Deactivated successfully. Jul 7 00:38:21.149305 dockerd[2037]: time="2025-07-07T00:38:21.149274693Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 7 00:38:21.149364 dockerd[2037]: time="2025-07-07T00:38:21.149327933Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Jul 7 00:38:21.149421 dockerd[2037]: time="2025-07-07T00:38:21.149396070Z" level=info msg="Initializing buildkit" Jul 7 00:38:21.166758 dockerd[2037]: time="2025-07-07T00:38:21.166728350Z" level=info msg="Completed buildkit initialization" Jul 7 00:38:21.173307 dockerd[2037]: time="2025-07-07T00:38:21.173261318Z" level=info msg="Daemon has completed initialization" Jul 7 00:38:21.173947 dockerd[2037]: time="2025-07-07T00:38:21.173430426Z" level=info msg="API listen on /run/docker.sock" Jul 7 00:38:21.173522 systemd[1]: Started docker.service - Docker 
Application Container Engine. Jul 7 00:38:22.185112 containerd[1574]: time="2025-07-07T00:38:22.185070876Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\"" Jul 7 00:38:22.728214 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4206253417.mount: Deactivated successfully. Jul 7 00:38:23.612640 containerd[1574]: time="2025-07-07T00:38:23.612573427Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:23.613535 containerd[1574]: time="2025-07-07T00:38:23.613347244Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.6: active requests=0, bytes read=28799139" Jul 7 00:38:23.614278 containerd[1574]: time="2025-07-07T00:38:23.614259004Z" level=info msg="ImageCreate event name:\"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:23.616255 containerd[1574]: time="2025-07-07T00:38:23.616239090Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:23.616899 containerd[1574]: time="2025-07-07T00:38:23.616880056Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.6\" with image id \"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.6\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\", size \"28795845\" in 1.431771619s" Jul 7 00:38:23.616966 containerd[1574]: time="2025-07-07T00:38:23.616954708Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\" returns image reference \"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\"" Jul 7 00:38:23.617391 containerd[1574]: 
time="2025-07-07T00:38:23.617356029Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\"" Jul 7 00:38:24.775935 containerd[1574]: time="2025-07-07T00:38:24.775883938Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:24.777116 containerd[1574]: time="2025-07-07T00:38:24.776814804Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.6: active requests=0, bytes read=24783934" Jul 7 00:38:24.777892 containerd[1574]: time="2025-07-07T00:38:24.777864715Z" level=info msg="ImageCreate event name:\"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:24.780135 containerd[1574]: time="2025-07-07T00:38:24.780099532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:24.781089 containerd[1574]: time="2025-07-07T00:38:24.781069201Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.6\" with image id \"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.6\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\", size \"26385746\" in 1.1636904s" Jul 7 00:38:24.781181 containerd[1574]: time="2025-07-07T00:38:24.781162237Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\" returns image reference \"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\"" Jul 7 00:38:24.781748 containerd[1574]: time="2025-07-07T00:38:24.781693875Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\"" Jul 7 00:38:25.847729 
containerd[1574]: time="2025-07-07T00:38:25.847644395Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:25.848572 containerd[1574]: time="2025-07-07T00:38:25.848545514Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.6: active requests=0, bytes read=19176938" Jul 7 00:38:25.849732 containerd[1574]: time="2025-07-07T00:38:25.849408801Z" level=info msg="ImageCreate event name:\"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:25.852014 containerd[1574]: time="2025-07-07T00:38:25.851334531Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:25.852014 containerd[1574]: time="2025-07-07T00:38:25.851927265Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.6\" with image id \"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.6\", repo digest \"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\", size \"20778768\" in 1.07018577s" Jul 7 00:38:25.852014 containerd[1574]: time="2025-07-07T00:38:25.851946732Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\" returns image reference \"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\"" Jul 7 00:38:25.852828 containerd[1574]: time="2025-07-07T00:38:25.852801853Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\"" Jul 7 00:38:26.833445 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3338162042.mount: Deactivated successfully. 
Jul 7 00:38:27.127859 containerd[1574]: time="2025-07-07T00:38:27.127811193Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:27.128600 containerd[1574]: time="2025-07-07T00:38:27.128568838Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.6: active requests=0, bytes read=30895391" Jul 7 00:38:27.129208 containerd[1574]: time="2025-07-07T00:38:27.129159909Z" level=info msg="ImageCreate event name:\"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:27.130578 containerd[1574]: time="2025-07-07T00:38:27.130537658Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:27.131281 containerd[1574]: time="2025-07-07T00:38:27.130882402Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.6\" with image id \"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\", repo tag \"registry.k8s.io/kube-proxy:v1.32.6\", repo digest \"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\", size \"30894382\" in 1.278057556s" Jul 7 00:38:27.131281 containerd[1574]: time="2025-07-07T00:38:27.130907800Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\" returns image reference \"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\"" Jul 7 00:38:27.131530 containerd[1574]: time="2025-07-07T00:38:27.131493539Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 7 00:38:27.624799 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4275448234.mount: Deactivated successfully. 
Jul 7 00:38:28.442575 containerd[1574]: time="2025-07-07T00:38:28.442530930Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:28.443388 containerd[1574]: time="2025-07-07T00:38:28.443204977Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335" Jul 7 00:38:28.444080 containerd[1574]: time="2025-07-07T00:38:28.444059527Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:28.446134 containerd[1574]: time="2025-07-07T00:38:28.446110721Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:28.446889 containerd[1574]: time="2025-07-07T00:38:28.446869458Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.315337155s" Jul 7 00:38:28.446963 containerd[1574]: time="2025-07-07T00:38:28.446951153Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jul 7 00:38:28.447478 containerd[1574]: time="2025-07-07T00:38:28.447445991Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 7 00:38:28.886842 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount329962771.mount: Deactivated successfully. 
Jul 7 00:38:28.894311 containerd[1574]: time="2025-07-07T00:38:28.894229206Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 7 00:38:28.895719 containerd[1574]: time="2025-07-07T00:38:28.895665817Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Jul 7 00:38:28.896483 containerd[1574]: time="2025-07-07T00:38:28.896431007Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 7 00:38:28.898857 containerd[1574]: time="2025-07-07T00:38:28.898320235Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 7 00:38:28.898857 containerd[1574]: time="2025-07-07T00:38:28.898761200Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 451.277509ms" Jul 7 00:38:28.898857 containerd[1574]: time="2025-07-07T00:38:28.898783983Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jul 7 00:38:28.899674 containerd[1574]: time="2025-07-07T00:38:28.899652559Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jul 7 00:38:29.091164 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. 
Jul 7 00:38:29.092837 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:29.188028 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:38:29.190344 (kubelet)[2376]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 7 00:38:29.222616 kubelet[2376]: E0707 00:38:29.222524 2376 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 7 00:38:29.224519 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 7 00:38:29.224719 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 7 00:38:29.225104 systemd[1]: kubelet.service: Consumed 104ms CPU time, 110.3M memory peak. Jul 7 00:38:29.405491 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2563783726.mount: Deactivated successfully. 
Jul 7 00:38:31.627387 containerd[1574]: time="2025-07-07T00:38:31.627326829Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:31.628433 containerd[1574]: time="2025-07-07T00:38:31.628226402Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551430" Jul 7 00:38:31.629232 containerd[1574]: time="2025-07-07T00:38:31.629208090Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:31.631368 containerd[1574]: time="2025-07-07T00:38:31.631349143Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:31.632081 containerd[1574]: time="2025-07-07T00:38:31.632056511Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.732287512s" Jul 7 00:38:31.632128 containerd[1574]: time="2025-07-07T00:38:31.632085416Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jul 7 00:38:34.689772 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:38:34.689910 systemd[1]: kubelet.service: Consumed 104ms CPU time, 110.3M memory peak. Jul 7 00:38:34.691576 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:34.713474 systemd[1]: Reload requested from client PID 2464 ('systemctl') (unit session-7.scope)... 
Jul 7 00:38:34.713582 systemd[1]: Reloading... Jul 7 00:38:34.810717 zram_generator::config[2508]: No configuration found. Jul 7 00:38:34.886412 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 7 00:38:34.978177 systemd[1]: Reloading finished in 264 ms. Jul 7 00:38:35.013825 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:35.015938 systemd[1]: kubelet.service: Deactivated successfully. Jul 7 00:38:35.016139 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:38:35.016180 systemd[1]: kubelet.service: Consumed 67ms CPU time, 97.1M memory peak. Jul 7 00:38:35.017487 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:35.107789 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:38:35.112409 (kubelet)[2564]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 7 00:38:35.150112 kubelet[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 7 00:38:35.150112 kubelet[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 7 00:38:35.150112 kubelet[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jul 7 00:38:35.150733 kubelet[2564]: I0707 00:38:35.150146 2564 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 7 00:38:35.341935 kubelet[2564]: I0707 00:38:35.341894 2564 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 7 00:38:35.341935 kubelet[2564]: I0707 00:38:35.341927 2564 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 7 00:38:35.342213 kubelet[2564]: I0707 00:38:35.342189 2564 server.go:954] "Client rotation is on, will bootstrap in background" Jul 7 00:38:35.378185 kubelet[2564]: E0707 00:38:35.377706 2564 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://65.108.89.120:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 65.108.89.120:6443: connect: connection refused" logger="UnhandledError" Jul 7 00:38:35.381858 kubelet[2564]: I0707 00:38:35.381841 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 7 00:38:35.392263 kubelet[2564]: I0707 00:38:35.392249 2564 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 7 00:38:35.396599 kubelet[2564]: I0707 00:38:35.396569 2564 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 7 00:38:35.401269 kubelet[2564]: I0707 00:38:35.401223 2564 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 7 00:38:35.401427 kubelet[2564]: I0707 00:38:35.401253 2564 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-1-1-6-69f6cda1f4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 7 00:38:35.402887 kubelet[2564]: I0707 00:38:35.402856 2564 topology_manager.go:138] "Creating topology manager with 
none policy" Jul 7 00:38:35.402887 kubelet[2564]: I0707 00:38:35.402876 2564 container_manager_linux.go:304] "Creating device plugin manager" Jul 7 00:38:35.403943 kubelet[2564]: I0707 00:38:35.403915 2564 state_mem.go:36] "Initialized new in-memory state store" Jul 7 00:38:35.409088 kubelet[2564]: I0707 00:38:35.408999 2564 kubelet.go:446] "Attempting to sync node with API server" Jul 7 00:38:35.409088 kubelet[2564]: I0707 00:38:35.409029 2564 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 7 00:38:35.409784 kubelet[2564]: I0707 00:38:35.409714 2564 kubelet.go:352] "Adding apiserver pod source" Jul 7 00:38:35.409784 kubelet[2564]: I0707 00:38:35.409736 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 7 00:38:35.411906 kubelet[2564]: W0707 00:38:35.411582 2564 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://65.108.89.120:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-1-1-6-69f6cda1f4&limit=500&resourceVersion=0": dial tcp 65.108.89.120:6443: connect: connection refused Jul 7 00:38:35.411906 kubelet[2564]: E0707 00:38:35.411634 2564 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://65.108.89.120:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-1-1-6-69f6cda1f4&limit=500&resourceVersion=0\": dial tcp 65.108.89.120:6443: connect: connection refused" logger="UnhandledError" Jul 7 00:38:35.413037 kubelet[2564]: W0707 00:38:35.412864 2564 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://65.108.89.120:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 65.108.89.120:6443: connect: connection refused Jul 7 00:38:35.413037 kubelet[2564]: E0707 00:38:35.412909 2564 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://65.108.89.120:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 65.108.89.120:6443: connect: connection refused" logger="UnhandledError" Jul 7 00:38:35.413997 kubelet[2564]: I0707 00:38:35.413983 2564 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 7 00:38:35.416493 kubelet[2564]: I0707 00:38:35.416480 2564 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 7 00:38:35.417082 kubelet[2564]: W0707 00:38:35.417069 2564 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 7 00:38:35.419915 kubelet[2564]: I0707 00:38:35.419759 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 7 00:38:35.419915 kubelet[2564]: I0707 00:38:35.419786 2564 server.go:1287] "Started kubelet" Jul 7 00:38:35.422252 kubelet[2564]: I0707 00:38:35.421973 2564 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 7 00:38:35.424140 kubelet[2564]: I0707 00:38:35.424117 2564 server.go:479] "Adding debug handlers to kubelet server" Jul 7 00:38:35.425118 kubelet[2564]: I0707 00:38:35.425079 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 7 00:38:35.425353 kubelet[2564]: I0707 00:38:35.425342 2564 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 7 00:38:35.428300 kubelet[2564]: I0707 00:38:35.427759 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 7 00:38:35.430020 kubelet[2564]: E0707 00:38:35.426604 2564 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://65.108.89.120:6443/api/v1/namespaces/default/events\": dial tcp 65.108.89.120:6443: connect: connection refused" 
event="&Event{ObjectMeta:{ci-4344-1-1-6-69f6cda1f4.184fd11ffed1e46c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344-1-1-6-69f6cda1f4,UID:ci-4344-1-1-6-69f6cda1f4,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344-1-1-6-69f6cda1f4,},FirstTimestamp:2025-07-07 00:38:35.419772012 +0000 UTC m=+0.304744809,LastTimestamp:2025-07-07 00:38:35.419772012 +0000 UTC m=+0.304744809,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-1-1-6-69f6cda1f4,}" Jul 7 00:38:35.431938 kubelet[2564]: I0707 00:38:35.431920 2564 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 7 00:38:35.433745 kubelet[2564]: I0707 00:38:35.433492 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 7 00:38:35.433745 kubelet[2564]: E0707 00:38:35.433664 2564 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" Jul 7 00:38:35.438151 kubelet[2564]: E0707 00:38:35.438108 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.108.89.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-6-69f6cda1f4?timeout=10s\": dial tcp 65.108.89.120:6443: connect: connection refused" interval="200ms" Jul 7 00:38:35.439071 kubelet[2564]: I0707 00:38:35.438842 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 7 00:38:35.439143 kubelet[2564]: I0707 00:38:35.438995 2564 reconciler.go:26] "Reconciler: start to sync state" Jul 7 00:38:35.440970 kubelet[2564]: I0707 00:38:35.440954 2564 factory.go:221] Registration of the systemd container factory successfully Jul 7 00:38:35.441207 
kubelet[2564]: I0707 00:38:35.441190 2564 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 7 00:38:35.450986 kubelet[2564]: I0707 00:38:35.450930 2564 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 7 00:38:35.452124 kubelet[2564]: I0707 00:38:35.452109 2564 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 7 00:38:35.452198 kubelet[2564]: I0707 00:38:35.452190 2564 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 7 00:38:35.452258 kubelet[2564]: I0707 00:38:35.452250 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jul 7 00:38:35.452300 kubelet[2564]: I0707 00:38:35.452294 2564 kubelet.go:2382] "Starting kubelet main sync loop" Jul 7 00:38:35.452383 kubelet[2564]: E0707 00:38:35.452370 2564 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 7 00:38:35.453830 kubelet[2564]: W0707 00:38:35.452391 2564 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://65.108.89.120:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 65.108.89.120:6443: connect: connection refused Jul 7 00:38:35.453913 kubelet[2564]: E0707 00:38:35.453900 2564 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://65.108.89.120:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 65.108.89.120:6443: connect: connection refused" logger="UnhandledError" Jul 7 00:38:35.457477 kubelet[2564]: W0707 00:38:35.457428 2564 reflector.go:569] k8s.io/client-go/informers/factory.go:160: 
failed to list *v1.RuntimeClass: Get "https://65.108.89.120:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 65.108.89.120:6443: connect: connection refused Jul 7 00:38:35.457546 kubelet[2564]: E0707 00:38:35.457498 2564 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://65.108.89.120:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 65.108.89.120:6443: connect: connection refused" logger="UnhandledError" Jul 7 00:38:35.457683 kubelet[2564]: I0707 00:38:35.457670 2564 factory.go:221] Registration of the containerd container factory successfully Jul 7 00:38:35.459027 kubelet[2564]: E0707 00:38:35.458999 2564 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 7 00:38:35.470525 kubelet[2564]: I0707 00:38:35.470492 2564 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 7 00:38:35.470525 kubelet[2564]: I0707 00:38:35.470519 2564 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 7 00:38:35.470609 kubelet[2564]: I0707 00:38:35.470536 2564 state_mem.go:36] "Initialized new in-memory state store" Jul 7 00:38:35.472343 kubelet[2564]: I0707 00:38:35.472317 2564 policy_none.go:49] "None policy: Start" Jul 7 00:38:35.472343 kubelet[2564]: I0707 00:38:35.472340 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 7 00:38:35.472426 kubelet[2564]: I0707 00:38:35.472355 2564 state_mem.go:35] "Initializing new in-memory state store" Jul 7 00:38:35.477412 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 7 00:38:35.487930 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Jul 7 00:38:35.492714 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jul 7 00:38:35.503801 kubelet[2564]: I0707 00:38:35.503741 2564 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 7 00:38:35.504030 kubelet[2564]: I0707 00:38:35.503954 2564 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 7 00:38:35.504030 kubelet[2564]: I0707 00:38:35.503972 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 7 00:38:35.504326 kubelet[2564]: I0707 00:38:35.504296 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 7 00:38:35.505616 kubelet[2564]: E0707 00:38:35.505366 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jul 7 00:38:35.505616 kubelet[2564]: E0707 00:38:35.505403 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344-1-1-6-69f6cda1f4\" not found" Jul 7 00:38:35.566973 systemd[1]: Created slice kubepods-burstable-pode3806d3611dc8643e5c566a0aefacd7f.slice - libcontainer container kubepods-burstable-pode3806d3611dc8643e5c566a0aefacd7f.slice. Jul 7 00:38:35.581002 kubelet[2564]: E0707 00:38:35.580849 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.585975 systemd[1]: Created slice kubepods-burstable-pod6c17fef99aaaa2b286221f0b2561d749.slice - libcontainer container kubepods-burstable-pod6c17fef99aaaa2b286221f0b2561d749.slice. 
Jul 7 00:38:35.589078 kubelet[2564]: E0707 00:38:35.589019 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.592038 systemd[1]: Created slice kubepods-burstable-pod3cbe8534bb4a5cf5652584bdac973454.slice - libcontainer container kubepods-burstable-pod3cbe8534bb4a5cf5652584bdac973454.slice. Jul 7 00:38:35.595257 kubelet[2564]: E0707 00:38:35.595187 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.605633 kubelet[2564]: I0707 00:38:35.605606 2564 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.605944 kubelet[2564]: E0707 00:38:35.605922 2564 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.108.89.120:6443/api/v1/nodes\": dial tcp 65.108.89.120:6443: connect: connection refused" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.639763 kubelet[2564]: I0707 00:38:35.639738 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e3806d3611dc8643e5c566a0aefacd7f-k8s-certs\") pod \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" (UID: \"e3806d3611dc8643e5c566a0aefacd7f\") " pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640024 kubelet[2564]: I0707 00:38:35.639870 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e3806d3611dc8643e5c566a0aefacd7f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" (UID: \"e3806d3611dc8643e5c566a0aefacd7f\") " pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640024 
kubelet[2564]: I0707 00:38:35.639893 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3cbe8534bb4a5cf5652584bdac973454-kubeconfig\") pod \"kube-scheduler-ci-4344-1-1-6-69f6cda1f4\" (UID: \"3cbe8534bb4a5cf5652584bdac973454\") " pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640024 kubelet[2564]: I0707 00:38:35.639905 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e3806d3611dc8643e5c566a0aefacd7f-ca-certs\") pod \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" (UID: \"e3806d3611dc8643e5c566a0aefacd7f\") " pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640024 kubelet[2564]: I0707 00:38:35.639917 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-ca-certs\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640024 kubelet[2564]: I0707 00:38:35.639941 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640131 kubelet[2564]: I0707 00:38:35.639952 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-k8s-certs\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: 
\"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640131 kubelet[2564]: I0707 00:38:35.639963 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-kubeconfig\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640131 kubelet[2564]: I0707 00:38:35.639975 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.640131 kubelet[2564]: E0707 00:38:35.639805 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.108.89.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-6-69f6cda1f4?timeout=10s\": dial tcp 65.108.89.120:6443: connect: connection refused" interval="400ms" Jul 7 00:38:35.809326 kubelet[2564]: I0707 00:38:35.809250 2564 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.810009 kubelet[2564]: E0707 00:38:35.809908 2564 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.108.89.120:6443/api/v1/nodes\": dial tcp 65.108.89.120:6443: connect: connection refused" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:35.882891 containerd[1574]: time="2025-07-07T00:38:35.882778826Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4344-1-1-6-69f6cda1f4,Uid:e3806d3611dc8643e5c566a0aefacd7f,Namespace:kube-system,Attempt:0,}" Jul 7 00:38:35.891122 containerd[1574]: time="2025-07-07T00:38:35.891077016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-1-1-6-69f6cda1f4,Uid:6c17fef99aaaa2b286221f0b2561d749,Namespace:kube-system,Attempt:0,}" Jul 7 00:38:35.896369 containerd[1574]: time="2025-07-07T00:38:35.896217547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-1-1-6-69f6cda1f4,Uid:3cbe8534bb4a5cf5652584bdac973454,Namespace:kube-system,Attempt:0,}" Jul 7 00:38:35.959156 containerd[1574]: time="2025-07-07T00:38:35.959122944Z" level=info msg="connecting to shim 7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20" address="unix:///run/containerd/s/d339e5c9b5c61f99c7e2538bb8c3d182380e64e4e279083cd17ffd74585c8181" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:38:35.962518 containerd[1574]: time="2025-07-07T00:38:35.962439124Z" level=info msg="connecting to shim f4ac03de0cbf120933fc1c4a0ca799a2364ef348844c9137be340b68a3c3f6b1" address="unix:///run/containerd/s/f42157fffd19f63e0443962ec9c008ce0ba2b82fe7f94f8ffeaf030d73bac6df" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:38:35.963136 containerd[1574]: time="2025-07-07T00:38:35.963113289Z" level=info msg="connecting to shim 4335dc9cfd5afe327d01c2fe560b3e865efecb09b3ff49127c6bfaf545ce4ed0" address="unix:///run/containerd/s/1a0837b66fb532138812105374b5c2c24fb15b1e928467cfde130c5643ba5d32" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:38:36.030836 systemd[1]: Started cri-containerd-4335dc9cfd5afe327d01c2fe560b3e865efecb09b3ff49127c6bfaf545ce4ed0.scope - libcontainer container 4335dc9cfd5afe327d01c2fe560b3e865efecb09b3ff49127c6bfaf545ce4ed0. 
Jul 7 00:38:36.034772 systemd[1]: Started cri-containerd-7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20.scope - libcontainer container 7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20. Jul 7 00:38:36.038568 systemd[1]: Started cri-containerd-f4ac03de0cbf120933fc1c4a0ca799a2364ef348844c9137be340b68a3c3f6b1.scope - libcontainer container f4ac03de0cbf120933fc1c4a0ca799a2364ef348844c9137be340b68a3c3f6b1. Jul 7 00:38:36.041225 kubelet[2564]: E0707 00:38:36.041187 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://65.108.89.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-6-69f6cda1f4?timeout=10s\": dial tcp 65.108.89.120:6443: connect: connection refused" interval="800ms" Jul 7 00:38:36.108170 containerd[1574]: time="2025-07-07T00:38:36.108059573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-1-1-6-69f6cda1f4,Uid:6c17fef99aaaa2b286221f0b2561d749,Namespace:kube-system,Attempt:0,} returns sandbox id \"7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20\"" Jul 7 00:38:36.110449 containerd[1574]: time="2025-07-07T00:38:36.110284550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344-1-1-6-69f6cda1f4,Uid:e3806d3611dc8643e5c566a0aefacd7f,Namespace:kube-system,Attempt:0,} returns sandbox id \"4335dc9cfd5afe327d01c2fe560b3e865efecb09b3ff49127c6bfaf545ce4ed0\"" Jul 7 00:38:36.111312 containerd[1574]: time="2025-07-07T00:38:36.111288508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-1-1-6-69f6cda1f4,Uid:3cbe8534bb4a5cf5652584bdac973454,Namespace:kube-system,Attempt:0,} returns sandbox id \"f4ac03de0cbf120933fc1c4a0ca799a2364ef348844c9137be340b68a3c3f6b1\"" Jul 7 00:38:36.113811 containerd[1574]: time="2025-07-07T00:38:36.113778956Z" level=info msg="CreateContainer within sandbox \"4335dc9cfd5afe327d01c2fe560b3e865efecb09b3ff49127c6bfaf545ce4ed0\" for 
container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 7 00:38:36.114136 containerd[1574]: time="2025-07-07T00:38:36.113867874Z" level=info msg="CreateContainer within sandbox \"7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 7 00:38:36.115523 containerd[1574]: time="2025-07-07T00:38:36.115494870Z" level=info msg="CreateContainer within sandbox \"f4ac03de0cbf120933fc1c4a0ca799a2364ef348844c9137be340b68a3c3f6b1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 7 00:38:36.121466 containerd[1574]: time="2025-07-07T00:38:36.121450789Z" level=info msg="Container d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:38:36.123461 containerd[1574]: time="2025-07-07T00:38:36.123445810Z" level=info msg="Container e5634dd87d7bd27cd8863709ad63a38ea9a6c409b5c57740ef9cbd8689628626: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:38:36.124172 containerd[1574]: time="2025-07-07T00:38:36.124150843Z" level=info msg="Container 5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:38:36.128752 containerd[1574]: time="2025-07-07T00:38:36.128712677Z" level=info msg="CreateContainer within sandbox \"7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa\"" Jul 7 00:38:36.129303 containerd[1574]: time="2025-07-07T00:38:36.129169482Z" level=info msg="StartContainer for \"d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa\"" Jul 7 00:38:36.130043 containerd[1574]: time="2025-07-07T00:38:36.130020549Z" level=info msg="CreateContainer within sandbox \"4335dc9cfd5afe327d01c2fe560b3e865efecb09b3ff49127c6bfaf545ce4ed0\" for 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e5634dd87d7bd27cd8863709ad63a38ea9a6c409b5c57740ef9cbd8689628626\"" Jul 7 00:38:36.130526 containerd[1574]: time="2025-07-07T00:38:36.130453829Z" level=info msg="StartContainer for \"e5634dd87d7bd27cd8863709ad63a38ea9a6c409b5c57740ef9cbd8689628626\"" Jul 7 00:38:36.131888 containerd[1574]: time="2025-07-07T00:38:36.131855819Z" level=info msg="connecting to shim e5634dd87d7bd27cd8863709ad63a38ea9a6c409b5c57740ef9cbd8689628626" address="unix:///run/containerd/s/1a0837b66fb532138812105374b5c2c24fb15b1e928467cfde130c5643ba5d32" protocol=ttrpc version=3 Jul 7 00:38:36.132063 containerd[1574]: time="2025-07-07T00:38:36.132040138Z" level=info msg="connecting to shim d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa" address="unix:///run/containerd/s/d339e5c9b5c61f99c7e2538bb8c3d182380e64e4e279083cd17ffd74585c8181" protocol=ttrpc version=3 Jul 7 00:38:36.132723 containerd[1574]: time="2025-07-07T00:38:36.132675629Z" level=info msg="CreateContainer within sandbox \"f4ac03de0cbf120933fc1c4a0ca799a2364ef348844c9137be340b68a3c3f6b1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9\"" Jul 7 00:38:36.133859 containerd[1574]: time="2025-07-07T00:38:36.133812629Z" level=info msg="StartContainer for \"5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9\"" Jul 7 00:38:36.136063 containerd[1574]: time="2025-07-07T00:38:36.136033167Z" level=info msg="connecting to shim 5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9" address="unix:///run/containerd/s/f42157fffd19f63e0443962ec9c008ce0ba2b82fe7f94f8ffeaf030d73bac6df" protocol=ttrpc version=3 Jul 7 00:38:36.162824 systemd[1]: Started cri-containerd-d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa.scope - libcontainer container d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa. 
Jul 7 00:38:36.167113 systemd[1]: Started cri-containerd-5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9.scope - libcontainer container 5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9. Jul 7 00:38:36.168425 systemd[1]: Started cri-containerd-e5634dd87d7bd27cd8863709ad63a38ea9a6c409b5c57740ef9cbd8689628626.scope - libcontainer container e5634dd87d7bd27cd8863709ad63a38ea9a6c409b5c57740ef9cbd8689628626. Jul 7 00:38:36.213105 kubelet[2564]: I0707 00:38:36.213080 2564 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:36.214166 kubelet[2564]: E0707 00:38:36.214040 2564 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://65.108.89.120:6443/api/v1/nodes\": dial tcp 65.108.89.120:6443: connect: connection refused" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:36.228431 containerd[1574]: time="2025-07-07T00:38:36.228396848Z" level=info msg="StartContainer for \"d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa\" returns successfully" Jul 7 00:38:36.228708 containerd[1574]: time="2025-07-07T00:38:36.228464567Z" level=info msg="StartContainer for \"e5634dd87d7bd27cd8863709ad63a38ea9a6c409b5c57740ef9cbd8689628626\" returns successfully" Jul 7 00:38:36.245810 containerd[1574]: time="2025-07-07T00:38:36.245756096Z" level=info msg="StartContainer for \"5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9\" returns successfully" Jul 7 00:38:36.467294 kubelet[2564]: E0707 00:38:36.467211 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:36.470157 kubelet[2564]: E0707 00:38:36.470139 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:36.473266 
kubelet[2564]: E0707 00:38:36.473244 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.018711 kubelet[2564]: I0707 00:38:37.017857 2564 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.478660 kubelet[2564]: E0707 00:38:37.478634 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.479300 kubelet[2564]: E0707 00:38:37.479281 2564 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.622619 kubelet[2564]: E0707 00:38:37.622578 2564 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344-1-1-6-69f6cda1f4\" not found" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.776544 kubelet[2564]: I0707 00:38:37.776289 2564 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.837354 kubelet[2564]: I0707 00:38:37.837297 2564 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.841080 kubelet[2564]: E0707 00:38:37.841052 2564 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.841080 kubelet[2564]: I0707 00:38:37.841070 2564 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.842628 kubelet[2564]: E0707 00:38:37.842597 2564 kubelet.go:3196] "Failed 
creating a mirror pod" err="pods \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.842628 kubelet[2564]: I0707 00:38:37.842614 2564 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:37.843522 kubelet[2564]: E0707 00:38:37.843494 2564 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-1-1-6-69f6cda1f4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:38.414905 kubelet[2564]: I0707 00:38:38.414855 2564 apiserver.go:52] "Watching apiserver" Jul 7 00:38:38.440241 kubelet[2564]: I0707 00:38:38.440184 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 7 00:38:38.478673 kubelet[2564]: I0707 00:38:38.478648 2564 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:38.480486 kubelet[2564]: E0707 00:38:38.480459 2564 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-1-1-6-69f6cda1f4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:38.558934 kubelet[2564]: I0707 00:38:38.558851 2564 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:39.560854 systemd[1]: Reload requested from client PID 2830 ('systemctl') (unit session-7.scope)... Jul 7 00:38:39.560874 systemd[1]: Reloading... Jul 7 00:38:39.620767 zram_generator::config[2874]: No configuration found. 
Jul 7 00:38:39.693903 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 7 00:38:39.806853 systemd[1]: Reloading finished in 245 ms. Jul 7 00:38:39.823205 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:39.837471 systemd[1]: kubelet.service: Deactivated successfully. Jul 7 00:38:39.837653 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:38:39.837693 systemd[1]: kubelet.service: Consumed 594ms CPU time, 129.6M memory peak. Jul 7 00:38:39.838961 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 7 00:38:39.928446 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 7 00:38:39.934902 (kubelet)[2925]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 7 00:38:39.976468 kubelet[2925]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 7 00:38:39.976468 kubelet[2925]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 7 00:38:39.976468 kubelet[2925]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jul 7 00:38:39.976819 kubelet[2925]: I0707 00:38:39.976528 2925 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 7 00:38:39.982971 kubelet[2925]: I0707 00:38:39.982948 2925 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 7 00:38:39.982971 kubelet[2925]: I0707 00:38:39.982971 2925 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 7 00:38:39.983426 kubelet[2925]: I0707 00:38:39.983409 2925 server.go:954] "Client rotation is on, will bootstrap in background" Jul 7 00:38:39.985157 kubelet[2925]: I0707 00:38:39.985145 2925 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jul 7 00:38:39.988656 kubelet[2925]: I0707 00:38:39.988635 2925 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 7 00:38:39.991957 kubelet[2925]: I0707 00:38:39.991947 2925 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 7 00:38:39.994492 kubelet[2925]: I0707 00:38:39.994481 2925 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 7 00:38:39.994737 kubelet[2925]: I0707 00:38:39.994716 2925 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 7 00:38:39.994923 kubelet[2925]: I0707 00:38:39.994804 2925 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-1-1-6-69f6cda1f4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 7 00:38:39.995024 kubelet[2925]: I0707 00:38:39.995015 2925 topology_manager.go:138] "Creating topology manager with 
none policy" Jul 7 00:38:39.995065 kubelet[2925]: I0707 00:38:39.995060 2925 container_manager_linux.go:304] "Creating device plugin manager" Jul 7 00:38:39.995136 kubelet[2925]: I0707 00:38:39.995129 2925 state_mem.go:36] "Initialized new in-memory state store" Jul 7 00:38:39.995288 kubelet[2925]: I0707 00:38:39.995278 2925 kubelet.go:446] "Attempting to sync node with API server" Jul 7 00:38:39.995356 kubelet[2925]: I0707 00:38:39.995348 2925 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 7 00:38:39.995408 kubelet[2925]: I0707 00:38:39.995402 2925 kubelet.go:352] "Adding apiserver pod source" Jul 7 00:38:39.995449 kubelet[2925]: I0707 00:38:39.995443 2925 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 7 00:38:40.002271 kubelet[2925]: I0707 00:38:40.002247 2925 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 7 00:38:40.003473 kubelet[2925]: I0707 00:38:40.003445 2925 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 7 00:38:40.004839 kubelet[2925]: I0707 00:38:40.004816 2925 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 7 00:38:40.004895 kubelet[2925]: I0707 00:38:40.004848 2925 server.go:1287] "Started kubelet" Jul 7 00:38:40.010529 kubelet[2925]: I0707 00:38:40.009829 2925 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 7 00:38:40.016374 kubelet[2925]: I0707 00:38:40.016321 2925 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 7 00:38:40.018465 kubelet[2925]: I0707 00:38:40.017014 2925 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 7 00:38:40.018465 kubelet[2925]: I0707 00:38:40.018234 2925 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 7 00:38:40.018465 kubelet[2925]: 
E0707 00:38:40.018380 2925 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-1-1-6-69f6cda1f4\" not found" Jul 7 00:38:40.018835 kubelet[2925]: I0707 00:38:40.018815 2925 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 7 00:38:40.018950 kubelet[2925]: I0707 00:38:40.018934 2925 reconciler.go:26] "Reconciler: start to sync state" Jul 7 00:38:40.025736 kubelet[2925]: I0707 00:38:40.016259 2925 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 7 00:38:40.026627 kubelet[2925]: I0707 00:38:40.026613 2925 server.go:479] "Adding debug handlers to kubelet server" Jul 7 00:38:40.027377 kubelet[2925]: I0707 00:38:40.027363 2925 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 7 00:38:40.030496 kubelet[2925]: I0707 00:38:40.030427 2925 factory.go:221] Registration of the containerd container factory successfully Jul 7 00:38:40.030496 kubelet[2925]: I0707 00:38:40.030439 2925 factory.go:221] Registration of the systemd container factory successfully Jul 7 00:38:40.030496 kubelet[2925]: I0707 00:38:40.030486 2925 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 7 00:38:40.031629 kubelet[2925]: I0707 00:38:40.031601 2925 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 7 00:38:40.032735 kubelet[2925]: I0707 00:38:40.032534 2925 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 7 00:38:40.032735 kubelet[2925]: I0707 00:38:40.032558 2925 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 7 00:38:40.032735 kubelet[2925]: I0707 00:38:40.032582 2925 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jul 7 00:38:40.032735 kubelet[2925]: I0707 00:38:40.032593 2925 kubelet.go:2382] "Starting kubelet main sync loop" Jul 7 00:38:40.032735 kubelet[2925]: E0707 00:38:40.032630 2925 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 7 00:38:40.039491 kubelet[2925]: E0707 00:38:40.039477 2925 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 7 00:38:40.074976 kubelet[2925]: I0707 00:38:40.074897 2925 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 7 00:38:40.074976 kubelet[2925]: I0707 00:38:40.074913 2925 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 7 00:38:40.074976 kubelet[2925]: I0707 00:38:40.074928 2925 state_mem.go:36] "Initialized new in-memory state store" Jul 7 00:38:40.075103 kubelet[2925]: I0707 00:38:40.075038 2925 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 7 00:38:40.075103 kubelet[2925]: I0707 00:38:40.075046 2925 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 7 00:38:40.075103 kubelet[2925]: I0707 00:38:40.075061 2925 policy_none.go:49] "None policy: Start" Jul 7 00:38:40.075103 kubelet[2925]: I0707 00:38:40.075068 2925 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 7 00:38:40.075103 kubelet[2925]: I0707 00:38:40.075075 2925 state_mem.go:35] "Initializing new in-memory state store" Jul 7 00:38:40.075180 kubelet[2925]: I0707 00:38:40.075150 2925 state_mem.go:75] "Updated machine memory state" Jul 7 00:38:40.078855 kubelet[2925]: I0707 00:38:40.078833 2925 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 7 00:38:40.079385 kubelet[2925]: I0707 00:38:40.079374 2925 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 7 00:38:40.079531 kubelet[2925]: I0707 
00:38:40.079496 2925 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 7 00:38:40.079814 kubelet[2925]: I0707 00:38:40.079803 2925 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 7 00:38:40.081668 kubelet[2925]: E0707 00:38:40.081482 2925 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jul 7 00:38:40.134071 kubelet[2925]: I0707 00:38:40.134045 2925 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.135780 kubelet[2925]: I0707 00:38:40.135574 2925 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.137146 kubelet[2925]: I0707 00:38:40.136208 2925 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.140717 kubelet[2925]: E0707 00:38:40.140667 2925 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" already exists" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.182171 kubelet[2925]: I0707 00:38:40.182141 2925 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.188569 kubelet[2925]: I0707 00:38:40.188534 2925 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.188657 kubelet[2925]: I0707 00:38:40.188587 2925 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320344 kubelet[2925]: I0707 00:38:40.320269 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e3806d3611dc8643e5c566a0aefacd7f-ca-certs\") pod 
\"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" (UID: \"e3806d3611dc8643e5c566a0aefacd7f\") " pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320344 kubelet[2925]: I0707 00:38:40.320304 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-ca-certs\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320344 kubelet[2925]: I0707 00:38:40.320322 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-k8s-certs\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320344 kubelet[2925]: I0707 00:38:40.320346 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-kubeconfig\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320560 kubelet[2925]: I0707 00:38:40.320363 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3cbe8534bb4a5cf5652584bdac973454-kubeconfig\") pod \"kube-scheduler-ci-4344-1-1-6-69f6cda1f4\" (UID: \"3cbe8534bb4a5cf5652584bdac973454\") " pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320560 kubelet[2925]: I0707 00:38:40.320378 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e3806d3611dc8643e5c566a0aefacd7f-k8s-certs\") pod \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" (UID: \"e3806d3611dc8643e5c566a0aefacd7f\") " pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320560 kubelet[2925]: I0707 00:38:40.320392 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e3806d3611dc8643e5c566a0aefacd7f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" (UID: \"e3806d3611dc8643e5c566a0aefacd7f\") " pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320560 kubelet[2925]: I0707 00:38:40.320408 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:40.320560 kubelet[2925]: I0707 00:38:40.320423 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6c17fef99aaaa2b286221f0b2561d749-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-1-1-6-69f6cda1f4\" (UID: \"6c17fef99aaaa2b286221f0b2561d749\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:41.004084 kubelet[2925]: I0707 00:38:41.004038 2925 apiserver.go:52] "Watching apiserver" Jul 7 00:38:41.019055 kubelet[2925]: I0707 00:38:41.019035 2925 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 7 00:38:41.065725 kubelet[2925]: I0707 00:38:41.063763 2925 kubelet.go:3194] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:41.065873 kubelet[2925]: I0707 00:38:41.065850 2925 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:41.070715 kubelet[2925]: E0707 00:38:41.070683 2925 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-1-1-6-69f6cda1f4\" already exists" pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:41.070999 kubelet[2925]: E0707 00:38:41.070979 2925 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-1-1-6-69f6cda1f4\" already exists" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" Jul 7 00:38:41.090839 kubelet[2925]: I0707 00:38:41.090786 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344-1-1-6-69f6cda1f4" podStartSLOduration=3.090773553 podStartE2EDuration="3.090773553s" podCreationTimestamp="2025-07-07 00:38:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:38:41.083971988 +0000 UTC m=+1.145100827" watchObservedRunningTime="2025-07-07 00:38:41.090773553 +0000 UTC m=+1.151902393" Jul 7 00:38:41.097728 kubelet[2925]: I0707 00:38:41.097678 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344-1-1-6-69f6cda1f4" podStartSLOduration=1.09766592 podStartE2EDuration="1.09766592s" podCreationTimestamp="2025-07-07 00:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:38:41.091161916 +0000 UTC m=+1.152290756" watchObservedRunningTime="2025-07-07 00:38:41.09766592 +0000 UTC m=+1.158794760" Jul 7 00:38:41.110784 kubelet[2925]: I0707 00:38:41.110736 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-controller-manager-ci-4344-1-1-6-69f6cda1f4" podStartSLOduration=1.110721207 podStartE2EDuration="1.110721207s" podCreationTimestamp="2025-07-07 00:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:38:41.098055967 +0000 UTC m=+1.159184807" watchObservedRunningTime="2025-07-07 00:38:41.110721207 +0000 UTC m=+1.171850048" Jul 7 00:38:42.360224 systemd[1]: Started sshd@8-65.108.89.120:22-101.36.119.98:50796.service - OpenSSH per-connection server daemon (101.36.119.98:50796). Jul 7 00:38:43.468934 sshd[2971]: Received disconnect from 101.36.119.98 port 50796:11: Bye Bye [preauth] Jul 7 00:38:43.468934 sshd[2971]: Disconnected from authenticating user root 101.36.119.98 port 50796 [preauth] Jul 7 00:38:43.471303 systemd[1]: sshd@8-65.108.89.120:22-101.36.119.98:50796.service: Deactivated successfully. Jul 7 00:38:46.299449 kubelet[2925]: I0707 00:38:46.299356 2925 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 7 00:38:46.300044 containerd[1574]: time="2025-07-07T00:38:46.299996494Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 7 00:38:46.300394 kubelet[2925]: I0707 00:38:46.300363 2925 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 7 00:38:47.358442 systemd[1]: Created slice kubepods-besteffort-pod3a0e8a1c_3528_4a94_8b49_a72932a4fb72.slice - libcontainer container kubepods-besteffort-pod3a0e8a1c_3528_4a94_8b49_a72932a4fb72.slice. 
Jul 7 00:38:47.364531 kubelet[2925]: I0707 00:38:47.364476 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3a0e8a1c-3528-4a94-8b49-a72932a4fb72-kube-proxy\") pod \"kube-proxy-64l6p\" (UID: \"3a0e8a1c-3528-4a94-8b49-a72932a4fb72\") " pod="kube-system/kube-proxy-64l6p" Jul 7 00:38:47.364531 kubelet[2925]: I0707 00:38:47.364515 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3a0e8a1c-3528-4a94-8b49-a72932a4fb72-xtables-lock\") pod \"kube-proxy-64l6p\" (UID: \"3a0e8a1c-3528-4a94-8b49-a72932a4fb72\") " pod="kube-system/kube-proxy-64l6p" Jul 7 00:38:47.364531 kubelet[2925]: I0707 00:38:47.364534 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a0e8a1c-3528-4a94-8b49-a72932a4fb72-lib-modules\") pod \"kube-proxy-64l6p\" (UID: \"3a0e8a1c-3528-4a94-8b49-a72932a4fb72\") " pod="kube-system/kube-proxy-64l6p" Jul 7 00:38:47.364886 kubelet[2925]: I0707 00:38:47.364547 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4bpz\" (UniqueName: \"kubernetes.io/projected/3a0e8a1c-3528-4a94-8b49-a72932a4fb72-kube-api-access-m4bpz\") pod \"kube-proxy-64l6p\" (UID: \"3a0e8a1c-3528-4a94-8b49-a72932a4fb72\") " pod="kube-system/kube-proxy-64l6p" Jul 7 00:38:47.473329 kubelet[2925]: I0707 00:38:47.473277 2925 status_manager.go:890] "Failed to get status for pod" podUID="8bc6d1dc-56bc-4efe-917f-13e5f62253b7" pod="tigera-operator/tigera-operator-747864d56d-c4h2d" err="pods \"tigera-operator-747864d56d-c4h2d\" is forbidden: User \"system:node:ci-4344-1-1-6-69f6cda1f4\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4344-1-1-6-69f6cda1f4' 
and this object" Jul 7 00:38:47.480570 systemd[1]: Created slice kubepods-besteffort-pod8bc6d1dc_56bc_4efe_917f_13e5f62253b7.slice - libcontainer container kubepods-besteffort-pod8bc6d1dc_56bc_4efe_917f_13e5f62253b7.slice. Jul 7 00:38:47.566675 kubelet[2925]: I0707 00:38:47.566626 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8bc6d1dc-56bc-4efe-917f-13e5f62253b7-var-lib-calico\") pod \"tigera-operator-747864d56d-c4h2d\" (UID: \"8bc6d1dc-56bc-4efe-917f-13e5f62253b7\") " pod="tigera-operator/tigera-operator-747864d56d-c4h2d" Jul 7 00:38:47.566675 kubelet[2925]: I0707 00:38:47.566669 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp589\" (UniqueName: \"kubernetes.io/projected/8bc6d1dc-56bc-4efe-917f-13e5f62253b7-kube-api-access-lp589\") pod \"tigera-operator-747864d56d-c4h2d\" (UID: \"8bc6d1dc-56bc-4efe-917f-13e5f62253b7\") " pod="tigera-operator/tigera-operator-747864d56d-c4h2d" Jul 7 00:38:47.665767 containerd[1574]: time="2025-07-07T00:38:47.665664432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-64l6p,Uid:3a0e8a1c-3528-4a94-8b49-a72932a4fb72,Namespace:kube-system,Attempt:0,}" Jul 7 00:38:47.686817 containerd[1574]: time="2025-07-07T00:38:47.686644657Z" level=info msg="connecting to shim 4652c3b535975b54c1c8d249c71c9ddcf5c8f5dd8793ed7579c930d2e9d95f2a" address="unix:///run/containerd/s/715ed72c027be1d09272879fccdf0000a359126f0b01afa591f83eee3ce74830" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:38:47.713865 systemd[1]: Started cri-containerd-4652c3b535975b54c1c8d249c71c9ddcf5c8f5dd8793ed7579c930d2e9d95f2a.scope - libcontainer container 4652c3b535975b54c1c8d249c71c9ddcf5c8f5dd8793ed7579c930d2e9d95f2a. 
Jul 7 00:38:47.736792 containerd[1574]: time="2025-07-07T00:38:47.736691565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-64l6p,Uid:3a0e8a1c-3528-4a94-8b49-a72932a4fb72,Namespace:kube-system,Attempt:0,} returns sandbox id \"4652c3b535975b54c1c8d249c71c9ddcf5c8f5dd8793ed7579c930d2e9d95f2a\"" Jul 7 00:38:47.741305 containerd[1574]: time="2025-07-07T00:38:47.740816227Z" level=info msg="CreateContainer within sandbox \"4652c3b535975b54c1c8d249c71c9ddcf5c8f5dd8793ed7579c930d2e9d95f2a\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 7 00:38:47.757381 containerd[1574]: time="2025-07-07T00:38:47.756839009Z" level=info msg="Container d37c012d9239ba59b450f94721748389864107f1dcb57ba22f41b9e8a41e5cc0: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:38:47.766722 containerd[1574]: time="2025-07-07T00:38:47.766324256Z" level=info msg="CreateContainer within sandbox \"4652c3b535975b54c1c8d249c71c9ddcf5c8f5dd8793ed7579c930d2e9d95f2a\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d37c012d9239ba59b450f94721748389864107f1dcb57ba22f41b9e8a41e5cc0\"" Jul 7 00:38:47.767707 containerd[1574]: time="2025-07-07T00:38:47.767567251Z" level=info msg="StartContainer for \"d37c012d9239ba59b450f94721748389864107f1dcb57ba22f41b9e8a41e5cc0\"" Jul 7 00:38:47.770367 containerd[1574]: time="2025-07-07T00:38:47.770338830Z" level=info msg="connecting to shim d37c012d9239ba59b450f94721748389864107f1dcb57ba22f41b9e8a41e5cc0" address="unix:///run/containerd/s/715ed72c027be1d09272879fccdf0000a359126f0b01afa591f83eee3ce74830" protocol=ttrpc version=3 Jul 7 00:38:47.788878 systemd[1]: Started cri-containerd-d37c012d9239ba59b450f94721748389864107f1dcb57ba22f41b9e8a41e5cc0.scope - libcontainer container d37c012d9239ba59b450f94721748389864107f1dcb57ba22f41b9e8a41e5cc0. 
Jul 7 00:38:47.789883 containerd[1574]: time="2025-07-07T00:38:47.789861295Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-c4h2d,Uid:8bc6d1dc-56bc-4efe-917f-13e5f62253b7,Namespace:tigera-operator,Attempt:0,}" Jul 7 00:38:47.807528 containerd[1574]: time="2025-07-07T00:38:47.807449991Z" level=info msg="connecting to shim 6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070" address="unix:///run/containerd/s/8e8c573145a79817b036a2fb81776c45971040aaa96ce5ce956db00a8bef5ccf" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:38:47.826928 systemd[1]: Started cri-containerd-6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070.scope - libcontainer container 6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070. Jul 7 00:38:47.831790 containerd[1574]: time="2025-07-07T00:38:47.831743348Z" level=info msg="StartContainer for \"d37c012d9239ba59b450f94721748389864107f1dcb57ba22f41b9e8a41e5cc0\" returns successfully" Jul 7 00:38:47.877143 containerd[1574]: time="2025-07-07T00:38:47.877109875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-c4h2d,Uid:8bc6d1dc-56bc-4efe-917f-13e5f62253b7,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070\"" Jul 7 00:38:47.879010 containerd[1574]: time="2025-07-07T00:38:47.878979072Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 7 00:38:48.115143 kubelet[2925]: I0707 00:38:48.115075 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-64l6p" podStartSLOduration=1.115055017 podStartE2EDuration="1.115055017s" podCreationTimestamp="2025-07-07 00:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:38:48.105208611 +0000 UTC m=+8.166337451" watchObservedRunningTime="2025-07-07 00:38:48.115055017 
+0000 UTC m=+8.176183856" Jul 7 00:38:48.484880 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4178467750.mount: Deactivated successfully. Jul 7 00:38:49.540178 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3358806442.mount: Deactivated successfully. Jul 7 00:38:49.948961 containerd[1574]: time="2025-07-07T00:38:49.948898242Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:49.950010 containerd[1574]: time="2025-07-07T00:38:49.949982878Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Jul 7 00:38:49.950908 containerd[1574]: time="2025-07-07T00:38:49.950843562Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:49.952840 containerd[1574]: time="2025-07-07T00:38:49.952794932Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:38:49.953729 containerd[1574]: time="2025-07-07T00:38:49.953233390Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.074230572s" Jul 7 00:38:49.953729 containerd[1574]: time="2025-07-07T00:38:49.953258266Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Jul 7 00:38:49.955354 containerd[1574]: time="2025-07-07T00:38:49.955330516Z" level=info msg="CreateContainer within 
sandbox \"6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 7 00:38:49.964765 containerd[1574]: time="2025-07-07T00:38:49.964141153Z" level=info msg="Container 3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:38:49.967914 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount122818934.mount: Deactivated successfully. Jul 7 00:38:49.981448 containerd[1574]: time="2025-07-07T00:38:49.981388667Z" level=info msg="CreateContainer within sandbox \"6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d\"" Jul 7 00:38:49.982159 containerd[1574]: time="2025-07-07T00:38:49.982126049Z" level=info msg="StartContainer for \"3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d\"" Jul 7 00:38:49.983461 containerd[1574]: time="2025-07-07T00:38:49.983391105Z" level=info msg="connecting to shim 3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d" address="unix:///run/containerd/s/8e8c573145a79817b036a2fb81776c45971040aaa96ce5ce956db00a8bef5ccf" protocol=ttrpc version=3 Jul 7 00:38:50.000827 systemd[1]: Started cri-containerd-3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d.scope - libcontainer container 3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d. 
Jul 7 00:38:50.025425 containerd[1574]: time="2025-07-07T00:38:50.025370662Z" level=info msg="StartContainer for \"3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d\" returns successfully" Jul 7 00:38:54.374362 kubelet[2925]: I0707 00:38:54.374246 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-c4h2d" podStartSLOduration=5.298626026 podStartE2EDuration="7.374225712s" podCreationTimestamp="2025-07-07 00:38:47 +0000 UTC" firstStartedPulling="2025-07-07 00:38:47.878301132 +0000 UTC m=+7.939429972" lastFinishedPulling="2025-07-07 00:38:49.953900818 +0000 UTC m=+10.015029658" observedRunningTime="2025-07-07 00:38:50.11012688 +0000 UTC m=+10.171255721" watchObservedRunningTime="2025-07-07 00:38:54.374225712 +0000 UTC m=+14.435354582" Jul 7 00:38:55.739315 sudo[2020]: pam_unix(sudo:session): session closed for user root Jul 7 00:38:55.908860 sshd[2019]: Connection closed by 147.75.109.163 port 58932 Jul 7 00:38:55.912041 sshd-session[2002]: pam_unix(sshd:session): session closed for user core Jul 7 00:38:55.915023 systemd[1]: sshd@7-65.108.89.120:22-147.75.109.163:58932.service: Deactivated successfully. Jul 7 00:38:55.917910 systemd[1]: session-7.scope: Deactivated successfully. Jul 7 00:38:55.918371 systemd[1]: session-7.scope: Consumed 4.391s CPU time, 159.9M memory peak. Jul 7 00:38:55.921211 systemd-logind[1531]: Session 7 logged out. Waiting for processes to exit. Jul 7 00:38:55.923083 systemd-logind[1531]: Removed session 7. Jul 7 00:38:58.405754 systemd[1]: Created slice kubepods-besteffort-pod9dc7c476_075e_4ea4_9673_b5a8176faf20.slice - libcontainer container kubepods-besteffort-pod9dc7c476_075e_4ea4_9673_b5a8176faf20.slice. 
Jul 7 00:38:58.432315 kubelet[2925]: I0707 00:38:58.432271 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9dc7c476-075e-4ea4-9673-b5a8176faf20-tigera-ca-bundle\") pod \"calico-typha-577bbff886-gcts8\" (UID: \"9dc7c476-075e-4ea4-9673-b5a8176faf20\") " pod="calico-system/calico-typha-577bbff886-gcts8" Jul 7 00:38:58.432315 kubelet[2925]: I0707 00:38:58.432309 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9dc7c476-075e-4ea4-9673-b5a8176faf20-typha-certs\") pod \"calico-typha-577bbff886-gcts8\" (UID: \"9dc7c476-075e-4ea4-9673-b5a8176faf20\") " pod="calico-system/calico-typha-577bbff886-gcts8" Jul 7 00:38:58.432618 kubelet[2925]: I0707 00:38:58.432325 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqc2d\" (UniqueName: \"kubernetes.io/projected/9dc7c476-075e-4ea4-9673-b5a8176faf20-kube-api-access-xqc2d\") pod \"calico-typha-577bbff886-gcts8\" (UID: \"9dc7c476-075e-4ea4-9673-b5a8176faf20\") " pod="calico-system/calico-typha-577bbff886-gcts8" Jul 7 00:38:58.711991 containerd[1574]: time="2025-07-07T00:38:58.711662546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-577bbff886-gcts8,Uid:9dc7c476-075e-4ea4-9673-b5a8176faf20,Namespace:calico-system,Attempt:0,}" Jul 7 00:38:58.737726 containerd[1574]: time="2025-07-07T00:38:58.736727744Z" level=info msg="connecting to shim 24c788b52465f34c6bf048ad1105ed3a87d6150387781d88a1bf921877c6a4f0" address="unix:///run/containerd/s/75bf39900a03b33193bb8f519f3786b63ca3c8aed4f9cee81e139c7d3917d8fd" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:38:58.741893 systemd[1]: Created slice kubepods-besteffort-pod8aa4891a_c3c3_4087_9960_f697c2647cc8.slice - libcontainer container 
kubepods-besteffort-pod8aa4891a_c3c3_4087_9960_f697c2647cc8.slice. Jul 7 00:38:58.762912 systemd[1]: Started cri-containerd-24c788b52465f34c6bf048ad1105ed3a87d6150387781d88a1bf921877c6a4f0.scope - libcontainer container 24c788b52465f34c6bf048ad1105ed3a87d6150387781d88a1bf921877c6a4f0. Jul 7 00:38:58.809990 containerd[1574]: time="2025-07-07T00:38:58.809946141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-577bbff886-gcts8,Uid:9dc7c476-075e-4ea4-9673-b5a8176faf20,Namespace:calico-system,Attempt:0,} returns sandbox id \"24c788b52465f34c6bf048ad1105ed3a87d6150387781d88a1bf921877c6a4f0\"" Jul 7 00:38:58.811536 containerd[1574]: time="2025-07-07T00:38:58.811487686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 7 00:38:58.836574 kubelet[2925]: I0707 00:38:58.836543 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-var-run-calico\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.836714 kubelet[2925]: I0707 00:38:58.836658 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-policysync\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.836714 kubelet[2925]: I0707 00:38:58.836678 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-cni-net-dir\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.836928 kubelet[2925]: I0707 00:38:58.836812 2925 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-cni-log-dir\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.836928 kubelet[2925]: I0707 00:38:58.836861 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-lib-modules\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.836928 kubelet[2925]: I0707 00:38:58.836878 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8aa4891a-c3c3-4087-9960-f697c2647cc8-node-certs\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.836928 kubelet[2925]: I0707 00:38:58.836897 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-var-lib-calico\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.837168 kubelet[2925]: I0707 00:38:58.837054 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-flexvol-driver-host\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.837168 kubelet[2925]: I0707 00:38:58.837220 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bgfn2\" (UniqueName: \"kubernetes.io/projected/8aa4891a-c3c3-4087-9960-f697c2647cc8-kube-api-access-bgfn2\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.837168 kubelet[2925]: I0707 00:38:58.837241 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-cni-bin-dir\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.837168 kubelet[2925]: I0707 00:38:58.837255 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa4891a-c3c3-4087-9960-f697c2647cc8-tigera-ca-bundle\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.837398 kubelet[2925]: I0707 00:38:58.837386 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8aa4891a-c3c3-4087-9960-f697c2647cc8-xtables-lock\") pod \"calico-node-6zljk\" (UID: \"8aa4891a-c3c3-4087-9960-f697c2647cc8\") " pod="calico-system/calico-node-6zljk" Jul 7 00:38:58.941952 kubelet[2925]: E0707 00:38:58.941827 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.941952 kubelet[2925]: W0707 00:38:58.941848 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.943092 kubelet[2925]: E0707 00:38:58.942276 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected 
end of JSON input Jul 7 00:38:58.943092 kubelet[2925]: W0707 00:38:58.942437 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.945068 kubelet[2925]: E0707 00:38:58.945050 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.945222 kubelet[2925]: E0707 00:38:58.945129 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.945848 kubelet[2925]: E0707 00:38:58.945836 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.946141 kubelet[2925]: W0707 00:38:58.946127 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.946216 kubelet[2925]: E0707 00:38:58.946205 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.948908 kubelet[2925]: E0707 00:38:58.946909 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.949398 kubelet[2925]: W0707 00:38:58.949256 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.951001 kubelet[2925]: E0707 00:38:58.950872 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.951615 kubelet[2925]: E0707 00:38:58.951076 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.951615 kubelet[2925]: W0707 00:38:58.951372 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.951615 kubelet[2925]: E0707 00:38:58.951558 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.952380 kubelet[2925]: E0707 00:38:58.952022 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.952380 kubelet[2925]: W0707 00:38:58.952032 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.952380 kubelet[2925]: E0707 00:38:58.952352 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.952961 kubelet[2925]: E0707 00:38:58.952394 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.952961 kubelet[2925]: W0707 00:38:58.952400 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.952961 kubelet[2925]: E0707 00:38:58.952440 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.952961 kubelet[2925]: E0707 00:38:58.952582 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.952961 kubelet[2925]: W0707 00:38:58.952591 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.952961 kubelet[2925]: E0707 00:38:58.952683 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.953337 kubelet[2925]: E0707 00:38:58.952876 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.953337 kubelet[2925]: W0707 00:38:58.953335 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.953425 kubelet[2925]: E0707 00:38:58.953380 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.953618 kubelet[2925]: E0707 00:38:58.953594 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.953618 kubelet[2925]: W0707 00:38:58.953611 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.954079 kubelet[2925]: E0707 00:38:58.953750 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.954079 kubelet[2925]: E0707 00:38:58.953883 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.954079 kubelet[2925]: W0707 00:38:58.953892 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.954079 kubelet[2925]: E0707 00:38:58.954032 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.954220 kubelet[2925]: E0707 00:38:58.954129 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.954220 kubelet[2925]: W0707 00:38:58.954137 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.954220 kubelet[2925]: E0707 00:38:58.954206 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.954573 kubelet[2925]: E0707 00:38:58.954346 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.954573 kubelet[2925]: W0707 00:38:58.954386 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.954573 kubelet[2925]: E0707 00:38:58.954472 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.954734 kubelet[2925]: E0707 00:38:58.954622 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.954734 kubelet[2925]: W0707 00:38:58.954676 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.954792 kubelet[2925]: E0707 00:38:58.954779 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.954981 kubelet[2925]: E0707 00:38:58.954960 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.955035 kubelet[2925]: W0707 00:38:58.955018 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.955144 kubelet[2925]: E0707 00:38:58.955115 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.955811 kubelet[2925]: E0707 00:38:58.955772 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.955811 kubelet[2925]: W0707 00:38:58.955790 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.955898 kubelet[2925]: E0707 00:38:58.955835 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.956135 kubelet[2925]: E0707 00:38:58.955946 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.956135 kubelet[2925]: W0707 00:38:58.955958 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.956135 kubelet[2925]: E0707 00:38:58.956040 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.956135 kubelet[2925]: E0707 00:38:58.956121 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.956135 kubelet[2925]: W0707 00:38:58.956128 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.956822 kubelet[2925]: E0707 00:38:58.956166 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.956822 kubelet[2925]: E0707 00:38:58.956351 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.956822 kubelet[2925]: W0707 00:38:58.956359 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.956822 kubelet[2925]: E0707 00:38:58.956455 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.956822 kubelet[2925]: E0707 00:38:58.956619 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.956822 kubelet[2925]: W0707 00:38:58.956627 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.956822 kubelet[2925]: E0707 00:38:58.956740 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.957686 kubelet[2925]: E0707 00:38:58.956914 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.957686 kubelet[2925]: W0707 00:38:58.956923 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.957686 kubelet[2925]: E0707 00:38:58.956940 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:58.957686 kubelet[2925]: E0707 00:38:58.957106 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.957686 kubelet[2925]: W0707 00:38:58.957114 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.957686 kubelet[2925]: E0707 00:38:58.957136 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:58.957686 kubelet[2925]: E0707 00:38:58.957330 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:58.957686 kubelet[2925]: W0707 00:38:58.957339 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:58.957686 kubelet[2925]: E0707 00:38:58.957347 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.018465 kubelet[2925]: E0707 00:38:59.018098 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sw6gt" podUID="eb44bbbc-039e-4742-9b7b-5f5bfbef4e00" Jul 7 00:38:59.027011 kubelet[2925]: E0707 00:38:59.026903 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.027011 kubelet[2925]: W0707 00:38:59.026921 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.027011 kubelet[2925]: E0707 00:38:59.026953 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.027118 kubelet[2925]: E0707 00:38:59.027091 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.027118 kubelet[2925]: W0707 00:38:59.027098 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.027118 kubelet[2925]: E0707 00:38:59.027105 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.027568 kubelet[2925]: E0707 00:38:59.027232 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.027568 kubelet[2925]: W0707 00:38:59.027243 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.027568 kubelet[2925]: E0707 00:38:59.027249 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.027568 kubelet[2925]: E0707 00:38:59.027413 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.027568 kubelet[2925]: W0707 00:38:59.027421 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.027568 kubelet[2925]: E0707 00:38:59.027442 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.027910 kubelet[2925]: E0707 00:38:59.027594 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.027910 kubelet[2925]: W0707 00:38:59.027601 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.027910 kubelet[2925]: E0707 00:38:59.027608 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.027910 kubelet[2925]: E0707 00:38:59.027749 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.027910 kubelet[2925]: W0707 00:38:59.027758 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.027910 kubelet[2925]: E0707 00:38:59.027764 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.028201 kubelet[2925]: E0707 00:38:59.028052 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.028201 kubelet[2925]: W0707 00:38:59.028064 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.028201 kubelet[2925]: E0707 00:38:59.028071 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.028201 kubelet[2925]: E0707 00:38:59.028204 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.028295 kubelet[2925]: W0707 00:38:59.028211 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.028295 kubelet[2925]: E0707 00:38:59.028217 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.028551 kubelet[2925]: E0707 00:38:59.028499 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.028551 kubelet[2925]: W0707 00:38:59.028511 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.028551 kubelet[2925]: E0707 00:38:59.028518 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.028756 kubelet[2925]: E0707 00:38:59.028662 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.028756 kubelet[2925]: W0707 00:38:59.028672 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.028756 kubelet[2925]: E0707 00:38:59.028678 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.029322 kubelet[2925]: E0707 00:38:59.029205 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.029322 kubelet[2925]: W0707 00:38:59.029217 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.029322 kubelet[2925]: E0707 00:38:59.029225 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.029400 kubelet[2925]: E0707 00:38:59.029338 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.029400 kubelet[2925]: W0707 00:38:59.029345 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.029400 kubelet[2925]: E0707 00:38:59.029362 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.029535 kubelet[2925]: E0707 00:38:59.029476 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.029535 kubelet[2925]: W0707 00:38:59.029486 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.029535 kubelet[2925]: E0707 00:38:59.029492 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.029715 kubelet[2925]: E0707 00:38:59.029623 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.029715 kubelet[2925]: W0707 00:38:59.029629 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.029715 kubelet[2925]: E0707 00:38:59.029635 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.029773 kubelet[2925]: E0707 00:38:59.029757 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.029773 kubelet[2925]: W0707 00:38:59.029764 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.029773 kubelet[2925]: E0707 00:38:59.029770 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.030711 kubelet[2925]: E0707 00:38:59.029875 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.030711 kubelet[2925]: W0707 00:38:59.029894 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.030711 kubelet[2925]: E0707 00:38:59.029901 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.030711 kubelet[2925]: E0707 00:38:59.030030 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.030711 kubelet[2925]: W0707 00:38:59.030039 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.030711 kubelet[2925]: E0707 00:38:59.030046 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.030848 kubelet[2925]: E0707 00:38:59.030833 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.030848 kubelet[2925]: W0707 00:38:59.030846 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.030913 kubelet[2925]: E0707 00:38:59.030854 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.031005 kubelet[2925]: E0707 00:38:59.030986 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.031005 kubelet[2925]: W0707 00:38:59.030999 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.031005 kubelet[2925]: E0707 00:38:59.031005 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.031139 kubelet[2925]: E0707 00:38:59.031118 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.031178 kubelet[2925]: W0707 00:38:59.031147 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.031178 kubelet[2925]: E0707 00:38:59.031154 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.038952 kubelet[2925]: E0707 00:38:59.038934 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.038952 kubelet[2925]: W0707 00:38:59.038946 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.039016 kubelet[2925]: E0707 00:38:59.038956 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.039016 kubelet[2925]: I0707 00:38:59.038996 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb44bbbc-039e-4742-9b7b-5f5bfbef4e00-kubelet-dir\") pod \"csi-node-driver-sw6gt\" (UID: \"eb44bbbc-039e-4742-9b7b-5f5bfbef4e00\") " pod="calico-system/csi-node-driver-sw6gt" Jul 7 00:38:59.039225 kubelet[2925]: E0707 00:38:59.039190 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.039225 kubelet[2925]: W0707 00:38:59.039211 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.039381 kubelet[2925]: E0707 00:38:59.039241 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.039381 kubelet[2925]: I0707 00:38:59.039266 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/eb44bbbc-039e-4742-9b7b-5f5bfbef4e00-varrun\") pod \"csi-node-driver-sw6gt\" (UID: \"eb44bbbc-039e-4742-9b7b-5f5bfbef4e00\") " pod="calico-system/csi-node-driver-sw6gt" Jul 7 00:38:59.039516 kubelet[2925]: E0707 00:38:59.039498 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.039516 kubelet[2925]: W0707 00:38:59.039507 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.039592 kubelet[2925]: E0707 00:38:59.039525 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.039648 kubelet[2925]: E0707 00:38:59.039629 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.039688 kubelet[2925]: W0707 00:38:59.039641 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.039989 kubelet[2925]: E0707 00:38:59.039939 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.040048 kubelet[2925]: E0707 00:38:59.040029 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.040048 kubelet[2925]: W0707 00:38:59.040043 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.040092 kubelet[2925]: E0707 00:38:59.040057 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.040092 kubelet[2925]: I0707 00:38:59.040069 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fsnt\" (UniqueName: \"kubernetes.io/projected/eb44bbbc-039e-4742-9b7b-5f5bfbef4e00-kube-api-access-8fsnt\") pod \"csi-node-driver-sw6gt\" (UID: \"eb44bbbc-039e-4742-9b7b-5f5bfbef4e00\") " pod="calico-system/csi-node-driver-sw6gt" Jul 7 00:38:59.040512 kubelet[2925]: E0707 00:38:59.040490 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.040512 kubelet[2925]: W0707 00:38:59.040504 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.040571 kubelet[2925]: E0707 00:38:59.040549 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.040591 kubelet[2925]: I0707 00:38:59.040574 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eb44bbbc-039e-4742-9b7b-5f5bfbef4e00-registration-dir\") pod \"csi-node-driver-sw6gt\" (UID: \"eb44bbbc-039e-4742-9b7b-5f5bfbef4e00\") " pod="calico-system/csi-node-driver-sw6gt" Jul 7 00:38:59.040881 kubelet[2925]: E0707 00:38:59.040860 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.040881 kubelet[2925]: W0707 00:38:59.040874 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.040954 kubelet[2925]: E0707 00:38:59.040922 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.040954 kubelet[2925]: I0707 00:38:59.040938 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eb44bbbc-039e-4742-9b7b-5f5bfbef4e00-socket-dir\") pod \"csi-node-driver-sw6gt\" (UID: \"eb44bbbc-039e-4742-9b7b-5f5bfbef4e00\") " pod="calico-system/csi-node-driver-sw6gt" Jul 7 00:38:59.041253 kubelet[2925]: E0707 00:38:59.041223 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.041253 kubelet[2925]: W0707 00:38:59.041235 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.041491 kubelet[2925]: E0707 00:38:59.041301 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.041780 kubelet[2925]: E0707 00:38:59.041757 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.041780 kubelet[2925]: W0707 00:38:59.041771 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.041837 kubelet[2925]: E0707 00:38:59.041785 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.042043 kubelet[2925]: E0707 00:38:59.042017 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.042043 kubelet[2925]: W0707 00:38:59.042035 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.042091 kubelet[2925]: E0707 00:38:59.042059 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.042342 kubelet[2925]: E0707 00:38:59.042317 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.042342 kubelet[2925]: W0707 00:38:59.042332 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.042792 kubelet[2925]: E0707 00:38:59.042768 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.043229 kubelet[2925]: E0707 00:38:59.043194 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.043229 kubelet[2925]: W0707 00:38:59.043212 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.043229 kubelet[2925]: E0707 00:38:59.043220 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.043463 kubelet[2925]: E0707 00:38:59.043442 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.043463 kubelet[2925]: W0707 00:38:59.043456 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.043463 kubelet[2925]: E0707 00:38:59.043463 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.043644 kubelet[2925]: E0707 00:38:59.043619 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.043680 kubelet[2925]: W0707 00:38:59.043664 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.043680 kubelet[2925]: E0707 00:38:59.043673 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.043855 kubelet[2925]: E0707 00:38:59.043821 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.043855 kubelet[2925]: W0707 00:38:59.043836 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.043855 kubelet[2925]: E0707 00:38:59.043844 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.052890 containerd[1574]: time="2025-07-07T00:38:59.052777050Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6zljk,Uid:8aa4891a-c3c3-4087-9960-f697c2647cc8,Namespace:calico-system,Attempt:0,}" Jul 7 00:38:59.095514 containerd[1574]: time="2025-07-07T00:38:59.095471525Z" level=info msg="connecting to shim 986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03" address="unix:///run/containerd/s/ffeb1b9361bda4756c9c5fd4703a203e36a90a31073855e0ab84f5e79e656f0f" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:38:59.114917 systemd[1]: Started cri-containerd-986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03.scope - libcontainer container 986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03. Jul 7 00:38:59.142142 kubelet[2925]: E0707 00:38:59.141808 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.142142 kubelet[2925]: W0707 00:38:59.141825 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.142142 kubelet[2925]: E0707 00:38:59.141842 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.142142 kubelet[2925]: E0707 00:38:59.141998 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.142142 kubelet[2925]: W0707 00:38:59.142005 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.142142 kubelet[2925]: E0707 00:38:59.142022 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.143874 kubelet[2925]: E0707 00:38:59.142404 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.143874 kubelet[2925]: W0707 00:38:59.142412 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.143874 kubelet[2925]: E0707 00:38:59.142441 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.143874 kubelet[2925]: E0707 00:38:59.142793 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.143874 kubelet[2925]: W0707 00:38:59.142801 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.143874 kubelet[2925]: E0707 00:38:59.143027 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.143874 kubelet[2925]: W0707 00:38:59.143034 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.143874 kubelet[2925]: E0707 00:38:59.143043 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.143874 kubelet[2925]: E0707 00:38:59.143253 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.144406 kubelet[2925]: E0707 00:38:59.144230 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.144406 kubelet[2925]: W0707 00:38:59.144238 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.144406 kubelet[2925]: E0707 00:38:59.144247 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.144406 kubelet[2925]: E0707 00:38:59.144611 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.144406 kubelet[2925]: W0707 00:38:59.144619 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.144933 kubelet[2925]: E0707 00:38:59.144830 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.145168 kubelet[2925]: E0707 00:38:59.144992 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.145168 kubelet[2925]: W0707 00:38:59.145007 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.145243 kubelet[2925]: E0707 00:38:59.145172 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.145243 kubelet[2925]: W0707 00:38:59.145179 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.145408 kubelet[2925]: E0707 00:38:59.145389 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.145472 kubelet[2925]: E0707 00:38:59.145411 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.145894 kubelet[2925]: E0707 00:38:59.145865 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.145930 kubelet[2925]: W0707 00:38:59.145899 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.146729 kubelet[2925]: E0707 00:38:59.146065 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.146729 kubelet[2925]: E0707 00:38:59.146303 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.146729 kubelet[2925]: W0707 00:38:59.146310 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.146729 kubelet[2925]: E0707 00:38:59.146627 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.146823 kubelet[2925]: E0707 00:38:59.146746 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.146823 kubelet[2925]: W0707 00:38:59.146754 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.146910 kubelet[2925]: E0707 00:38:59.146889 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.147376 kubelet[2925]: E0707 00:38:59.147333 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.147376 kubelet[2925]: W0707 00:38:59.147347 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.147625 kubelet[2925]: E0707 00:38:59.147584 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.147662 kubelet[2925]: E0707 00:38:59.147634 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.147662 kubelet[2925]: W0707 00:38:59.147640 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.147971 kubelet[2925]: E0707 00:38:59.147893 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.147971 kubelet[2925]: E0707 00:38:59.147939 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.147971 kubelet[2925]: W0707 00:38:59.147945 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.148251 kubelet[2925]: E0707 00:38:59.148098 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.148251 kubelet[2925]: E0707 00:38:59.148212 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.148251 kubelet[2925]: W0707 00:38:59.148224 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.148572 kubelet[2925]: E0707 00:38:59.148463 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.148721 kubelet[2925]: E0707 00:38:59.148639 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.148721 kubelet[2925]: W0707 00:38:59.148650 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.148721 kubelet[2925]: E0707 00:38:59.148660 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.149775 kubelet[2925]: E0707 00:38:59.149754 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.149775 kubelet[2925]: W0707 00:38:59.149770 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.149955 kubelet[2925]: E0707 00:38:59.149889 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.149983 kubelet[2925]: E0707 00:38:59.149959 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.149983 kubelet[2925]: W0707 00:38:59.149965 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.150191 kubelet[2925]: E0707 00:38:59.150045 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.150191 kubelet[2925]: E0707 00:38:59.150104 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.150191 kubelet[2925]: W0707 00:38:59.150110 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.150191 kubelet[2925]: E0707 00:38:59.150188 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.150462 kubelet[2925]: E0707 00:38:59.150308 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.150462 kubelet[2925]: W0707 00:38:59.150314 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.150462 kubelet[2925]: E0707 00:38:59.150414 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.150622 kubelet[2925]: E0707 00:38:59.150493 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.150622 kubelet[2925]: W0707 00:38:59.150503 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.150622 kubelet[2925]: E0707 00:38:59.150517 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.150860 kubelet[2925]: E0707 00:38:59.150667 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.150860 kubelet[2925]: W0707 00:38:59.150673 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.150860 kubelet[2925]: E0707 00:38:59.150680 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.150860 kubelet[2925]: E0707 00:38:59.150829 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.150860 kubelet[2925]: W0707 00:38:59.150835 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.151733 kubelet[2925]: E0707 00:38:59.150864 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.151733 kubelet[2925]: E0707 00:38:59.151077 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.151733 kubelet[2925]: W0707 00:38:59.151089 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.151733 kubelet[2925]: E0707 00:38:59.151121 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:38:59.155496 kubelet[2925]: E0707 00:38:59.155474 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:38:59.155496 kubelet[2925]: W0707 00:38:59.155487 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:38:59.155496 kubelet[2925]: E0707 00:38:59.155496 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:38:59.157636 containerd[1574]: time="2025-07-07T00:38:59.157578043Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-6zljk,Uid:8aa4891a-c3c3-4087-9960-f697c2647cc8,Namespace:calico-system,Attempt:0,} returns sandbox id \"986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03\"" Jul 7 00:39:00.486496 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1694877090.mount: Deactivated successfully. 
Jul 7 00:39:00.905403 containerd[1574]: time="2025-07-07T00:39:00.905340299Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:00.906315 containerd[1574]: time="2025-07-07T00:39:00.906275762Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 7 00:39:00.907233 containerd[1574]: time="2025-07-07T00:39:00.907186247Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:00.909304 containerd[1574]: time="2025-07-07T00:39:00.909228604Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:00.909634 containerd[1574]: time="2025-07-07T00:39:00.909615122Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.098101738s" Jul 7 00:39:00.909719 containerd[1574]: time="2025-07-07T00:39:00.909687719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 7 00:39:00.911025 containerd[1574]: time="2025-07-07T00:39:00.910960437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 7 00:39:00.924131 containerd[1574]: time="2025-07-07T00:39:00.923535636Z" level=info msg="CreateContainer within sandbox \"24c788b52465f34c6bf048ad1105ed3a87d6150387781d88a1bf921877c6a4f0\" for container 
&ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 7 00:39:00.933739 containerd[1574]: time="2025-07-07T00:39:00.932285149Z" level=info msg="Container 3f5e2657af4b9bfd3fa68e5e76de997bd1e4491cb5293fd3b96c26055d711df2: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:00.939467 containerd[1574]: time="2025-07-07T00:39:00.939429276Z" level=info msg="CreateContainer within sandbox \"24c788b52465f34c6bf048ad1105ed3a87d6150387781d88a1bf921877c6a4f0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3f5e2657af4b9bfd3fa68e5e76de997bd1e4491cb5293fd3b96c26055d711df2\"" Jul 7 00:39:00.940025 containerd[1574]: time="2025-07-07T00:39:00.939979503Z" level=info msg="StartContainer for \"3f5e2657af4b9bfd3fa68e5e76de997bd1e4491cb5293fd3b96c26055d711df2\"" Jul 7 00:39:00.941442 containerd[1574]: time="2025-07-07T00:39:00.941373168Z" level=info msg="connecting to shim 3f5e2657af4b9bfd3fa68e5e76de997bd1e4491cb5293fd3b96c26055d711df2" address="unix:///run/containerd/s/75bf39900a03b33193bb8f519f3786b63ca3c8aed4f9cee81e139c7d3917d8fd" protocol=ttrpc version=3 Jul 7 00:39:00.969994 systemd[1]: Started cri-containerd-3f5e2657af4b9bfd3fa68e5e76de997bd1e4491cb5293fd3b96c26055d711df2.scope - libcontainer container 3f5e2657af4b9bfd3fa68e5e76de997bd1e4491cb5293fd3b96c26055d711df2. 
Jul 7 00:39:01.026328 containerd[1574]: time="2025-07-07T00:39:01.026270053Z" level=info msg="StartContainer for \"3f5e2657af4b9bfd3fa68e5e76de997bd1e4491cb5293fd3b96c26055d711df2\" returns successfully" Jul 7 00:39:01.035127 kubelet[2925]: E0707 00:39:01.035087 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sw6gt" podUID="eb44bbbc-039e-4742-9b7b-5f5bfbef4e00" Jul 7 00:39:01.139402 kubelet[2925]: I0707 00:39:01.139336 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-577bbff886-gcts8" podStartSLOduration=1.039585481 podStartE2EDuration="3.139323291s" podCreationTimestamp="2025-07-07 00:38:58 +0000 UTC" firstStartedPulling="2025-07-07 00:38:58.810974619 +0000 UTC m=+18.872103459" lastFinishedPulling="2025-07-07 00:39:00.910712429 +0000 UTC m=+20.971841269" observedRunningTime="2025-07-07 00:39:01.138500763 +0000 UTC m=+21.199629603" watchObservedRunningTime="2025-07-07 00:39:01.139323291 +0000 UTC m=+21.200452122" Jul 7 00:39:01.144925 kubelet[2925]: E0707 00:39:01.144888 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.144925 kubelet[2925]: W0707 00:39:01.144911 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.144925 kubelet[2925]: E0707 00:39:01.144929 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.145907 kubelet[2925]: E0707 00:39:01.145873 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.145907 kubelet[2925]: W0707 00:39:01.145889 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.145907 kubelet[2925]: E0707 00:39:01.145898 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.146080 kubelet[2925]: E0707 00:39:01.146062 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.149751 kubelet[2925]: W0707 00:39:01.149724 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.149751 kubelet[2925]: E0707 00:39:01.149750 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.151905 kubelet[2925]: E0707 00:39:01.151790 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.151905 kubelet[2925]: W0707 00:39:01.151813 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.151905 kubelet[2925]: E0707 00:39:01.151837 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.152861 kubelet[2925]: E0707 00:39:01.152849 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.152924 kubelet[2925]: W0707 00:39:01.152914 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.152973 kubelet[2925]: E0707 00:39:01.152965 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.153169 kubelet[2925]: E0707 00:39:01.153118 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.153169 kubelet[2925]: W0707 00:39:01.153130 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.153169 kubelet[2925]: E0707 00:39:01.153140 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.153381 kubelet[2925]: E0707 00:39:01.153332 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.153381 kubelet[2925]: W0707 00:39:01.153341 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.153381 kubelet[2925]: E0707 00:39:01.153349 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.153577 kubelet[2925]: E0707 00:39:01.153565 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.153673 kubelet[2925]: W0707 00:39:01.153622 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.153673 kubelet[2925]: E0707 00:39:01.153633 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.153866 kubelet[2925]: E0707 00:39:01.153824 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.153866 kubelet[2925]: W0707 00:39:01.153833 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.153866 kubelet[2925]: E0707 00:39:01.153839 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.154059 kubelet[2925]: E0707 00:39:01.154020 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.154059 kubelet[2925]: W0707 00:39:01.154029 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.154059 kubelet[2925]: E0707 00:39:01.154036 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.154248 kubelet[2925]: E0707 00:39:01.154206 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.154248 kubelet[2925]: W0707 00:39:01.154214 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.154248 kubelet[2925]: E0707 00:39:01.154221 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.154479 kubelet[2925]: E0707 00:39:01.154399 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.154479 kubelet[2925]: W0707 00:39:01.154408 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.154479 kubelet[2925]: E0707 00:39:01.154427 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.155862 kubelet[2925]: E0707 00:39:01.155813 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.155975 kubelet[2925]: W0707 00:39:01.155907 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.155975 kubelet[2925]: E0707 00:39:01.155919 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.156095 kubelet[2925]: E0707 00:39:01.156086 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.156174 kubelet[2925]: W0707 00:39:01.156137 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.156174 kubelet[2925]: E0707 00:39:01.156147 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.156367 kubelet[2925]: E0707 00:39:01.156304 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.156367 kubelet[2925]: W0707 00:39:01.156312 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.156367 kubelet[2925]: E0707 00:39:01.156319 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.159653 kubelet[2925]: E0707 00:39:01.159619 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.159653 kubelet[2925]: W0707 00:39:01.159629 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.159653 kubelet[2925]: E0707 00:39:01.159638 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.160030 kubelet[2925]: E0707 00:39:01.160009 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.160030 kubelet[2925]: W0707 00:39:01.160020 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.160146 kubelet[2925]: E0707 00:39:01.160103 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.160338 kubelet[2925]: E0707 00:39:01.160329 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.160405 kubelet[2925]: W0707 00:39:01.160389 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.160526 kubelet[2925]: E0707 00:39:01.160467 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.160746 kubelet[2925]: E0707 00:39:01.160735 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.160818 kubelet[2925]: W0707 00:39:01.160807 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.160892 kubelet[2925]: E0707 00:39:01.160881 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.161116 kubelet[2925]: E0707 00:39:01.161102 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.161191 kubelet[2925]: W0707 00:39:01.161179 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.161545 kubelet[2925]: E0707 00:39:01.161531 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.161785 kubelet[2925]: E0707 00:39:01.161750 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.161785 kubelet[2925]: W0707 00:39:01.161763 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.161905 kubelet[2925]: E0707 00:39:01.161894 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.162908 kubelet[2925]: E0707 00:39:01.162897 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.163018 kubelet[2925]: W0707 00:39:01.162983 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.163141 kubelet[2925]: E0707 00:39:01.163121 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:01.163337 kubelet[2925]: E0707 00:39:01.163315 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.163337 kubelet[2925]: W0707 00:39:01.163325 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.163508 kubelet[2925]: E0707 00:39:01.163433 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:01.163666 kubelet[2925]: E0707 00:39:01.163630 2925 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 7 00:39:01.163666 kubelet[2925]: W0707 00:39:01.163640 2925 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 7 00:39:01.164374 kubelet[2925]: E0707 00:39:01.164355 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 7 00:39:02.126371 kubelet[2925]: I0707 00:39:02.126333 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:02.171153 kubelet[2925]: E0707 00:39:02.171103 2925 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 7 00:39:02.874273 containerd[1574]: time="2025-07-07T00:39:02.874232164Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:02.883980 containerd[1574]: time="2025-07-07T00:39:02.874981566Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 7 00:39:02.883980 containerd[1574]: time="2025-07-07T00:39:02.875870280Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:02.884253 containerd[1574]: time="2025-07-07T00:39:02.877311454Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.966327972s" Jul 7 00:39:02.884253 containerd[1574]: time="2025-07-07T00:39:02.884197293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 7 00:39:02.884367 containerd[1574]: time="2025-07-07T00:39:02.884346894Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:02.886427 containerd[1574]: time="2025-07-07T00:39:02.886376357Z" level=info msg="CreateContainer within sandbox \"986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 7 00:39:02.892884 containerd[1574]: time="2025-07-07T00:39:02.892751294Z" level=info msg="Container 47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:02.908006 containerd[1574]: time="2025-07-07T00:39:02.907980618Z" level=info msg="CreateContainer within sandbox \"986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7\"" Jul 7 00:39:02.908571 containerd[1574]: time="2025-07-07T00:39:02.908554148Z" level=info msg="StartContainer for \"47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7\"" Jul 7 00:39:02.909660 containerd[1574]: time="2025-07-07T00:39:02.909639472Z" level=info msg="connecting to shim 47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7" address="unix:///run/containerd/s/ffeb1b9361bda4756c9c5fd4703a203e36a90a31073855e0ab84f5e79e656f0f" protocol=ttrpc version=3 Jul 7 00:39:02.927815 systemd[1]: Started cri-containerd-47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7.scope - libcontainer container 47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7. Jul 7 00:39:02.977139 systemd[1]: cri-containerd-47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7.scope: Deactivated successfully. 
Jul 7 00:39:02.991592 containerd[1574]: time="2025-07-07T00:39:02.990342051Z" level=info msg="StartContainer for \"47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7\" returns successfully" Jul 7 00:39:03.020620 containerd[1574]: time="2025-07-07T00:39:03.020581772Z" level=info msg="received exit event container_id:\"47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7\" id:\"47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7\" pid:3651 exited_at:{seconds:1751848742 nanos:979478740}" Jul 7 00:39:03.021108 containerd[1574]: time="2025-07-07T00:39:03.021082715Z" level=info msg="TaskExit event in podsandbox handler container_id:\"47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7\" id:\"47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7\" pid:3651 exited_at:{seconds:1751848742 nanos:979478740}" Jul 7 00:39:03.033785 kubelet[2925]: E0707 00:39:03.033749 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sw6gt" podUID="eb44bbbc-039e-4742-9b7b-5f5bfbef4e00" Jul 7 00:39:03.049360 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-47eb1b76a97f34704567b11f65bab5436c13bd3409ced8d31f4be386ebdf82e7-rootfs.mount: Deactivated successfully. 
Jul 7 00:39:03.133967 containerd[1574]: time="2025-07-07T00:39:03.133847652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 7 00:39:05.034002 kubelet[2925]: E0707 00:39:05.033953 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-sw6gt" podUID="eb44bbbc-039e-4742-9b7b-5f5bfbef4e00" Jul 7 00:39:05.678231 containerd[1574]: time="2025-07-07T00:39:05.678197443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:05.679110 containerd[1574]: time="2025-07-07T00:39:05.679083791Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Jul 7 00:39:05.679886 containerd[1574]: time="2025-07-07T00:39:05.679851347Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:05.681262 containerd[1574]: time="2025-07-07T00:39:05.681222829Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:05.681714 containerd[1574]: time="2025-07-07T00:39:05.681668289Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 2.547780661s" Jul 7 00:39:05.681768 containerd[1574]: time="2025-07-07T00:39:05.681690399Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Jul 7 00:39:05.683771 containerd[1574]: time="2025-07-07T00:39:05.683747142Z" level=info msg="CreateContainer within sandbox \"986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 7 00:39:05.692715 containerd[1574]: time="2025-07-07T00:39:05.691537481Z" level=info msg="Container 557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:05.704978 containerd[1574]: time="2025-07-07T00:39:05.704936060Z" level=info msg="CreateContainer within sandbox \"986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1\"" Jul 7 00:39:05.706193 containerd[1574]: time="2025-07-07T00:39:05.706178680Z" level=info msg="StartContainer for \"557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1\"" Jul 7 00:39:05.707224 containerd[1574]: time="2025-07-07T00:39:05.707192979Z" level=info msg="connecting to shim 557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1" address="unix:///run/containerd/s/ffeb1b9361bda4756c9c5fd4703a203e36a90a31073855e0ab84f5e79e656f0f" protocol=ttrpc version=3 Jul 7 00:39:05.727813 systemd[1]: Started cri-containerd-557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1.scope - libcontainer container 557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1. 
Jul 7 00:39:05.773254 containerd[1574]: time="2025-07-07T00:39:05.772814731Z" level=info msg="StartContainer for \"557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1\" returns successfully" Jul 7 00:39:06.132344 systemd[1]: cri-containerd-557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1.scope: Deactivated successfully. Jul 7 00:39:06.133982 systemd[1]: cri-containerd-557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1.scope: Consumed 323ms CPU time, 160.6M memory peak, 11.6M read from disk, 171.2M written to disk. Jul 7 00:39:06.157611 containerd[1574]: time="2025-07-07T00:39:06.157365364Z" level=info msg="received exit event container_id:\"557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1\" id:\"557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1\" pid:3707 exited_at:{seconds:1751848746 nanos:156633345}" Jul 7 00:39:06.157611 containerd[1574]: time="2025-07-07T00:39:06.157588574Z" level=info msg="TaskExit event in podsandbox handler container_id:\"557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1\" id:\"557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1\" pid:3707 exited_at:{seconds:1751848746 nanos:156633345}" Jul 7 00:39:06.182170 kubelet[2925]: I0707 00:39:06.182152 2925 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jul 7 00:39:06.192886 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-557aaf745196d40ceef3cf733e021c7bc03b62fde20390df38bc6ba728768ed1-rootfs.mount: Deactivated successfully. Jul 7 00:39:06.250630 systemd[1]: Created slice kubepods-burstable-pod41fb2441_eddf_4cab_a9b5_21c52d4bf3e5.slice - libcontainer container kubepods-burstable-pod41fb2441_eddf_4cab_a9b5_21c52d4bf3e5.slice. Jul 7 00:39:06.275150 systemd[1]: Created slice kubepods-burstable-podcf319c1e_3b73_474b_9ae9_b5573fcf8751.slice - libcontainer container kubepods-burstable-podcf319c1e_3b73_474b_9ae9_b5573fcf8751.slice. 
Jul 7 00:39:06.286354 systemd[1]: Created slice kubepods-besteffort-pod53c1c128_9d8a_456b_a958_8f6cb766ad0b.slice - libcontainer container kubepods-besteffort-pod53c1c128_9d8a_456b_a958_8f6cb766ad0b.slice. Jul 7 00:39:06.292594 systemd[1]: Created slice kubepods-besteffort-podc58790dd_630c_4e7b_9398_a00ca059225e.slice - libcontainer container kubepods-besteffort-podc58790dd_630c_4e7b_9398_a00ca059225e.slice. Jul 7 00:39:06.299288 systemd[1]: Created slice kubepods-besteffort-pod370e9121_6380_494d_bb21_0a3a2886c927.slice - libcontainer container kubepods-besteffort-pod370e9121_6380_494d_bb21_0a3a2886c927.slice. Jul 7 00:39:06.299919 kubelet[2925]: I0707 00:39:06.299887 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d64b203e-aa4d-4795-9e34-4cec7d892738-calico-apiserver-certs\") pod \"calico-apiserver-5c47bcf6f-c64ch\" (UID: \"d64b203e-aa4d-4795-9e34-4cec7d892738\") " pod="calico-apiserver/calico-apiserver-5c47bcf6f-c64ch" Jul 7 00:39:06.299919 kubelet[2925]: I0707 00:39:06.299916 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/370e9121-6380-494d-bb21-0a3a2886c927-goldmane-key-pair\") pod \"goldmane-768f4c5c69-z472l\" (UID: \"370e9121-6380-494d-bb21-0a3a2886c927\") " pod="calico-system/goldmane-768f4c5c69-z472l" Jul 7 00:39:06.300320 kubelet[2925]: I0707 00:39:06.299931 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/370e9121-6380-494d-bb21-0a3a2886c927-config\") pod \"goldmane-768f4c5c69-z472l\" (UID: \"370e9121-6380-494d-bb21-0a3a2886c927\") " pod="calico-system/goldmane-768f4c5c69-z472l" Jul 7 00:39:06.300320 kubelet[2925]: I0707 00:39:06.299944 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-ca-bundle\") pod \"whisker-c88c9c88c-8vqt6\" (UID: \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\") " pod="calico-system/whisker-c88c9c88c-8vqt6" Jul 7 00:39:06.300320 kubelet[2925]: I0707 00:39:06.299961 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf319c1e-3b73-474b-9ae9-b5573fcf8751-config-volume\") pod \"coredns-668d6bf9bc-757mw\" (UID: \"cf319c1e-3b73-474b-9ae9-b5573fcf8751\") " pod="kube-system/coredns-668d6bf9bc-757mw" Jul 7 00:39:06.300320 kubelet[2925]: I0707 00:39:06.299974 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4tx6\" (UniqueName: \"kubernetes.io/projected/14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17-kube-api-access-z4tx6\") pod \"calico-kube-controllers-77d57f8f97-z9fxb\" (UID: \"14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17\") " pod="calico-system/calico-kube-controllers-77d57f8f97-z9fxb" Jul 7 00:39:06.300320 kubelet[2925]: I0707 00:39:06.299988 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksvkz\" (UniqueName: \"kubernetes.io/projected/cf319c1e-3b73-474b-9ae9-b5573fcf8751-kube-api-access-ksvkz\") pod \"coredns-668d6bf9bc-757mw\" (UID: \"cf319c1e-3b73-474b-9ae9-b5573fcf8751\") " pod="kube-system/coredns-668d6bf9bc-757mw" Jul 7 00:39:06.301059 kubelet[2925]: I0707 00:39:06.300000 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41fb2441-eddf-4cab-a9b5-21c52d4bf3e5-config-volume\") pod \"coredns-668d6bf9bc-q45lm\" (UID: \"41fb2441-eddf-4cab-a9b5-21c52d4bf3e5\") " pod="kube-system/coredns-668d6bf9bc-q45lm" Jul 7 00:39:06.301059 kubelet[2925]: I0707 00:39:06.300011 2925 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmpkp\" (UniqueName: \"kubernetes.io/projected/c58790dd-630c-4e7b-9398-a00ca059225e-kube-api-access-kmpkp\") pod \"calico-apiserver-5c47bcf6f-2m6k2\" (UID: \"c58790dd-630c-4e7b-9398-a00ca059225e\") " pod="calico-apiserver/calico-apiserver-5c47bcf6f-2m6k2" Jul 7 00:39:06.301059 kubelet[2925]: I0707 00:39:06.300023 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/370e9121-6380-494d-bb21-0a3a2886c927-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-z472l\" (UID: \"370e9121-6380-494d-bb21-0a3a2886c927\") " pod="calico-system/goldmane-768f4c5c69-z472l" Jul 7 00:39:06.301059 kubelet[2925]: I0707 00:39:06.300034 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4n9m\" (UniqueName: \"kubernetes.io/projected/53c1c128-9d8a-456b-a958-8f6cb766ad0b-kube-api-access-d4n9m\") pod \"whisker-c88c9c88c-8vqt6\" (UID: \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\") " pod="calico-system/whisker-c88c9c88c-8vqt6" Jul 7 00:39:06.301059 kubelet[2925]: I0707 00:39:06.300051 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/aebb6cf1-faa7-47e9-a8f2-33827fc086e0-calico-apiserver-certs\") pod \"calico-apiserver-6f6ddc54db-zkq79\" (UID: \"aebb6cf1-faa7-47e9-a8f2-33827fc086e0\") " pod="calico-apiserver/calico-apiserver-6f6ddc54db-zkq79" Jul 7 00:39:06.301158 kubelet[2925]: I0707 00:39:06.300066 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57w6f\" (UniqueName: \"kubernetes.io/projected/d64b203e-aa4d-4795-9e34-4cec7d892738-kube-api-access-57w6f\") pod \"calico-apiserver-5c47bcf6f-c64ch\" (UID: \"d64b203e-aa4d-4795-9e34-4cec7d892738\") " 
pod="calico-apiserver/calico-apiserver-5c47bcf6f-c64ch" Jul 7 00:39:06.301158 kubelet[2925]: I0707 00:39:06.300078 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c58790dd-630c-4e7b-9398-a00ca059225e-calico-apiserver-certs\") pod \"calico-apiserver-5c47bcf6f-2m6k2\" (UID: \"c58790dd-630c-4e7b-9398-a00ca059225e\") " pod="calico-apiserver/calico-apiserver-5c47bcf6f-2m6k2" Jul 7 00:39:06.301158 kubelet[2925]: I0707 00:39:06.300090 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p5gs\" (UniqueName: \"kubernetes.io/projected/370e9121-6380-494d-bb21-0a3a2886c927-kube-api-access-2p5gs\") pod \"goldmane-768f4c5c69-z472l\" (UID: \"370e9121-6380-494d-bb21-0a3a2886c927\") " pod="calico-system/goldmane-768f4c5c69-z472l" Jul 7 00:39:06.301158 kubelet[2925]: I0707 00:39:06.300101 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-backend-key-pair\") pod \"whisker-c88c9c88c-8vqt6\" (UID: \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\") " pod="calico-system/whisker-c88c9c88c-8vqt6" Jul 7 00:39:06.301158 kubelet[2925]: I0707 00:39:06.300131 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wmzd\" (UniqueName: \"kubernetes.io/projected/aebb6cf1-faa7-47e9-a8f2-33827fc086e0-kube-api-access-6wmzd\") pod \"calico-apiserver-6f6ddc54db-zkq79\" (UID: \"aebb6cf1-faa7-47e9-a8f2-33827fc086e0\") " pod="calico-apiserver/calico-apiserver-6f6ddc54db-zkq79" Jul 7 00:39:06.301884 kubelet[2925]: I0707 00:39:06.300147 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9tk\" (UniqueName: 
\"kubernetes.io/projected/41fb2441-eddf-4cab-a9b5-21c52d4bf3e5-kube-api-access-gf9tk\") pod \"coredns-668d6bf9bc-q45lm\" (UID: \"41fb2441-eddf-4cab-a9b5-21c52d4bf3e5\") " pod="kube-system/coredns-668d6bf9bc-q45lm" Jul 7 00:39:06.301884 kubelet[2925]: I0707 00:39:06.300169 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17-tigera-ca-bundle\") pod \"calico-kube-controllers-77d57f8f97-z9fxb\" (UID: \"14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17\") " pod="calico-system/calico-kube-controllers-77d57f8f97-z9fxb" Jul 7 00:39:06.308052 systemd[1]: Created slice kubepods-besteffort-podd64b203e_aa4d_4795_9e34_4cec7d892738.slice - libcontainer container kubepods-besteffort-podd64b203e_aa4d_4795_9e34_4cec7d892738.slice. Jul 7 00:39:06.315200 systemd[1]: Created slice kubepods-besteffort-podaebb6cf1_faa7_47e9_a8f2_33827fc086e0.slice - libcontainer container kubepods-besteffort-podaebb6cf1_faa7_47e9_a8f2_33827fc086e0.slice. Jul 7 00:39:06.320479 systemd[1]: Created slice kubepods-besteffort-pod14fe5431_92e4_4f3e_ac2a_1bbd02ba7a17.slice - libcontainer container kubepods-besteffort-pod14fe5431_92e4_4f3e_ac2a_1bbd02ba7a17.slice. 
Jul 7 00:39:06.571557 containerd[1574]: time="2025-07-07T00:39:06.571196090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-q45lm,Uid:41fb2441-eddf-4cab-a9b5-21c52d4bf3e5,Namespace:kube-system,Attempt:0,}" Jul 7 00:39:06.585187 containerd[1574]: time="2025-07-07T00:39:06.585135514Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-757mw,Uid:cf319c1e-3b73-474b-9ae9-b5573fcf8751,Namespace:kube-system,Attempt:0,}" Jul 7 00:39:06.603036 containerd[1574]: time="2025-07-07T00:39:06.602493796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-2m6k2,Uid:c58790dd-630c-4e7b-9398-a00ca059225e,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:39:06.620658 containerd[1574]: time="2025-07-07T00:39:06.620600708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-c64ch,Uid:d64b203e-aa4d-4795-9e34-4cec7d892738,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:39:06.621224 containerd[1574]: time="2025-07-07T00:39:06.621178134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-z472l,Uid:370e9121-6380-494d-bb21-0a3a2886c927,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:06.622397 containerd[1574]: time="2025-07-07T00:39:06.622128785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f6ddc54db-zkq79,Uid:aebb6cf1-faa7-47e9-a8f2-33827fc086e0,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:39:06.624336 containerd[1574]: time="2025-07-07T00:39:06.624201647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77d57f8f97-z9fxb,Uid:14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:06.626510 containerd[1574]: time="2025-07-07T00:39:06.626355532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c88c9c88c-8vqt6,Uid:53c1c128-9d8a-456b-a958-8f6cb766ad0b,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:06.805180 
containerd[1574]: time="2025-07-07T00:39:06.805126090Z" level=error msg="Failed to destroy network for sandbox \"269b11f9ae8034e243946b8b45d000e93d2df3eaed4723b05351a50c3ae1cb62\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.807317 systemd[1]: run-netns-cni\x2dc2996387\x2df558\x2d288d\x2d31e4\x2d75ead992e43b.mount: Deactivated successfully. Jul 7 00:39:06.810094 containerd[1574]: time="2025-07-07T00:39:06.810041614Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f6ddc54db-zkq79,Uid:aebb6cf1-faa7-47e9-a8f2-33827fc086e0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"269b11f9ae8034e243946b8b45d000e93d2df3eaed4723b05351a50c3ae1cb62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.811224 kubelet[2925]: E0707 00:39:06.811173 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"269b11f9ae8034e243946b8b45d000e93d2df3eaed4723b05351a50c3ae1cb62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.811298 kubelet[2925]: E0707 00:39:06.811264 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"269b11f9ae8034e243946b8b45d000e93d2df3eaed4723b05351a50c3ae1cb62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-6f6ddc54db-zkq79" Jul 7 00:39:06.811298 kubelet[2925]: E0707 00:39:06.811289 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"269b11f9ae8034e243946b8b45d000e93d2df3eaed4723b05351a50c3ae1cb62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f6ddc54db-zkq79" Jul 7 00:39:06.811393 kubelet[2925]: E0707 00:39:06.811344 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f6ddc54db-zkq79_calico-apiserver(aebb6cf1-faa7-47e9-a8f2-33827fc086e0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f6ddc54db-zkq79_calico-apiserver(aebb6cf1-faa7-47e9-a8f2-33827fc086e0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"269b11f9ae8034e243946b8b45d000e93d2df3eaed4723b05351a50c3ae1cb62\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f6ddc54db-zkq79" podUID="aebb6cf1-faa7-47e9-a8f2-33827fc086e0" Jul 7 00:39:06.820469 containerd[1574]: time="2025-07-07T00:39:06.820430555Z" level=error msg="Failed to destroy network for sandbox \"d0b794ae82fb9c1ec852634f586b29e1050764a48d10c112d58593d3d0b5556f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.822459 systemd[1]: run-netns-cni\x2d0aef4396\x2d229d\x2dcfd6\x2dce0d\x2dbabd6051ec5d.mount: Deactivated successfully. 
Jul 7 00:39:06.825023 containerd[1574]: time="2025-07-07T00:39:06.824999206Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-2m6k2,Uid:c58790dd-630c-4e7b-9398-a00ca059225e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0b794ae82fb9c1ec852634f586b29e1050764a48d10c112d58593d3d0b5556f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.825312 kubelet[2925]: E0707 00:39:06.825267 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0b794ae82fb9c1ec852634f586b29e1050764a48d10c112d58593d3d0b5556f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.825431 kubelet[2925]: E0707 00:39:06.825416 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0b794ae82fb9c1ec852634f586b29e1050764a48d10c112d58593d3d0b5556f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c47bcf6f-2m6k2" Jul 7 00:39:06.825512 kubelet[2925]: E0707 00:39:06.825498 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0b794ae82fb9c1ec852634f586b29e1050764a48d10c112d58593d3d0b5556f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-5c47bcf6f-2m6k2" Jul 7 00:39:06.826010 kubelet[2925]: E0707 00:39:06.825621 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c47bcf6f-2m6k2_calico-apiserver(c58790dd-630c-4e7b-9398-a00ca059225e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c47bcf6f-2m6k2_calico-apiserver(c58790dd-630c-4e7b-9398-a00ca059225e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d0b794ae82fb9c1ec852634f586b29e1050764a48d10c112d58593d3d0b5556f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c47bcf6f-2m6k2" podUID="c58790dd-630c-4e7b-9398-a00ca059225e" Jul 7 00:39:06.837368 containerd[1574]: time="2025-07-07T00:39:06.835634972Z" level=error msg="Failed to destroy network for sandbox \"8c49dec0e283da0ec237a03592b19ed71d670fb1e4d9721ac19023e5b823c562\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.837221 systemd[1]: run-netns-cni\x2dd2b5175a\x2d7a60\x2d4bb8\x2d991a\x2de76a71589471.mount: Deactivated successfully. 
Jul 7 00:39:06.839989 containerd[1574]: time="2025-07-07T00:39:06.839682041Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-z472l,Uid:370e9121-6380-494d-bb21-0a3a2886c927,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c49dec0e283da0ec237a03592b19ed71d670fb1e4d9721ac19023e5b823c562\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.840259 kubelet[2925]: E0707 00:39:06.840211 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c49dec0e283da0ec237a03592b19ed71d670fb1e4d9721ac19023e5b823c562\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.840361 kubelet[2925]: E0707 00:39:06.840348 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c49dec0e283da0ec237a03592b19ed71d670fb1e4d9721ac19023e5b823c562\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-z472l" Jul 7 00:39:06.841019 kubelet[2925]: E0707 00:39:06.840438 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c49dec0e283da0ec237a03592b19ed71d670fb1e4d9721ac19023e5b823c562\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-768f4c5c69-z472l" Jul 7 00:39:06.841019 kubelet[2925]: E0707 00:39:06.840477 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-z472l_calico-system(370e9121-6380-494d-bb21-0a3a2886c927)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-z472l_calico-system(370e9121-6380-494d-bb21-0a3a2886c927)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c49dec0e283da0ec237a03592b19ed71d670fb1e4d9721ac19023e5b823c562\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-z472l" podUID="370e9121-6380-494d-bb21-0a3a2886c927" Jul 7 00:39:06.852494 containerd[1574]: time="2025-07-07T00:39:06.852456893Z" level=error msg="Failed to destroy network for sandbox \"a99082dec4cc62589141b30384ab334691abbd7149a6dcffd22766aa8ccc6985\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.854635 containerd[1574]: time="2025-07-07T00:39:06.854570593Z" level=error msg="Failed to destroy network for sandbox \"38139d129c05a51b350486a9eb87fe640d055212c2dfa77c93f5aa1e5a2ad391\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.855591 systemd[1]: run-netns-cni\x2d184a48a0\x2df9c6\x2da9c8\x2d2802\x2df8b0969943e6.mount: Deactivated successfully. 
Jul 7 00:39:06.858302 containerd[1574]: time="2025-07-07T00:39:06.858245271Z" level=error msg="Failed to destroy network for sandbox \"e51512c6be784a0938ee5ab634cf20f53d9bca95b23f221664950bed030e9680\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.858943 containerd[1574]: time="2025-07-07T00:39:06.858910163Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c88c9c88c-8vqt6,Uid:53c1c128-9d8a-456b-a958-8f6cb766ad0b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a99082dec4cc62589141b30384ab334691abbd7149a6dcffd22766aa8ccc6985\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.859953 kubelet[2925]: E0707 00:39:06.859896 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a99082dec4cc62589141b30384ab334691abbd7149a6dcffd22766aa8ccc6985\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.860005 kubelet[2925]: E0707 00:39:06.859956 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a99082dec4cc62589141b30384ab334691abbd7149a6dcffd22766aa8ccc6985\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-c88c9c88c-8vqt6" Jul 7 00:39:06.860005 kubelet[2925]: E0707 00:39:06.859984 2925 kuberuntime_manager.go:1237] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a99082dec4cc62589141b30384ab334691abbd7149a6dcffd22766aa8ccc6985\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-c88c9c88c-8vqt6" Jul 7 00:39:06.860154 kubelet[2925]: E0707 00:39:06.860016 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-c88c9c88c-8vqt6_calico-system(53c1c128-9d8a-456b-a958-8f6cb766ad0b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-c88c9c88c-8vqt6_calico-system(53c1c128-9d8a-456b-a958-8f6cb766ad0b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a99082dec4cc62589141b30384ab334691abbd7149a6dcffd22766aa8ccc6985\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-c88c9c88c-8vqt6" podUID="53c1c128-9d8a-456b-a958-8f6cb766ad0b" Jul 7 00:39:06.861149 kubelet[2925]: E0707 00:39:06.860969 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38139d129c05a51b350486a9eb87fe640d055212c2dfa77c93f5aa1e5a2ad391\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.861287 containerd[1574]: time="2025-07-07T00:39:06.860688672Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77d57f8f97-z9fxb,Uid:14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"38139d129c05a51b350486a9eb87fe640d055212c2dfa77c93f5aa1e5a2ad391\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.861334 kubelet[2925]: E0707 00:39:06.861146 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38139d129c05a51b350486a9eb87fe640d055212c2dfa77c93f5aa1e5a2ad391\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77d57f8f97-z9fxb" Jul 7 00:39:06.861334 kubelet[2925]: E0707 00:39:06.861163 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38139d129c05a51b350486a9eb87fe640d055212c2dfa77c93f5aa1e5a2ad391\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-77d57f8f97-z9fxb" Jul 7 00:39:06.861334 kubelet[2925]: E0707 00:39:06.861196 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-77d57f8f97-z9fxb_calico-system(14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-77d57f8f97-z9fxb_calico-system(14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"38139d129c05a51b350486a9eb87fe640d055212c2dfa77c93f5aa1e5a2ad391\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-77d57f8f97-z9fxb" podUID="14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17" Jul 7 00:39:06.861557 containerd[1574]: time="2025-07-07T00:39:06.861523173Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-q45lm,Uid:41fb2441-eddf-4cab-a9b5-21c52d4bf3e5,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e51512c6be784a0938ee5ab634cf20f53d9bca95b23f221664950bed030e9680\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.861682 kubelet[2925]: E0707 00:39:06.861663 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e51512c6be784a0938ee5ab634cf20f53d9bca95b23f221664950bed030e9680\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.861818 kubelet[2925]: E0707 00:39:06.861693 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e51512c6be784a0938ee5ab634cf20f53d9bca95b23f221664950bed030e9680\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-q45lm" Jul 7 00:39:06.861849 kubelet[2925]: E0707 00:39:06.861823 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e51512c6be784a0938ee5ab634cf20f53d9bca95b23f221664950bed030e9680\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-q45lm" Jul 7 00:39:06.862073 kubelet[2925]: E0707 00:39:06.861849 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-q45lm_kube-system(41fb2441-eddf-4cab-a9b5-21c52d4bf3e5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-q45lm_kube-system(41fb2441-eddf-4cab-a9b5-21c52d4bf3e5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e51512c6be784a0938ee5ab634cf20f53d9bca95b23f221664950bed030e9680\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-q45lm" podUID="41fb2441-eddf-4cab-a9b5-21c52d4bf3e5" Jul 7 00:39:06.865324 containerd[1574]: time="2025-07-07T00:39:06.865281379Z" level=error msg="Failed to destroy network for sandbox \"59dd39b6c0285fa5912325d15ef5dfd4dd97beef87c71a8736b731b28b9a491a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.866927 containerd[1574]: time="2025-07-07T00:39:06.866846806Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-757mw,Uid:cf319c1e-3b73-474b-9ae9-b5573fcf8751,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"59dd39b6c0285fa5912325d15ef5dfd4dd97beef87c71a8736b731b28b9a491a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.867128 kubelet[2925]: E0707 00:39:06.867095 2925 log.go:32] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59dd39b6c0285fa5912325d15ef5dfd4dd97beef87c71a8736b731b28b9a491a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.867128 kubelet[2925]: E0707 00:39:06.867130 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59dd39b6c0285fa5912325d15ef5dfd4dd97beef87c71a8736b731b28b9a491a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-757mw" Jul 7 00:39:06.867217 kubelet[2925]: E0707 00:39:06.867143 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59dd39b6c0285fa5912325d15ef5dfd4dd97beef87c71a8736b731b28b9a491a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-757mw" Jul 7 00:39:06.868110 kubelet[2925]: E0707 00:39:06.868078 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-757mw_kube-system(cf319c1e-3b73-474b-9ae9-b5573fcf8751)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-757mw_kube-system(cf319c1e-3b73-474b-9ae9-b5573fcf8751)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"59dd39b6c0285fa5912325d15ef5dfd4dd97beef87c71a8736b731b28b9a491a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-757mw" podUID="cf319c1e-3b73-474b-9ae9-b5573fcf8751" Jul 7 00:39:06.869401 containerd[1574]: time="2025-07-07T00:39:06.869351962Z" level=error msg="Failed to destroy network for sandbox \"335be06532ce7f541873bfb793dc1d47ad845fec8dddec9c2a85d894754f0add\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.870266 containerd[1574]: time="2025-07-07T00:39:06.870227361Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-c64ch,Uid:d64b203e-aa4d-4795-9e34-4cec7d892738,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"335be06532ce7f541873bfb793dc1d47ad845fec8dddec9c2a85d894754f0add\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.870365 kubelet[2925]: E0707 00:39:06.870336 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"335be06532ce7f541873bfb793dc1d47ad845fec8dddec9c2a85d894754f0add\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:06.870365 kubelet[2925]: E0707 00:39:06.870362 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"335be06532ce7f541873bfb793dc1d47ad845fec8dddec9c2a85d894754f0add\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-5c47bcf6f-c64ch" Jul 7 00:39:06.870443 kubelet[2925]: E0707 00:39:06.870383 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"335be06532ce7f541873bfb793dc1d47ad845fec8dddec9c2a85d894754f0add\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c47bcf6f-c64ch" Jul 7 00:39:06.870443 kubelet[2925]: E0707 00:39:06.870425 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c47bcf6f-c64ch_calico-apiserver(d64b203e-aa4d-4795-9e34-4cec7d892738)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c47bcf6f-c64ch_calico-apiserver(d64b203e-aa4d-4795-9e34-4cec7d892738)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"335be06532ce7f541873bfb793dc1d47ad845fec8dddec9c2a85d894754f0add\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c47bcf6f-c64ch" podUID="d64b203e-aa4d-4795-9e34-4cec7d892738" Jul 7 00:39:07.042633 systemd[1]: Created slice kubepods-besteffort-podeb44bbbc_039e_4742_9b7b_5f5bfbef4e00.slice - libcontainer container kubepods-besteffort-podeb44bbbc_039e_4742_9b7b_5f5bfbef4e00.slice. 
Jul 7 00:39:07.048249 containerd[1574]: time="2025-07-07T00:39:07.048155906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sw6gt,Uid:eb44bbbc-039e-4742-9b7b-5f5bfbef4e00,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:07.106682 containerd[1574]: time="2025-07-07T00:39:07.106566021Z" level=error msg="Failed to destroy network for sandbox \"df0516f749cef65319940e699b65a3d411d5003a0e4661575f8617003705cc5a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:07.107868 containerd[1574]: time="2025-07-07T00:39:07.107818238Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sw6gt,Uid:eb44bbbc-039e-4742-9b7b-5f5bfbef4e00,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"df0516f749cef65319940e699b65a3d411d5003a0e4661575f8617003705cc5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:07.108083 kubelet[2925]: E0707 00:39:07.108027 2925 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df0516f749cef65319940e699b65a3d411d5003a0e4661575f8617003705cc5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 7 00:39:07.108083 kubelet[2925]: E0707 00:39:07.108078 2925 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df0516f749cef65319940e699b65a3d411d5003a0e4661575f8617003705cc5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sw6gt" Jul 7 00:39:07.108180 kubelet[2925]: E0707 00:39:07.108099 2925 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df0516f749cef65319940e699b65a3d411d5003a0e4661575f8617003705cc5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-sw6gt" Jul 7 00:39:07.108180 kubelet[2925]: E0707 00:39:07.108137 2925 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-sw6gt_calico-system(eb44bbbc-039e-4742-9b7b-5f5bfbef4e00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-sw6gt_calico-system(eb44bbbc-039e-4742-9b7b-5f5bfbef4e00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"df0516f749cef65319940e699b65a3d411d5003a0e4661575f8617003705cc5a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-sw6gt" podUID="eb44bbbc-039e-4742-9b7b-5f5bfbef4e00" Jul 7 00:39:07.157772 containerd[1574]: time="2025-07-07T00:39:07.156843205Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 7 00:39:07.693670 systemd[1]: run-netns-cni\x2ded463c68\x2d3977\x2dcea0\x2d01ef\x2dc5194c435970.mount: Deactivated successfully. Jul 7 00:39:07.693804 systemd[1]: run-netns-cni\x2d09a42c8e\x2dee94\x2dd433\x2d4b85\x2d9f370acf15bc.mount: Deactivated successfully. Jul 7 00:39:07.693864 systemd[1]: run-netns-cni\x2d0b0c7920\x2df19a\x2d4ab4\x2d0fa8\x2dca0d82ebb3aa.mount: Deactivated successfully. 
Jul 7 00:39:07.693915 systemd[1]: run-netns-cni\x2df6471f15\x2df02e\x2d146e\x2d33d0\x2d77b08b8c09fd.mount: Deactivated successfully. Jul 7 00:39:11.183748 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1670956088.mount: Deactivated successfully. Jul 7 00:39:11.317064 containerd[1574]: time="2025-07-07T00:39:11.271430303Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 7 00:39:11.346948 containerd[1574]: time="2025-07-07T00:39:11.346911525Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:11.389778 containerd[1574]: time="2025-07-07T00:39:11.389723531Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:11.390804 containerd[1574]: time="2025-07-07T00:39:11.390775621Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 4.233868766s" Jul 7 00:39:11.390890 containerd[1574]: time="2025-07-07T00:39:11.390877973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 7 00:39:11.391211 containerd[1574]: time="2025-07-07T00:39:11.391193927Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:11.410487 containerd[1574]: time="2025-07-07T00:39:11.410459808Z" level=info msg="CreateContainer 
within sandbox \"986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 7 00:39:11.460656 containerd[1574]: time="2025-07-07T00:39:11.460365704Z" level=info msg="Container 993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:11.462427 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount959018927.mount: Deactivated successfully. Jul 7 00:39:11.501926 containerd[1574]: time="2025-07-07T00:39:11.501877374Z" level=info msg="CreateContainer within sandbox \"986aaf732d165e33f27f4e6501f03636363433147bb161b6c0e0821a72693f03\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\"" Jul 7 00:39:11.502532 containerd[1574]: time="2025-07-07T00:39:11.502477914Z" level=info msg="StartContainer for \"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\"" Jul 7 00:39:11.507066 containerd[1574]: time="2025-07-07T00:39:11.506999864Z" level=info msg="connecting to shim 993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e" address="unix:///run/containerd/s/ffeb1b9361bda4756c9c5fd4703a203e36a90a31073855e0ab84f5e79e656f0f" protocol=ttrpc version=3 Jul 7 00:39:11.567952 systemd[1]: Started cri-containerd-993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e.scope - libcontainer container 993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e. Jul 7 00:39:11.621994 containerd[1574]: time="2025-07-07T00:39:11.621945792Z" level=info msg="StartContainer for \"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" returns successfully" Jul 7 00:39:11.705193 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 7 00:39:11.706098 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jul 7 00:39:11.940617 kubelet[2925]: I0707 00:39:11.940388 2925 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4n9m\" (UniqueName: \"kubernetes.io/projected/53c1c128-9d8a-456b-a958-8f6cb766ad0b-kube-api-access-d4n9m\") pod \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\" (UID: \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\") " Jul 7 00:39:11.940952 kubelet[2925]: I0707 00:39:11.940660 2925 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-backend-key-pair\") pod \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\" (UID: \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\") " Jul 7 00:39:11.941105 kubelet[2925]: I0707 00:39:11.940691 2925 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-ca-bundle\") pod \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\" (UID: \"53c1c128-9d8a-456b-a958-8f6cb766ad0b\") " Jul 7 00:39:11.942138 kubelet[2925]: I0707 00:39:11.942083 2925 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "53c1c128-9d8a-456b-a958-8f6cb766ad0b" (UID: "53c1c128-9d8a-456b-a958-8f6cb766ad0b"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jul 7 00:39:11.944139 kubelet[2925]: I0707 00:39:11.944103 2925 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "53c1c128-9d8a-456b-a958-8f6cb766ad0b" (UID: "53c1c128-9d8a-456b-a958-8f6cb766ad0b"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 7 00:39:11.946179 kubelet[2925]: I0707 00:39:11.946145 2925 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c1c128-9d8a-456b-a958-8f6cb766ad0b-kube-api-access-d4n9m" (OuterVolumeSpecName: "kube-api-access-d4n9m") pod "53c1c128-9d8a-456b-a958-8f6cb766ad0b" (UID: "53c1c128-9d8a-456b-a958-8f6cb766ad0b"). InnerVolumeSpecName "kube-api-access-d4n9m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 7 00:39:12.040357 systemd[1]: Removed slice kubepods-besteffort-pod53c1c128_9d8a_456b_a958_8f6cb766ad0b.slice - libcontainer container kubepods-besteffort-pod53c1c128_9d8a_456b_a958_8f6cb766ad0b.slice. Jul 7 00:39:12.041939 kubelet[2925]: I0707 00:39:12.041818 2925 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d4n9m\" (UniqueName: \"kubernetes.io/projected/53c1c128-9d8a-456b-a958-8f6cb766ad0b-kube-api-access-d4n9m\") on node \"ci-4344-1-1-6-69f6cda1f4\" DevicePath \"\"" Jul 7 00:39:12.041939 kubelet[2925]: I0707 00:39:12.041839 2925 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-backend-key-pair\") on node \"ci-4344-1-1-6-69f6cda1f4\" DevicePath \"\"" Jul 7 00:39:12.041939 kubelet[2925]: I0707 00:39:12.041849 2925 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53c1c128-9d8a-456b-a958-8f6cb766ad0b-whisker-ca-bundle\") on node \"ci-4344-1-1-6-69f6cda1f4\" DevicePath \"\"" Jul 7 00:39:12.191402 systemd[1]: var-lib-kubelet-pods-53c1c128\x2d9d8a\x2d456b\x2da958\x2d8f6cb766ad0b-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dd4n9m.mount: Deactivated successfully. 
Jul 7 00:39:12.191853 systemd[1]: var-lib-kubelet-pods-53c1c128\x2d9d8a\x2d456b\x2da958\x2d8f6cb766ad0b-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 7 00:39:12.218854 kubelet[2925]: I0707 00:39:12.218625 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-6zljk" podStartSLOduration=1.980993258 podStartE2EDuration="14.218595865s" podCreationTimestamp="2025-07-07 00:38:58 +0000 UTC" firstStartedPulling="2025-07-07 00:38:59.158637539 +0000 UTC m=+19.219766379" lastFinishedPulling="2025-07-07 00:39:11.396240146 +0000 UTC m=+31.457368986" observedRunningTime="2025-07-07 00:39:12.213524852 +0000 UTC m=+32.274653692" watchObservedRunningTime="2025-07-07 00:39:12.218595865 +0000 UTC m=+32.279724715" Jul 7 00:39:12.343602 systemd[1]: Created slice kubepods-besteffort-pod0e7e8dd9_8e65_4fa2_a2d8_a3e0dacb904b.slice - libcontainer container kubepods-besteffort-pod0e7e8dd9_8e65_4fa2_a2d8_a3e0dacb904b.slice. 
Jul 7 00:39:12.345047 kubelet[2925]: I0707 00:39:12.344851 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b-whisker-ca-bundle\") pod \"whisker-6db9c5b85c-sbcnx\" (UID: \"0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b\") " pod="calico-system/whisker-6db9c5b85c-sbcnx" Jul 7 00:39:12.345047 kubelet[2925]: I0707 00:39:12.344936 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lv5c\" (UniqueName: \"kubernetes.io/projected/0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b-kube-api-access-2lv5c\") pod \"whisker-6db9c5b85c-sbcnx\" (UID: \"0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b\") " pod="calico-system/whisker-6db9c5b85c-sbcnx" Jul 7 00:39:12.345047 kubelet[2925]: I0707 00:39:12.344953 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b-whisker-backend-key-pair\") pod \"whisker-6db9c5b85c-sbcnx\" (UID: \"0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b\") " pod="calico-system/whisker-6db9c5b85c-sbcnx" Jul 7 00:39:12.431897 containerd[1574]: time="2025-07-07T00:39:12.431833717Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"6e87238e5f8dff846f27d7a4af6e69b1c8d5d30bc5d71e0fbd8871df23439d40\" pid:4086 exit_status:1 exited_at:{seconds:1751848752 nanos:400556034}" Jul 7 00:39:12.649185 containerd[1574]: time="2025-07-07T00:39:12.649120090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6db9c5b85c-sbcnx,Uid:0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:12.930546 systemd-networkd[1487]: cali85474ee6287: Link UP Jul 7 00:39:12.933640 systemd-networkd[1487]: cali85474ee6287: Gained carrier Jul 7 
00:39:12.951406 containerd[1574]: 2025-07-07 00:39:12.673 [INFO][4101] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:39:12.951406 containerd[1574]: 2025-07-07 00:39:12.700 [INFO][4101] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0 whisker-6db9c5b85c- calico-system 0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b 869 0 2025-07-07 00:39:12 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6db9c5b85c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 whisker-6db9c5b85c-sbcnx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali85474ee6287 [] [] }} ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-" Jul 7 00:39:12.951406 containerd[1574]: 2025-07-07 00:39:12.700 [INFO][4101] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" Jul 7 00:39:12.951406 containerd[1574]: 2025-07-07 00:39:12.865 [INFO][4112] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" HandleID="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.871 [INFO][4112] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" HandleID="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00049e170), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"whisker-6db9c5b85c-sbcnx", "timestamp":"2025-07-07 00:39:12.865878378 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.871 [INFO][4112] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.872 [INFO][4112] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.872 [INFO][4112] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.885 [INFO][4112] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.895 [INFO][4112] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.900 [INFO][4112] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.902 [INFO][4112] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.951596 containerd[1574]: 2025-07-07 00:39:12.904 [INFO][4112] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.952176 containerd[1574]: 2025-07-07 00:39:12.904 [INFO][4112] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.952176 containerd[1574]: 2025-07-07 00:39:12.906 [INFO][4112] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46 Jul 7 00:39:12.952176 containerd[1574]: 2025-07-07 00:39:12.910 [INFO][4112] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.952176 containerd[1574]: 2025-07-07 00:39:12.917 [INFO][4112] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.117.129/26] block=192.168.117.128/26 handle="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.952176 containerd[1574]: 2025-07-07 00:39:12.918 [INFO][4112] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.129/26] handle="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:12.952176 containerd[1574]: 2025-07-07 00:39:12.918 [INFO][4112] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:12.952176 containerd[1574]: 2025-07-07 00:39:12.918 [INFO][4112] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.129/26] IPv6=[] ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" HandleID="k8s-pod-network.da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" Jul 7 00:39:12.952313 containerd[1574]: 2025-07-07 00:39:12.920 [INFO][4101] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0", GenerateName:"whisker-6db9c5b85c-", Namespace:"calico-system", SelfLink:"", UID:"0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 39, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6db9c5b85c", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"whisker-6db9c5b85c-sbcnx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.117.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali85474ee6287", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:12.952313 containerd[1574]: 2025-07-07 00:39:12.921 [INFO][4101] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.129/32] ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" Jul 7 00:39:12.952396 containerd[1574]: 2025-07-07 00:39:12.921 [INFO][4101] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali85474ee6287 ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" Jul 7 00:39:12.952396 containerd[1574]: 2025-07-07 00:39:12.934 [INFO][4101] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" Jul 7 00:39:12.952452 containerd[1574]: 2025-07-07 00:39:12.935 [INFO][4101] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0", GenerateName:"whisker-6db9c5b85c-", Namespace:"calico-system", SelfLink:"", UID:"0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b", ResourceVersion:"869", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 39, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6db9c5b85c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46", Pod:"whisker-6db9c5b85c-sbcnx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.117.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali85474ee6287", MAC:"42:42:c5:8a:2d:fc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:12.952506 containerd[1574]: 2025-07-07 00:39:12.949 [INFO][4101] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" Namespace="calico-system" Pod="whisker-6db9c5b85c-sbcnx" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-whisker--6db9c5b85c--sbcnx-eth0" Jul 7 00:39:13.037316 containerd[1574]: time="2025-07-07T00:39:13.037248332Z" level=info msg="connecting to shim da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46" address="unix:///run/containerd/s/676d59777f2e012a1d05029595867ab033bb72beb6eab7f601ea8176e6f05951" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:13.058916 systemd[1]: Started cri-containerd-da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46.scope - libcontainer container da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46. Jul 7 00:39:13.162685 containerd[1574]: time="2025-07-07T00:39:13.162625961Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6db9c5b85c-sbcnx,Uid:0e7e8dd9-8e65-4fa2-a2d8-a3e0dacb904b,Namespace:calico-system,Attempt:0,} returns sandbox id \"da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46\"" Jul 7 00:39:13.166190 containerd[1574]: time="2025-07-07T00:39:13.166172244Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 7 00:39:13.391785 containerd[1574]: time="2025-07-07T00:39:13.391747416Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"4ec96eda630c0e5df89b7914fbf5bc22ff89f400aebbb4d067bcd88dd37670de\" pid:4268 exit_status:1 exited_at:{seconds:1751848753 nanos:388192758}" Jul 7 00:39:14.038605 kubelet[2925]: I0707 00:39:14.038543 2925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c1c128-9d8a-456b-a958-8f6cb766ad0b" path="/var/lib/kubelet/pods/53c1c128-9d8a-456b-a958-8f6cb766ad0b/volumes" Jul 7 00:39:14.489055 systemd-networkd[1487]: cali85474ee6287: Gained IPv6LL Jul 7 00:39:14.831145 containerd[1574]: time="2025-07-07T00:39:14.830950944Z" 
level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:14.832470 containerd[1574]: time="2025-07-07T00:39:14.832359995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 7 00:39:14.836390 containerd[1574]: time="2025-07-07T00:39:14.833435038Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:14.836390 containerd[1574]: time="2025-07-07T00:39:14.836094262Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:14.836922 containerd[1574]: time="2025-07-07T00:39:14.836744846Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.670547585s" Jul 7 00:39:14.836922 containerd[1574]: time="2025-07-07T00:39:14.836769854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 7 00:39:14.839541 containerd[1574]: time="2025-07-07T00:39:14.839517665Z" level=info msg="CreateContainer within sandbox \"da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 7 00:39:14.845713 containerd[1574]: time="2025-07-07T00:39:14.844335630Z" level=info msg="Container 59b33975936d5701b9e45bd34c5e8c5aea387cd1b87c2b6ca39e517395a1818d: CDI 
devices from CRI Config.CDIDevices: []" Jul 7 00:39:14.864117 containerd[1574]: time="2025-07-07T00:39:14.864045108Z" level=info msg="CreateContainer within sandbox \"da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"59b33975936d5701b9e45bd34c5e8c5aea387cd1b87c2b6ca39e517395a1818d\"" Jul 7 00:39:14.864570 containerd[1574]: time="2025-07-07T00:39:14.864542093Z" level=info msg="StartContainer for \"59b33975936d5701b9e45bd34c5e8c5aea387cd1b87c2b6ca39e517395a1818d\"" Jul 7 00:39:14.865558 containerd[1574]: time="2025-07-07T00:39:14.865534239Z" level=info msg="connecting to shim 59b33975936d5701b9e45bd34c5e8c5aea387cd1b87c2b6ca39e517395a1818d" address="unix:///run/containerd/s/676d59777f2e012a1d05029595867ab033bb72beb6eab7f601ea8176e6f05951" protocol=ttrpc version=3 Jul 7 00:39:14.882819 systemd[1]: Started cri-containerd-59b33975936d5701b9e45bd34c5e8c5aea387cd1b87c2b6ca39e517395a1818d.scope - libcontainer container 59b33975936d5701b9e45bd34c5e8c5aea387cd1b87c2b6ca39e517395a1818d. Jul 7 00:39:14.918589 containerd[1574]: time="2025-07-07T00:39:14.918544188Z" level=info msg="StartContainer for \"59b33975936d5701b9e45bd34c5e8c5aea387cd1b87c2b6ca39e517395a1818d\" returns successfully" Jul 7 00:39:14.920051 containerd[1574]: time="2025-07-07T00:39:14.919981844Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 7 00:39:16.933092 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3398769963.mount: Deactivated successfully. 
Jul 7 00:39:16.945513 containerd[1574]: time="2025-07-07T00:39:16.945465236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:16.946370 containerd[1574]: time="2025-07-07T00:39:16.946329302Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 7 00:39:16.947213 containerd[1574]: time="2025-07-07T00:39:16.947176507Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:16.948809 containerd[1574]: time="2025-07-07T00:39:16.948790533Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:16.949621 containerd[1574]: time="2025-07-07T00:39:16.949243376Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.02923918s" Jul 7 00:39:16.949621 containerd[1574]: time="2025-07-07T00:39:16.949277991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 7 00:39:16.951840 containerd[1574]: time="2025-07-07T00:39:16.951806899Z" level=info msg="CreateContainer within sandbox \"da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 7 00:39:16.958574 
containerd[1574]: time="2025-07-07T00:39:16.958543362Z" level=info msg="Container b9fb0b43b00cc726fc40fad3d173262e6c84b22e6c14f5f19bac2474a48aa0ef: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:16.971838 containerd[1574]: time="2025-07-07T00:39:16.971801288Z" level=info msg="CreateContainer within sandbox \"da189e794de88c2722add774b5932098a7ee9ec49c1b6f8513d77ba129f66f46\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"b9fb0b43b00cc726fc40fad3d173262e6c84b22e6c14f5f19bac2474a48aa0ef\"" Jul 7 00:39:16.972421 containerd[1574]: time="2025-07-07T00:39:16.972354508Z" level=info msg="StartContainer for \"b9fb0b43b00cc726fc40fad3d173262e6c84b22e6c14f5f19bac2474a48aa0ef\"" Jul 7 00:39:16.973493 containerd[1574]: time="2025-07-07T00:39:16.973448807Z" level=info msg="connecting to shim b9fb0b43b00cc726fc40fad3d173262e6c84b22e6c14f5f19bac2474a48aa0ef" address="unix:///run/containerd/s/676d59777f2e012a1d05029595867ab033bb72beb6eab7f601ea8176e6f05951" protocol=ttrpc version=3 Jul 7 00:39:16.995811 systemd[1]: Started cri-containerd-b9fb0b43b00cc726fc40fad3d173262e6c84b22e6c14f5f19bac2474a48aa0ef.scope - libcontainer container b9fb0b43b00cc726fc40fad3d173262e6c84b22e6c14f5f19bac2474a48aa0ef. 
Jul 7 00:39:17.041927 containerd[1574]: time="2025-07-07T00:39:17.041875717Z" level=info msg="StartContainer for \"b9fb0b43b00cc726fc40fad3d173262e6c84b22e6c14f5f19bac2474a48aa0ef\" returns successfully" Jul 7 00:39:17.219415 kubelet[2925]: I0707 00:39:17.219166 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6db9c5b85c-sbcnx" podStartSLOduration=1.434403719 podStartE2EDuration="5.21902624s" podCreationTimestamp="2025-07-07 00:39:12 +0000 UTC" firstStartedPulling="2025-07-07 00:39:13.165608664 +0000 UTC m=+33.226737504" lastFinishedPulling="2025-07-07 00:39:16.950231184 +0000 UTC m=+37.011360025" observedRunningTime="2025-07-07 00:39:17.215876074 +0000 UTC m=+37.277004924" watchObservedRunningTime="2025-07-07 00:39:17.21902624 +0000 UTC m=+37.280155080" Jul 7 00:39:17.637061 kubelet[2925]: I0707 00:39:17.636823 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:18.036348 containerd[1574]: time="2025-07-07T00:39:18.036208262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77d57f8f97-z9fxb,Uid:14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:18.131563 systemd-networkd[1487]: cali44cf8ae0ce1: Link UP Jul 7 00:39:18.133096 systemd-networkd[1487]: cali44cf8ae0ce1: Gained carrier Jul 7 00:39:18.142830 containerd[1574]: 2025-07-07 00:39:18.061 [INFO][4471] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 7 00:39:18.142830 containerd[1574]: 2025-07-07 00:39:18.073 [INFO][4471] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0 calico-kube-controllers-77d57f8f97- calico-system 14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17 802 0 2025-07-07 00:38:59 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers 
pod-template-hash:77d57f8f97 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 calico-kube-controllers-77d57f8f97-z9fxb eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali44cf8ae0ce1 [] [] }} ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" Pod="calico-kube-controllers-77d57f8f97-z9fxb" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-" Jul 7 00:39:18.142830 containerd[1574]: 2025-07-07 00:39:18.073 [INFO][4471] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" Pod="calico-kube-controllers-77d57f8f97-z9fxb" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" Jul 7 00:39:18.142830 containerd[1574]: 2025-07-07 00:39:18.099 [INFO][4480] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" HandleID="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.100 [INFO][4480] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" HandleID="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-6-69f6cda1f4", 
"pod":"calico-kube-controllers-77d57f8f97-z9fxb", "timestamp":"2025-07-07 00:39:18.09989914 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.100 [INFO][4480] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.100 [INFO][4480] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.100 [INFO][4480] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.105 [INFO][4480] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.109 [INFO][4480] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.115 [INFO][4480] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.117 [INFO][4480] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.142996 containerd[1574]: 2025-07-07 00:39:18.118 [INFO][4480] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.143472 containerd[1574]: 2025-07-07 00:39:18.118 [INFO][4480] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" 
host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.143472 containerd[1574]: 2025-07-07 00:39:18.119 [INFO][4480] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee Jul 7 00:39:18.143472 containerd[1574]: 2025-07-07 00:39:18.122 [INFO][4480] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.143472 containerd[1574]: 2025-07-07 00:39:18.126 [INFO][4480] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.117.130/26] block=192.168.117.128/26 handle="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.143472 containerd[1574]: 2025-07-07 00:39:18.126 [INFO][4480] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.130/26] handle="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:18.143472 containerd[1574]: 2025-07-07 00:39:18.126 [INFO][4480] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 7 00:39:18.143472 containerd[1574]: 2025-07-07 00:39:18.126 [INFO][4480] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.130/26] IPv6=[] ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" HandleID="k8s-pod-network.ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" Jul 7 00:39:18.143601 containerd[1574]: 2025-07-07 00:39:18.129 [INFO][4471] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" Pod="calico-kube-controllers-77d57f8f97-z9fxb" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0", GenerateName:"calico-kube-controllers-77d57f8f97-", Namespace:"calico-system", SelfLink:"", UID:"14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77d57f8f97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"calico-kube-controllers-77d57f8f97-z9fxb", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.117.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali44cf8ae0ce1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:18.143658 containerd[1574]: 2025-07-07 00:39:18.129 [INFO][4471] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.130/32] ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" Pod="calico-kube-controllers-77d57f8f97-z9fxb" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" Jul 7 00:39:18.143658 containerd[1574]: 2025-07-07 00:39:18.129 [INFO][4471] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali44cf8ae0ce1 ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" Pod="calico-kube-controllers-77d57f8f97-z9fxb" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" Jul 7 00:39:18.143658 containerd[1574]: 2025-07-07 00:39:18.131 [INFO][4471] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" Pod="calico-kube-controllers-77d57f8f97-z9fxb" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" Jul 7 00:39:18.143724 containerd[1574]: 2025-07-07 00:39:18.132 [INFO][4471] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" Pod="calico-kube-controllers-77d57f8f97-z9fxb" 
WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0", GenerateName:"calico-kube-controllers-77d57f8f97-", Namespace:"calico-system", SelfLink:"", UID:"14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"77d57f8f97", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee", Pod:"calico-kube-controllers-77d57f8f97-z9fxb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.117.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali44cf8ae0ce1", MAC:"b2:4f:01:e1:4e:e0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:18.143777 containerd[1574]: 2025-07-07 00:39:18.139 [INFO][4471] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" Namespace="calico-system" 
Pod="calico-kube-controllers-77d57f8f97-z9fxb" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--kube--controllers--77d57f8f97--z9fxb-eth0" Jul 7 00:39:18.170135 containerd[1574]: time="2025-07-07T00:39:18.170088023Z" level=info msg="connecting to shim ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee" address="unix:///run/containerd/s/aa488c55d55c92480ff2abfe06f0bf3e161c0f906b591d3d0ecc5892f7f0c7b2" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:18.191816 systemd[1]: Started cri-containerd-ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee.scope - libcontainer container ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee. Jul 7 00:39:18.231878 containerd[1574]: time="2025-07-07T00:39:18.231805501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-77d57f8f97-z9fxb,Uid:14fe5431-92e4-4f3e-ac2a-1bbd02ba7a17,Namespace:calico-system,Attempt:0,} returns sandbox id \"ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee\"" Jul 7 00:39:18.234225 containerd[1574]: time="2025-07-07T00:39:18.234199805Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 7 00:39:18.875621 systemd-networkd[1487]: vxlan.calico: Link UP Jul 7 00:39:18.875628 systemd-networkd[1487]: vxlan.calico: Gained carrier Jul 7 00:39:19.035113 containerd[1574]: time="2025-07-07T00:39:19.035020590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-2m6k2,Uid:c58790dd-630c-4e7b-9398-a00ca059225e,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:39:19.035575 containerd[1574]: time="2025-07-07T00:39:19.034692182Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sw6gt,Uid:eb44bbbc-039e-4742-9b7b-5f5bfbef4e00,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:19.035881 containerd[1574]: time="2025-07-07T00:39:19.035847706Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-768f4c5c69-z472l,Uid:370e9121-6380-494d-bb21-0a3a2886c927,Namespace:calico-system,Attempt:0,}" Jul 7 00:39:19.241305 systemd-networkd[1487]: calie7d1b07cf61: Link UP Jul 7 00:39:19.241456 systemd-networkd[1487]: calie7d1b07cf61: Gained carrier Jul 7 00:39:19.255520 containerd[1574]: 2025-07-07 00:39:19.118 [INFO][4600] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0 csi-node-driver- calico-system eb44bbbc-039e-4742-9b7b-5f5bfbef4e00 701 0 2025-07-07 00:38:59 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 csi-node-driver-sw6gt eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calie7d1b07cf61 [] [] }} ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-" Jul 7 00:39:19.255520 containerd[1574]: 2025-07-07 00:39:19.118 [INFO][4600] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" Jul 7 00:39:19.255520 containerd[1574]: 2025-07-07 00:39:19.180 [INFO][4649] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" HandleID="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" 
Workload="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.180 [INFO][4649] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" HandleID="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d56e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"csi-node-driver-sw6gt", "timestamp":"2025-07-07 00:39:19.179252032 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.181 [INFO][4649] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.181 [INFO][4649] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.181 [INFO][4649] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.201 [INFO][4649] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.207 [INFO][4649] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.213 [INFO][4649] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.214 [INFO][4649] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.256296 containerd[1574]: 2025-07-07 00:39:19.216 [INFO][4649] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.257198 containerd[1574]: 2025-07-07 00:39:19.216 [INFO][4649] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.257198 containerd[1574]: 2025-07-07 00:39:19.217 [INFO][4649] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f Jul 7 00:39:19.257198 containerd[1574]: 2025-07-07 00:39:19.222 [INFO][4649] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.257198 containerd[1574]: 2025-07-07 00:39:19.229 [INFO][4649] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.117.131/26] block=192.168.117.128/26 handle="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.257198 containerd[1574]: 2025-07-07 00:39:19.230 [INFO][4649] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.131/26] handle="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.257198 containerd[1574]: 2025-07-07 00:39:19.230 [INFO][4649] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:19.257198 containerd[1574]: 2025-07-07 00:39:19.231 [INFO][4649] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.131/26] IPv6=[] ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" HandleID="k8s-pod-network.f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" Jul 7 00:39:19.257899 containerd[1574]: 2025-07-07 00:39:19.236 [INFO][4600] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eb44bbbc-039e-4742-9b7b-5f5bfbef4e00", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"csi-node-driver-sw6gt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.117.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie7d1b07cf61", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:19.258306 containerd[1574]: 2025-07-07 00:39:19.236 [INFO][4600] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.131/32] ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" Jul 7 00:39:19.258306 containerd[1574]: 2025-07-07 00:39:19.237 [INFO][4600] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie7d1b07cf61 ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" Jul 7 00:39:19.258306 containerd[1574]: 2025-07-07 00:39:19.241 [INFO][4600] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" Jul 7 00:39:19.258366 
containerd[1574]: 2025-07-07 00:39:19.241 [INFO][4600] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eb44bbbc-039e-4742-9b7b-5f5bfbef4e00", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f", Pod:"csi-node-driver-sw6gt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.117.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calie7d1b07cf61", MAC:"a6:18:e2:f4:ab:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:19.258415 containerd[1574]: 
2025-07-07 00:39:19.251 [INFO][4600] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" Namespace="calico-system" Pod="csi-node-driver-sw6gt" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-csi--node--driver--sw6gt-eth0" Jul 7 00:39:19.287592 containerd[1574]: time="2025-07-07T00:39:19.286999601Z" level=info msg="connecting to shim f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f" address="unix:///run/containerd/s/9fa82d8c44e7c11f8470837af1a511741618cd229fa37dd07353f450016871a3" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:19.329825 systemd[1]: Started cri-containerd-f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f.scope - libcontainer container f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f. Jul 7 00:39:19.340590 systemd-networkd[1487]: cali55d1d815954: Link UP Jul 7 00:39:19.340756 systemd-networkd[1487]: cali55d1d815954: Gained carrier Jul 7 00:39:19.362007 containerd[1574]: 2025-07-07 00:39:19.137 [INFO][4605] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0 goldmane-768f4c5c69- calico-system 370e9121-6380-494d-bb21-0a3a2886c927 800 0 2025-07-07 00:38:58 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 goldmane-768f4c5c69-z472l eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali55d1d815954 [] [] }} ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-" Jul 7 00:39:19.362007 containerd[1574]: 2025-07-07 
00:39:19.139 [INFO][4605] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" Jul 7 00:39:19.362007 containerd[1574]: 2025-07-07 00:39:19.192 [INFO][4660] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" HandleID="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.192 [INFO][4660] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" HandleID="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4f20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"goldmane-768f4c5c69-z472l", "timestamp":"2025-07-07 00:39:19.192537252 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.192 [INFO][4660] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.231 [INFO][4660] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.231 [INFO][4660] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.301 [INFO][4660] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.308 [INFO][4660] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.314 [INFO][4660] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.316 [INFO][4660] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.362802 containerd[1574]: 2025-07-07 00:39:19.318 [INFO][4660] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.363894 containerd[1574]: 2025-07-07 00:39:19.318 [INFO][4660] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.363894 containerd[1574]: 2025-07-07 00:39:19.320 [INFO][4660] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66 Jul 7 00:39:19.363894 containerd[1574]: 2025-07-07 00:39:19.323 [INFO][4660] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.363894 containerd[1574]: 2025-07-07 00:39:19.332 [INFO][4660] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.117.132/26] block=192.168.117.128/26 handle="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.363894 containerd[1574]: 2025-07-07 00:39:19.332 [INFO][4660] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.132/26] handle="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.363894 containerd[1574]: 2025-07-07 00:39:19.332 [INFO][4660] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:19.363894 containerd[1574]: 2025-07-07 00:39:19.332 [INFO][4660] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.132/26] IPv6=[] ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" HandleID="k8s-pod-network.6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" Jul 7 00:39:19.365001 containerd[1574]: 2025-07-07 00:39:19.335 [INFO][4605] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"370e9121-6380-494d-bb21-0a3a2886c927", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"goldmane-768f4c5c69-z472l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.117.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali55d1d815954", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:19.365062 containerd[1574]: 2025-07-07 00:39:19.336 [INFO][4605] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.132/32] ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" Jul 7 00:39:19.365062 containerd[1574]: 2025-07-07 00:39:19.336 [INFO][4605] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali55d1d815954 ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" Jul 7 00:39:19.365062 containerd[1574]: 2025-07-07 00:39:19.339 [INFO][4605] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" Jul 7 00:39:19.365117 containerd[1574]: 2025-07-07 00:39:19.342 [INFO][4605] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"370e9121-6380-494d-bb21-0a3a2886c927", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66", Pod:"goldmane-768f4c5c69-z472l", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.117.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali55d1d815954", MAC:"9a:d5:76:26:2b:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:19.365160 containerd[1574]: 2025-07-07 00:39:19.355 [INFO][4605] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" Namespace="calico-system" Pod="goldmane-768f4c5c69-z472l" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-goldmane--768f4c5c69--z472l-eth0" Jul 7 00:39:19.376039 containerd[1574]: time="2025-07-07T00:39:19.375959849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-sw6gt,Uid:eb44bbbc-039e-4742-9b7b-5f5bfbef4e00,Namespace:calico-system,Attempt:0,} returns sandbox id \"f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f\"" Jul 7 00:39:19.400724 containerd[1574]: time="2025-07-07T00:39:19.400295097Z" level=info msg="connecting to shim 6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66" address="unix:///run/containerd/s/5e7aa7e9dd097417631e4514c082fd781ecb75085a0b1718a3a80a65a6b07a1a" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:19.425822 systemd[1]: Started cri-containerd-6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66.scope - libcontainer container 6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66. 
Jul 7 00:39:19.449742 systemd-networkd[1487]: cali9bd30d67777: Link UP Jul 7 00:39:19.450957 systemd-networkd[1487]: cali9bd30d67777: Gained carrier Jul 7 00:39:19.467373 containerd[1574]: 2025-07-07 00:39:19.131 [INFO][4595] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0 calico-apiserver-5c47bcf6f- calico-apiserver c58790dd-630c-4e7b-9398-a00ca059225e 801 0 2025-07-07 00:38:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c47bcf6f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 calico-apiserver-5c47bcf6f-2m6k2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9bd30d67777 [] [] }} ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-" Jul 7 00:39:19.467373 containerd[1574]: 2025-07-07 00:39:19.132 [INFO][4595] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:39:19.467373 containerd[1574]: 2025-07-07 00:39:19.200 [INFO][4654] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:39:19.467534 
containerd[1574]: 2025-07-07 00:39:19.201 [INFO][4654] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cd6c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"calico-apiserver-5c47bcf6f-2m6k2", "timestamp":"2025-07-07 00:39:19.199803802 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.201 [INFO][4654] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.332 [INFO][4654] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.332 [INFO][4654] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.402 [INFO][4654] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.407 [INFO][4654] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.414 [INFO][4654] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.417 [INFO][4654] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.467534 containerd[1574]: 2025-07-07 00:39:19.420 [INFO][4654] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.468562 containerd[1574]: 2025-07-07 00:39:19.420 [INFO][4654] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.468562 containerd[1574]: 2025-07-07 00:39:19.421 [INFO][4654] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea Jul 7 00:39:19.468562 containerd[1574]: 2025-07-07 00:39:19.428 [INFO][4654] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.468562 containerd[1574]: 2025-07-07 00:39:19.437 [INFO][4654] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.117.133/26] block=192.168.117.128/26 handle="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.468562 containerd[1574]: 2025-07-07 00:39:19.437 [INFO][4654] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.133/26] handle="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:19.468562 containerd[1574]: 2025-07-07 00:39:19.437 [INFO][4654] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:19.468562 containerd[1574]: 2025-07-07 00:39:19.437 [INFO][4654] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.133/26] IPv6=[] ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:39:19.468683 containerd[1574]: 2025-07-07 00:39:19.442 [INFO][4595] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0", GenerateName:"calico-apiserver-5c47bcf6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"c58790dd-630c-4e7b-9398-a00ca059225e", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"5c47bcf6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"calico-apiserver-5c47bcf6f-2m6k2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.117.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9bd30d67777", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:19.469343 containerd[1574]: 2025-07-07 00:39:19.442 [INFO][4595] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.133/32] ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:39:19.469343 containerd[1574]: 2025-07-07 00:39:19.442 [INFO][4595] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9bd30d67777 ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:39:19.469343 containerd[1574]: 2025-07-07 00:39:19.450 [INFO][4595] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" 
WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:39:19.469676 containerd[1574]: 2025-07-07 00:39:19.451 [INFO][4595] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0", GenerateName:"calico-apiserver-5c47bcf6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"c58790dd-630c-4e7b-9398-a00ca059225e", ResourceVersion:"801", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c47bcf6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea", Pod:"calico-apiserver-5c47bcf6f-2m6k2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.117.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9bd30d67777", MAC:"da:06:88:21:e1:ea", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:19.469761 containerd[1574]: 2025-07-07 00:39:19.462 [INFO][4595] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-2m6k2" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:39:19.491104 containerd[1574]: time="2025-07-07T00:39:19.491068555Z" level=info msg="connecting to shim b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" address="unix:///run/containerd/s/18118860ea5e3747f7a0bf4e0f68a0fb2d2682b276c5c60719b18f53ffc28c53" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:19.515814 systemd[1]: Started cri-containerd-b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea.scope - libcontainer container b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea. 
Jul 7 00:39:19.524597 containerd[1574]: time="2025-07-07T00:39:19.524530322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-z472l,Uid:370e9121-6380-494d-bb21-0a3a2886c927,Namespace:calico-system,Attempt:0,} returns sandbox id \"6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66\"" Jul 7 00:39:19.561943 containerd[1574]: time="2025-07-07T00:39:19.561909468Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-2m6k2,Uid:c58790dd-630c-4e7b-9398-a00ca059225e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\"" Jul 7 00:39:19.991138 systemd-networkd[1487]: cali44cf8ae0ce1: Gained IPv6LL Jul 7 00:39:20.037666 containerd[1574]: time="2025-07-07T00:39:20.037048033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-c64ch,Uid:d64b203e-aa4d-4795-9e34-4cec7d892738,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:39:20.178741 systemd-networkd[1487]: cali5020c1349e4: Link UP Jul 7 00:39:20.179229 systemd-networkd[1487]: cali5020c1349e4: Gained carrier Jul 7 00:39:20.195448 containerd[1574]: 2025-07-07 00:39:20.107 [INFO][4868] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0 calico-apiserver-5c47bcf6f- calico-apiserver d64b203e-aa4d-4795-9e34-4cec7d892738 803 0 2025-07-07 00:38:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c47bcf6f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 calico-apiserver-5c47bcf6f-c64ch eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5020c1349e4 [] [] }} 
ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-" Jul 7 00:39:20.195448 containerd[1574]: 2025-07-07 00:39:20.107 [INFO][4868] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:20.195448 containerd[1574]: 2025-07-07 00:39:20.136 [INFO][4877] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.136 [INFO][4877] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5600), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"calico-apiserver-5c47bcf6f-c64ch", "timestamp":"2025-07-07 00:39:20.136086038 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.136 [INFO][4877] ipam/ipam_plugin.go 
353: About to acquire host-wide IPAM lock. Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.136 [INFO][4877] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.136 [INFO][4877] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.142 [INFO][4877] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.146 [INFO][4877] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.151 [INFO][4877] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.153 [INFO][4877] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195618 containerd[1574]: 2025-07-07 00:39:20.156 [INFO][4877] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195859 containerd[1574]: 2025-07-07 00:39:20.156 [INFO][4877] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195859 containerd[1574]: 2025-07-07 00:39:20.158 [INFO][4877] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1 Jul 7 00:39:20.195859 containerd[1574]: 2025-07-07 00:39:20.163 [INFO][4877] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 
handle="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195859 containerd[1574]: 2025-07-07 00:39:20.168 [INFO][4877] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.117.134/26] block=192.168.117.128/26 handle="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195859 containerd[1574]: 2025-07-07 00:39:20.168 [INFO][4877] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.134/26] handle="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:20.195859 containerd[1574]: 2025-07-07 00:39:20.168 [INFO][4877] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:20.195859 containerd[1574]: 2025-07-07 00:39:20.168 [INFO][4877] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.134/26] IPv6=[] ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:20.197153 containerd[1574]: 2025-07-07 00:39:20.174 [INFO][4868] cni-plugin/k8s.go 418: Populated endpoint ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0", GenerateName:"calico-apiserver-5c47bcf6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"d64b203e-aa4d-4795-9e34-4cec7d892738", ResourceVersion:"803", Generation:0, 
CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c47bcf6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"calico-apiserver-5c47bcf6f-c64ch", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.117.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5020c1349e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:20.197208 containerd[1574]: 2025-07-07 00:39:20.176 [INFO][4868] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.134/32] ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:20.197208 containerd[1574]: 2025-07-07 00:39:20.176 [INFO][4868] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5020c1349e4 ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:20.197208 containerd[1574]: 2025-07-07 00:39:20.179 [INFO][4868] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:20.197644 containerd[1574]: 2025-07-07 00:39:20.179 [INFO][4868] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0", GenerateName:"calico-apiserver-5c47bcf6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"d64b203e-aa4d-4795-9e34-4cec7d892738", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c47bcf6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1", Pod:"calico-apiserver-5c47bcf6f-c64ch", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.117.134/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5020c1349e4", MAC:"c6:a1:7b:58:45:f9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:20.197765 containerd[1574]: 2025-07-07 00:39:20.192 [INFO][4868] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Namespace="calico-apiserver" Pod="calico-apiserver-5c47bcf6f-c64ch" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:20.236232 containerd[1574]: time="2025-07-07T00:39:20.236183497Z" level=info msg="connecting to shim da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" address="unix:///run/containerd/s/ff28752695859d42356e4841330350a9dee398db8f17f2837f2dd8b24c4054c3" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:20.247327 systemd-networkd[1487]: vxlan.calico: Gained IPv6LL Jul 7 00:39:20.274846 systemd[1]: Started cri-containerd-da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1.scope - libcontainer container da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1. 
Jul 7 00:39:20.356454 containerd[1574]: time="2025-07-07T00:39:20.356414788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c47bcf6f-c64ch,Uid:d64b203e-aa4d-4795-9e34-4cec7d892738,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\"" Jul 7 00:39:20.567056 systemd-networkd[1487]: cali55d1d815954: Gained IPv6LL Jul 7 00:39:20.605033 containerd[1574]: time="2025-07-07T00:39:20.604979592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:20.605741 containerd[1574]: time="2025-07-07T00:39:20.605716608Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688" Jul 7 00:39:20.606713 containerd[1574]: time="2025-07-07T00:39:20.606629245Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:20.609080 containerd[1574]: time="2025-07-07T00:39:20.608398673Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 2.374057533s" Jul 7 00:39:20.609080 containerd[1574]: time="2025-07-07T00:39:20.608427397Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Jul 7 00:39:20.610106 containerd[1574]: time="2025-07-07T00:39:20.609750495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 7 00:39:20.613091 
containerd[1574]: time="2025-07-07T00:39:20.613070692Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:20.635089 containerd[1574]: time="2025-07-07T00:39:20.635029647Z" level=info msg="CreateContainer within sandbox \"ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 7 00:39:20.641311 containerd[1574]: time="2025-07-07T00:39:20.641252081Z" level=info msg="Container 6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:20.647282 containerd[1574]: time="2025-07-07T00:39:20.647230086Z" level=info msg="CreateContainer within sandbox \"ccb25e3f5da51fafc84b7ba521daf4c97ca3fd8f9e44a5f1bc22e65f988034ee\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\"" Jul 7 00:39:20.647881 containerd[1574]: time="2025-07-07T00:39:20.647851827Z" level=info msg="StartContainer for \"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\"" Jul 7 00:39:20.649045 containerd[1574]: time="2025-07-07T00:39:20.649012149Z" level=info msg="connecting to shim 6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452" address="unix:///run/containerd/s/aa488c55d55c92480ff2abfe06f0bf3e161c0f906b591d3d0ecc5892f7f0c7b2" protocol=ttrpc version=3 Jul 7 00:39:20.671599 systemd[1]: Started cri-containerd-6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452.scope - libcontainer container 6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452. 
Jul 7 00:39:20.695284 systemd-networkd[1487]: calie7d1b07cf61: Gained IPv6LL Jul 7 00:39:20.720378 containerd[1574]: time="2025-07-07T00:39:20.720340481Z" level=info msg="StartContainer for \"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" returns successfully" Jul 7 00:39:20.759967 systemd-networkd[1487]: cali9bd30d67777: Gained IPv6LL Jul 7 00:39:21.034735 containerd[1574]: time="2025-07-07T00:39:21.034563628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f6ddc54db-zkq79,Uid:aebb6cf1-faa7-47e9-a8f2-33827fc086e0,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:39:21.034990 containerd[1574]: time="2025-07-07T00:39:21.034971175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-q45lm,Uid:41fb2441-eddf-4cab-a9b5-21c52d4bf3e5,Namespace:kube-system,Attempt:0,}" Jul 7 00:39:21.035554 containerd[1574]: time="2025-07-07T00:39:21.035349115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-757mw,Uid:cf319c1e-3b73-474b-9ae9-b5573fcf8751,Namespace:kube-system,Attempt:0,}" Jul 7 00:39:21.191415 systemd-networkd[1487]: cali44c5efc7edc: Link UP Jul 7 00:39:21.192407 systemd-networkd[1487]: cali44c5efc7edc: Gained carrier Jul 7 00:39:21.213830 containerd[1574]: 2025-07-07 00:39:21.116 [INFO][4983] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0 coredns-668d6bf9bc- kube-system 41fb2441-eddf-4cab-a9b5-21c52d4bf3e5 792 0 2025-07-07 00:38:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 coredns-668d6bf9bc-q45lm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali44c5efc7edc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-" Jul 7 00:39:21.213830 containerd[1574]: 2025-07-07 00:39:21.117 [INFO][4983] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" Jul 7 00:39:21.213830 containerd[1574]: 2025-07-07 00:39:21.150 [INFO][5027] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" HandleID="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.150 [INFO][5027] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" HandleID="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f730), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"coredns-668d6bf9bc-q45lm", "timestamp":"2025-07-07 00:39:21.150629032 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.150 [INFO][5027] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.150 [INFO][5027] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.150 [INFO][5027] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.159 [INFO][5027] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.164 [INFO][5027] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.168 [INFO][5027] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.170 [INFO][5027] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.214775 containerd[1574]: 2025-07-07 00:39:21.172 [INFO][5027] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.215625 containerd[1574]: 2025-07-07 00:39:21.172 [INFO][5027] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.215625 containerd[1574]: 2025-07-07 00:39:21.173 [INFO][5027] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335 Jul 7 00:39:21.215625 containerd[1574]: 2025-07-07 00:39:21.177 [INFO][5027] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" 
host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.215625 containerd[1574]: 2025-07-07 00:39:21.183 [INFO][5027] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.117.135/26] block=192.168.117.128/26 handle="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.215625 containerd[1574]: 2025-07-07 00:39:21.183 [INFO][5027] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.135/26] handle="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.215625 containerd[1574]: 2025-07-07 00:39:21.183 [INFO][5027] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:21.215625 containerd[1574]: 2025-07-07 00:39:21.183 [INFO][5027] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.135/26] IPv6=[] ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" HandleID="k8s-pod-network.aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" Jul 7 00:39:21.215841 containerd[1574]: 2025-07-07 00:39:21.186 [INFO][4983] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"41fb2441-eddf-4cab-a9b5-21c52d4bf3e5", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"coredns-668d6bf9bc-q45lm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.117.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44c5efc7edc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:21.215841 containerd[1574]: 2025-07-07 00:39:21.186 [INFO][4983] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.135/32] ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" Jul 7 00:39:21.215841 containerd[1574]: 2025-07-07 00:39:21.187 [INFO][4983] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali44c5efc7edc ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" 
WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" Jul 7 00:39:21.215841 containerd[1574]: 2025-07-07 00:39:21.193 [INFO][4983] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" Jul 7 00:39:21.215841 containerd[1574]: 2025-07-07 00:39:21.195 [INFO][4983] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"41fb2441-eddf-4cab-a9b5-21c52d4bf3e5", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335", Pod:"coredns-668d6bf9bc-q45lm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.117.135/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44c5efc7edc", MAC:"e6:f2:3e:3e:76:2a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:21.215841 containerd[1574]: 2025-07-07 00:39:21.211 [INFO][4983] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" Namespace="kube-system" Pod="coredns-668d6bf9bc-q45lm" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--q45lm-eth0" Jul 7 00:39:21.243199 containerd[1574]: time="2025-07-07T00:39:21.243129460Z" level=info msg="connecting to shim aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335" address="unix:///run/containerd/s/4c3eecc5f251a87550e58d800f66c698a04fe84841c088fd0ae9fa0a35d47cb8" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:21.269815 systemd[1]: Started cri-containerd-aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335.scope - libcontainer container aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335. 
Jul 7 00:39:21.328040 systemd-networkd[1487]: calib5559616e66: Link UP Jul 7 00:39:21.328267 systemd-networkd[1487]: calib5559616e66: Gained carrier Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.104 [INFO][4985] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0 calico-apiserver-6f6ddc54db- calico-apiserver aebb6cf1-faa7-47e9-a8f2-33827fc086e0 805 0 2025-07-07 00:38:57 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f6ddc54db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 calico-apiserver-6f6ddc54db-zkq79 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib5559616e66 [] [] }} ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.104 [INFO][4985] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.153 [INFO][5020] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" HandleID="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" Jul 7 
00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.153 [INFO][5020] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" HandleID="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5200), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"calico-apiserver-6f6ddc54db-zkq79", "timestamp":"2025-07-07 00:39:21.151416002 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.153 [INFO][5020] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.183 [INFO][5020] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.183 [INFO][5020] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.262 [INFO][5020] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.270 [INFO][5020] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.279 [INFO][5020] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.282 [INFO][5020] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.287 [INFO][5020] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.288 [INFO][5020] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.290 [INFO][5020] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.298 [INFO][5020] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.306 [INFO][5020] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.117.136/26] block=192.168.117.128/26 handle="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.307 [INFO][5020] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.136/26] handle="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.307 [INFO][5020] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:21.340613 containerd[1574]: 2025-07-07 00:39:21.307 [INFO][5020] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.136/26] IPv6=[] ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" HandleID="k8s-pod-network.1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" Jul 7 00:39:21.342228 containerd[1574]: 2025-07-07 00:39:21.311 [INFO][4985] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0", GenerateName:"calico-apiserver-6f6ddc54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"aebb6cf1-faa7-47e9-a8f2-33827fc086e0", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f6ddc54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"calico-apiserver-6f6ddc54db-zkq79", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.117.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib5559616e66", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:21.342228 containerd[1574]: 2025-07-07 00:39:21.312 [INFO][4985] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.136/32] ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" Jul 7 00:39:21.342228 containerd[1574]: 2025-07-07 00:39:21.312 [INFO][4985] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib5559616e66 ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" Jul 7 00:39:21.342228 containerd[1574]: 2025-07-07 00:39:21.327 [INFO][4985] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" 
Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" Jul 7 00:39:21.342228 containerd[1574]: 2025-07-07 00:39:21.328 [INFO][4985] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0", GenerateName:"calico-apiserver-6f6ddc54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"aebb6cf1-faa7-47e9-a8f2-33827fc086e0", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f6ddc54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b", Pod:"calico-apiserver-6f6ddc54db-zkq79", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.117.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calib5559616e66", MAC:"ea:35:c2:53:0d:a5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:21.342228 containerd[1574]: 2025-07-07 00:39:21.337 [INFO][4985] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-zkq79" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--zkq79-eth0" Jul 7 00:39:21.350575 kubelet[2925]: I0707 00:39:21.343902 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-77d57f8f97-z9fxb" podStartSLOduration=19.953723024 podStartE2EDuration="22.329533354s" podCreationTimestamp="2025-07-07 00:38:59 +0000 UTC" firstStartedPulling="2025-07-07 00:39:18.233311443 +0000 UTC m=+38.294440283" lastFinishedPulling="2025-07-07 00:39:20.609121774 +0000 UTC m=+40.670250613" observedRunningTime="2025-07-07 00:39:21.329295947 +0000 UTC m=+41.390424797" watchObservedRunningTime="2025-07-07 00:39:21.329533354 +0000 UTC m=+41.390662193" Jul 7 00:39:21.358026 containerd[1574]: time="2025-07-07T00:39:21.357779563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-q45lm,Uid:41fb2441-eddf-4cab-a9b5-21c52d4bf3e5,Namespace:kube-system,Attempt:0,} returns sandbox id \"aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335\"" Jul 7 00:39:21.377058 containerd[1574]: time="2025-07-07T00:39:21.376992743Z" level=info msg="connecting to shim 1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b" address="unix:///run/containerd/s/e6f4c6e439134b13c2df9a38f6da54f043e83886ee4dec385f936a2790349605" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:21.377957 containerd[1574]: time="2025-07-07T00:39:21.377849684Z" level=info msg="CreateContainer within sandbox 
\"aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 7 00:39:21.405909 systemd[1]: Started cri-containerd-1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b.scope - libcontainer container 1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b. Jul 7 00:39:21.409231 containerd[1574]: time="2025-07-07T00:39:21.409191286Z" level=info msg="Container a2812569efca8bab577cea9252b4af3c552f8f874e3141cbf74fa1680c2f7f28: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:21.415551 systemd-networkd[1487]: cali505a63d752d: Link UP Jul 7 00:39:21.417918 systemd-networkd[1487]: cali505a63d752d: Gained carrier Jul 7 00:39:21.419144 containerd[1574]: time="2025-07-07T00:39:21.419118279Z" level=info msg="CreateContainer within sandbox \"aa45a565a2ff73c85b52b18e75e90816ba00360fdee845e5644153d5e48e7335\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a2812569efca8bab577cea9252b4af3c552f8f874e3141cbf74fa1680c2f7f28\"" Jul 7 00:39:21.421083 containerd[1574]: time="2025-07-07T00:39:21.421052306Z" level=info msg="StartContainer for \"a2812569efca8bab577cea9252b4af3c552f8f874e3141cbf74fa1680c2f7f28\"" Jul 7 00:39:21.423616 containerd[1574]: time="2025-07-07T00:39:21.423588256Z" level=info msg="connecting to shim a2812569efca8bab577cea9252b4af3c552f8f874e3141cbf74fa1680c2f7f28" address="unix:///run/containerd/s/4c3eecc5f251a87550e58d800f66c698a04fe84841c088fd0ae9fa0a35d47cb8" protocol=ttrpc version=3 Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.113 [INFO][4996] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0 coredns-668d6bf9bc- kube-system cf319c1e-3b73-474b-9ae9-b5573fcf8751 796 0 2025-07-07 00:38:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system 
projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 coredns-668d6bf9bc-757mw eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali505a63d752d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.113 [INFO][4996] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.154 [INFO][5026] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" HandleID="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.154 [INFO][5026] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" HandleID="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c5210), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"coredns-668d6bf9bc-757mw", "timestamp":"2025-07-07 00:39:21.154502708 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.154 [INFO][5026] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.307 [INFO][5026] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.307 [INFO][5026] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.362 [INFO][5026] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.372 [INFO][5026] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.381 [INFO][5026] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.383 [INFO][5026] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.388 [INFO][5026] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.388 [INFO][5026] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.390 [INFO][5026] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.398 [INFO][5026] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.405 [INFO][5026] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.117.137/26] block=192.168.117.128/26 handle="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.405 [INFO][5026] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.137/26] handle="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.405 [INFO][5026] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 7 00:39:21.439262 containerd[1574]: 2025-07-07 00:39:21.405 [INFO][5026] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.137/26] IPv6=[] ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" HandleID="k8s-pod-network.65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" Jul 7 00:39:21.440264 containerd[1574]: 2025-07-07 00:39:21.411 [INFO][4996] cni-plugin/k8s.go 418: Populated endpoint ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cf319c1e-3b73-474b-9ae9-b5573fcf8751", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"coredns-668d6bf9bc-757mw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.117.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali505a63d752d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:21.440264 containerd[1574]: 2025-07-07 00:39:21.412 [INFO][4996] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.137/32] ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" Jul 7 00:39:21.440264 containerd[1574]: 2025-07-07 00:39:21.412 [INFO][4996] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali505a63d752d ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" Jul 7 00:39:21.440264 containerd[1574]: 2025-07-07 00:39:21.417 [INFO][4996] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" Jul 7 00:39:21.440264 containerd[1574]: 2025-07-07 00:39:21.418 [INFO][4996] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" 
WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cf319c1e-3b73-474b-9ae9-b5573fcf8751", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 38, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e", Pod:"coredns-668d6bf9bc-757mw", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.117.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali505a63d752d", MAC:"7a:0f:24:c0:7a:eb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:21.440264 
containerd[1574]: 2025-07-07 00:39:21.428 [INFO][4996] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" Namespace="kube-system" Pod="coredns-668d6bf9bc-757mw" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-coredns--668d6bf9bc--757mw-eth0" Jul 7 00:39:21.459894 systemd[1]: Started cri-containerd-a2812569efca8bab577cea9252b4af3c552f8f874e3141cbf74fa1680c2f7f28.scope - libcontainer container a2812569efca8bab577cea9252b4af3c552f8f874e3141cbf74fa1680c2f7f28. Jul 7 00:39:21.464152 systemd-networkd[1487]: cali5020c1349e4: Gained IPv6LL Jul 7 00:39:21.471686 containerd[1574]: time="2025-07-07T00:39:21.471652723Z" level=info msg="connecting to shim 65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e" address="unix:///run/containerd/s/bdc003a366d4b48cf949d2ba38e9ac58d24d528669109ea83fd259b634570a9c" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:21.497865 systemd[1]: Started cri-containerd-65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e.scope - libcontainer container 65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e. 
Jul 7 00:39:21.514374 containerd[1574]: time="2025-07-07T00:39:21.514271467Z" level=info msg="StartContainer for \"a2812569efca8bab577cea9252b4af3c552f8f874e3141cbf74fa1680c2f7f28\" returns successfully" Jul 7 00:39:21.515592 containerd[1574]: time="2025-07-07T00:39:21.515395281Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f6ddc54db-zkq79,Uid:aebb6cf1-faa7-47e9-a8f2-33827fc086e0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b\"" Jul 7 00:39:21.552494 containerd[1574]: time="2025-07-07T00:39:21.552427837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-757mw,Uid:cf319c1e-3b73-474b-9ae9-b5573fcf8751,Namespace:kube-system,Attempt:0,} returns sandbox id \"65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e\"" Jul 7 00:39:21.555542 containerd[1574]: time="2025-07-07T00:39:21.555333634Z" level=info msg="CreateContainer within sandbox \"65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 7 00:39:21.567809 containerd[1574]: time="2025-07-07T00:39:21.567774224Z" level=info msg="Container 8d7f7ff4a4f5acc3e9441e718512b9db02506d5b5eb5df197f3d96edfd60e3d8: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:21.573396 containerd[1574]: time="2025-07-07T00:39:21.573366953Z" level=info msg="CreateContainer within sandbox \"65eddaf52a5d0d1db3d01cc80a74f1bc31aa5232996237d99546c7c5fafed51e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8d7f7ff4a4f5acc3e9441e718512b9db02506d5b5eb5df197f3d96edfd60e3d8\"" Jul 7 00:39:21.573889 containerd[1574]: time="2025-07-07T00:39:21.573810759Z" level=info msg="StartContainer for \"8d7f7ff4a4f5acc3e9441e718512b9db02506d5b5eb5df197f3d96edfd60e3d8\"" Jul 7 00:39:21.574372 containerd[1574]: time="2025-07-07T00:39:21.574347759Z" level=info msg="connecting to shim 
8d7f7ff4a4f5acc3e9441e718512b9db02506d5b5eb5df197f3d96edfd60e3d8" address="unix:///run/containerd/s/bdc003a366d4b48cf949d2ba38e9ac58d24d528669109ea83fd259b634570a9c" protocol=ttrpc version=3 Jul 7 00:39:21.592884 systemd[1]: Started cri-containerd-8d7f7ff4a4f5acc3e9441e718512b9db02506d5b5eb5df197f3d96edfd60e3d8.scope - libcontainer container 8d7f7ff4a4f5acc3e9441e718512b9db02506d5b5eb5df197f3d96edfd60e3d8. Jul 7 00:39:21.620898 containerd[1574]: time="2025-07-07T00:39:21.620693001Z" level=info msg="StartContainer for \"8d7f7ff4a4f5acc3e9441e718512b9db02506d5b5eb5df197f3d96edfd60e3d8\" returns successfully" Jul 7 00:39:22.215709 containerd[1574]: time="2025-07-07T00:39:22.215661480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:22.216386 containerd[1574]: time="2025-07-07T00:39:22.216335278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 7 00:39:22.217489 containerd[1574]: time="2025-07-07T00:39:22.217446378Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:22.219195 containerd[1574]: time="2025-07-07T00:39:22.219160622Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:22.220108 containerd[1574]: time="2025-07-07T00:39:22.219840681Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 1.610063686s" Jul 7 
00:39:22.220108 containerd[1574]: time="2025-07-07T00:39:22.219879785Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 7 00:39:22.221571 containerd[1574]: time="2025-07-07T00:39:22.221557931Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 7 00:39:22.223120 containerd[1574]: time="2025-07-07T00:39:22.223104199Z" level=info msg="CreateContainer within sandbox \"f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 7 00:39:22.240511 containerd[1574]: time="2025-07-07T00:39:22.240491482Z" level=info msg="Container 04c11b24d2ce261a8388402c7c1f109d3abb2514a7c5de605c673db3acb35e7a: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:22.248288 containerd[1574]: time="2025-07-07T00:39:22.248113198Z" level=info msg="CreateContainer within sandbox \"f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"04c11b24d2ce261a8388402c7c1f109d3abb2514a7c5de605c673db3acb35e7a\"" Jul 7 00:39:22.251018 containerd[1574]: time="2025-07-07T00:39:22.249928032Z" level=info msg="StartContainer for \"04c11b24d2ce261a8388402c7c1f109d3abb2514a7c5de605c673db3acb35e7a\"" Jul 7 00:39:22.253716 containerd[1574]: time="2025-07-07T00:39:22.253392807Z" level=info msg="connecting to shim 04c11b24d2ce261a8388402c7c1f109d3abb2514a7c5de605c673db3acb35e7a" address="unix:///run/containerd/s/9fa82d8c44e7c11f8470837af1a511741618cd229fa37dd07353f450016871a3" protocol=ttrpc version=3 Jul 7 00:39:22.298086 systemd[1]: Started cri-containerd-04c11b24d2ce261a8388402c7c1f109d3abb2514a7c5de605c673db3acb35e7a.scope - libcontainer container 04c11b24d2ce261a8388402c7c1f109d3abb2514a7c5de605c673db3acb35e7a. 
Jul 7 00:39:22.298659 kubelet[2925]: I0707 00:39:22.298265 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:22.301997 kubelet[2925]: I0707 00:39:22.301958 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-q45lm" podStartSLOduration=35.301944868 podStartE2EDuration="35.301944868s" podCreationTimestamp="2025-07-07 00:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:39:22.301460357 +0000 UTC m=+42.362589196" watchObservedRunningTime="2025-07-07 00:39:22.301944868 +0000 UTC m=+42.363073708" Jul 7 00:39:22.361229 kubelet[2925]: I0707 00:39:22.361152 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-757mw" podStartSLOduration=35.360977249 podStartE2EDuration="35.360977249s" podCreationTimestamp="2025-07-07 00:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:39:22.343849785 +0000 UTC m=+42.404978644" watchObservedRunningTime="2025-07-07 00:39:22.360977249 +0000 UTC m=+42.422106089" Jul 7 00:39:22.398791 containerd[1574]: time="2025-07-07T00:39:22.398753201Z" level=info msg="StartContainer for \"04c11b24d2ce261a8388402c7c1f109d3abb2514a7c5de605c673db3acb35e7a\" returns successfully" Jul 7 00:39:22.550921 systemd-networkd[1487]: cali505a63d752d: Gained IPv6LL Jul 7 00:39:22.807093 systemd-networkd[1487]: calib5559616e66: Gained IPv6LL Jul 7 00:39:23.062879 systemd-networkd[1487]: cali44c5efc7edc: Gained IPv6LL Jul 7 00:39:25.442119 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3044257020.mount: Deactivated successfully. 
Jul 7 00:39:25.823181 containerd[1574]: time="2025-07-07T00:39:25.823089738Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:25.824347 containerd[1574]: time="2025-07-07T00:39:25.824319339Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Jul 7 00:39:25.824882 containerd[1574]: time="2025-07-07T00:39:25.824679687Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:25.826358 containerd[1574]: time="2025-07-07T00:39:25.826323349Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:25.827084 containerd[1574]: time="2025-07-07T00:39:25.826776872Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.605123962s" Jul 7 00:39:25.827084 containerd[1574]: time="2025-07-07T00:39:25.826804423Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Jul 7 00:39:25.827802 containerd[1574]: time="2025-07-07T00:39:25.827788263Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 7 00:39:25.830016 containerd[1574]: time="2025-07-07T00:39:25.829999421Z" level=info msg="CreateContainer within sandbox \"6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66\" 
for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 7 00:39:25.838893 containerd[1574]: time="2025-07-07T00:39:25.838863742Z" level=info msg="Container 4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:25.844230 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2866010021.mount: Deactivated successfully. Jul 7 00:39:25.856391 containerd[1574]: time="2025-07-07T00:39:25.856364053Z" level=info msg="CreateContainer within sandbox \"6426820046b53ff0578138c39ae0b5380716100749566e8e69487229a469ea66\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\"" Jul 7 00:39:25.857078 containerd[1574]: time="2025-07-07T00:39:25.857042849Z" level=info msg="StartContainer for \"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\"" Jul 7 00:39:25.858152 containerd[1574]: time="2025-07-07T00:39:25.858128752Z" level=info msg="connecting to shim 4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38" address="unix:///run/containerd/s/5e7aa7e9dd097417631e4514c082fd781ecb75085a0b1718a3a80a65a6b07a1a" protocol=ttrpc version=3 Jul 7 00:39:25.881998 systemd[1]: Started cri-containerd-4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38.scope - libcontainer container 4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38. 
Jul 7 00:39:26.014679 containerd[1574]: time="2025-07-07T00:39:26.014161676Z" level=info msg="StartContainer for \"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" returns successfully" Jul 7 00:39:26.106030 kubelet[2925]: I0707 00:39:26.105961 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:26.239291 containerd[1574]: time="2025-07-07T00:39:26.239233867Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"0dd9acbe3fa0abc963b5f341e1d72f9d94d242c7c38d9bbba9815fd1d774729d\" pid:5376 exited_at:{seconds:1751848766 nanos:215036103}" Jul 7 00:39:26.285892 containerd[1574]: time="2025-07-07T00:39:26.285846814Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"08ab46e02633bad2ae5dcad7379f190c746b6fe3efcf5ea1da7ee817c4d03aa5\" pid:5398 exited_at:{seconds:1751848766 nanos:285520049}" Jul 7 00:39:26.338215 kubelet[2925]: I0707 00:39:26.334604 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-z472l" podStartSLOduration=22.03258417 podStartE2EDuration="28.334349023s" podCreationTimestamp="2025-07-07 00:38:58 +0000 UTC" firstStartedPulling="2025-07-07 00:39:19.525768282 +0000 UTC m=+39.586897121" lastFinishedPulling="2025-07-07 00:39:25.827533134 +0000 UTC m=+45.888661974" observedRunningTime="2025-07-07 00:39:26.333806714 +0000 UTC m=+46.394935553" watchObservedRunningTime="2025-07-07 00:39:26.334349023 +0000 UTC m=+46.395477863" Jul 7 00:39:27.326261 kubelet[2925]: I0707 00:39:27.326228 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:28.152386 containerd[1574]: time="2025-07-07T00:39:28.152097893Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jul 7 00:39:28.153062 containerd[1574]: time="2025-07-07T00:39:28.152887859Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Jul 7 00:39:28.153791 containerd[1574]: time="2025-07-07T00:39:28.153756241Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:28.155240 containerd[1574]: time="2025-07-07T00:39:28.155204875Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:28.155967 containerd[1574]: time="2025-07-07T00:39:28.155751263Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 2.327871989s" Jul 7 00:39:28.155967 containerd[1574]: time="2025-07-07T00:39:28.155789024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 7 00:39:28.156842 containerd[1574]: time="2025-07-07T00:39:28.156817868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 7 00:39:28.158428 containerd[1574]: time="2025-07-07T00:39:28.158411455Z" level=info msg="CreateContainer within sandbox \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 7 00:39:28.167722 containerd[1574]: time="2025-07-07T00:39:28.165830505Z" level=info msg="Container 
dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:28.183722 containerd[1574]: time="2025-07-07T00:39:28.183676331Z" level=info msg="CreateContainer within sandbox \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\"" Jul 7 00:39:28.193443 containerd[1574]: time="2025-07-07T00:39:28.193417699Z" level=info msg="StartContainer for \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\"" Jul 7 00:39:28.194441 containerd[1574]: time="2025-07-07T00:39:28.194362435Z" level=info msg="connecting to shim dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6" address="unix:///run/containerd/s/18118860ea5e3747f7a0bf4e0f68a0fb2d2682b276c5c60719b18f53ffc28c53" protocol=ttrpc version=3 Jul 7 00:39:28.214204 systemd[1]: Started cri-containerd-dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6.scope - libcontainer container dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6. 
Jul 7 00:39:28.283127 containerd[1574]: time="2025-07-07T00:39:28.283095836Z" level=info msg="StartContainer for \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" returns successfully" Jul 7 00:39:28.669625 containerd[1574]: time="2025-07-07T00:39:28.669571749Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:28.670494 containerd[1574]: time="2025-07-07T00:39:28.670447235Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 7 00:39:28.672195 containerd[1574]: time="2025-07-07T00:39:28.672150548Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 515.226159ms" Jul 7 00:39:28.672195 containerd[1574]: time="2025-07-07T00:39:28.672192097Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 7 00:39:28.673772 containerd[1574]: time="2025-07-07T00:39:28.673240357Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 7 00:39:28.676987 containerd[1574]: time="2025-07-07T00:39:28.676957627Z" level=info msg="CreateContainer within sandbox \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 7 00:39:28.685637 containerd[1574]: time="2025-07-07T00:39:28.685087033Z" level=info msg="Container 51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:28.689429 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount872533978.mount: Deactivated successfully. Jul 7 00:39:28.696198 containerd[1574]: time="2025-07-07T00:39:28.696099551Z" level=info msg="CreateContainer within sandbox \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\"" Jul 7 00:39:28.697716 containerd[1574]: time="2025-07-07T00:39:28.696911848Z" level=info msg="StartContainer for \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\"" Jul 7 00:39:28.697716 containerd[1574]: time="2025-07-07T00:39:28.697665324Z" level=info msg="connecting to shim 51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3" address="unix:///run/containerd/s/ff28752695859d42356e4841330350a9dee398db8f17f2837f2dd8b24c4054c3" protocol=ttrpc version=3 Jul 7 00:39:28.718991 systemd[1]: Started cri-containerd-51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3.scope - libcontainer container 51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3. 
Jul 7 00:39:28.771311 containerd[1574]: time="2025-07-07T00:39:28.771285617Z" level=info msg="StartContainer for \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" returns successfully" Jul 7 00:39:29.155721 containerd[1574]: time="2025-07-07T00:39:29.155676828Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:29.157597 containerd[1574]: time="2025-07-07T00:39:29.157576951Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 7 00:39:29.161798 containerd[1574]: time="2025-07-07T00:39:29.161750929Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 488.487429ms" Jul 7 00:39:29.161798 containerd[1574]: time="2025-07-07T00:39:29.161773772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 7 00:39:29.163556 containerd[1574]: time="2025-07-07T00:39:29.162904418Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 7 00:39:29.171180 containerd[1574]: time="2025-07-07T00:39:29.171153279Z" level=info msg="CreateContainer within sandbox \"1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 7 00:39:29.185373 containerd[1574]: time="2025-07-07T00:39:29.184891049Z" level=info msg="Container bbaca0bd25ce28dee1b81c55b69d6114bb710c7ba2c1b08cd87597046284b375: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:29.194343 containerd[1574]: 
time="2025-07-07T00:39:29.194318135Z" level=info msg="CreateContainer within sandbox \"1022f8c95d9653c06dffa89c7f52fff616a2dc36d47a673638dddca2ffd7960b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bbaca0bd25ce28dee1b81c55b69d6114bb710c7ba2c1b08cd87597046284b375\"" Jul 7 00:39:29.194853 containerd[1574]: time="2025-07-07T00:39:29.194788098Z" level=info msg="StartContainer for \"bbaca0bd25ce28dee1b81c55b69d6114bb710c7ba2c1b08cd87597046284b375\"" Jul 7 00:39:29.200764 containerd[1574]: time="2025-07-07T00:39:29.200744879Z" level=info msg="connecting to shim bbaca0bd25ce28dee1b81c55b69d6114bb710c7ba2c1b08cd87597046284b375" address="unix:///run/containerd/s/e6f4c6e439134b13c2df9a38f6da54f043e83886ee4dec385f936a2790349605" protocol=ttrpc version=3 Jul 7 00:39:29.219811 systemd[1]: Started cri-containerd-bbaca0bd25ce28dee1b81c55b69d6114bb710c7ba2c1b08cd87597046284b375.scope - libcontainer container bbaca0bd25ce28dee1b81c55b69d6114bb710c7ba2c1b08cd87597046284b375. 
Jul 7 00:39:29.278586 containerd[1574]: time="2025-07-07T00:39:29.278554613Z" level=info msg="StartContainer for \"bbaca0bd25ce28dee1b81c55b69d6114bb710c7ba2c1b08cd87597046284b375\" returns successfully" Jul 7 00:39:29.343778 kubelet[2925]: I0707 00:39:29.343752 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:29.353932 kubelet[2925]: I0707 00:39:29.353895 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c47bcf6f-2m6k2" podStartSLOduration=24.760232998 podStartE2EDuration="33.35388171s" podCreationTimestamp="2025-07-07 00:38:56 +0000 UTC" firstStartedPulling="2025-07-07 00:39:19.562905301 +0000 UTC m=+39.624034141" lastFinishedPulling="2025-07-07 00:39:28.156554012 +0000 UTC m=+48.217682853" observedRunningTime="2025-07-07 00:39:28.364122124 +0000 UTC m=+48.425250985" watchObservedRunningTime="2025-07-07 00:39:29.35388171 +0000 UTC m=+49.415010550" Jul 7 00:39:29.368647 kubelet[2925]: I0707 00:39:29.368600 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6f6ddc54db-zkq79" podStartSLOduration=24.723672333 podStartE2EDuration="32.36858727s" podCreationTimestamp="2025-07-07 00:38:57 +0000 UTC" firstStartedPulling="2025-07-07 00:39:21.517757424 +0000 UTC m=+41.578886264" lastFinishedPulling="2025-07-07 00:39:29.162672362 +0000 UTC m=+49.223801201" observedRunningTime="2025-07-07 00:39:29.367297936 +0000 UTC m=+49.428426776" watchObservedRunningTime="2025-07-07 00:39:29.36858727 +0000 UTC m=+49.429716111" Jul 7 00:39:29.368807 kubelet[2925]: I0707 00:39:29.368686 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c47bcf6f-c64ch" podStartSLOduration=25.054229983 podStartE2EDuration="33.368680956s" podCreationTimestamp="2025-07-07 00:38:56 +0000 UTC" firstStartedPulling="2025-07-07 00:39:20.358540196 +0000 UTC m=+40.419669036" 
lastFinishedPulling="2025-07-07 00:39:28.672991169 +0000 UTC m=+48.734120009" observedRunningTime="2025-07-07 00:39:29.354454666 +0000 UTC m=+49.415583507" watchObservedRunningTime="2025-07-07 00:39:29.368680956 +0000 UTC m=+49.429809796" Jul 7 00:39:30.223277 systemd[1]: Created slice kubepods-besteffort-pod5e3d6b56_4a75_49ef_b25d_5b1305872e2d.slice - libcontainer container kubepods-besteffort-pod5e3d6b56_4a75_49ef_b25d_5b1305872e2d.slice. Jul 7 00:39:30.312970 kubelet[2925]: I0707 00:39:30.312926 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmm5w\" (UniqueName: \"kubernetes.io/projected/5e3d6b56-4a75-49ef-b25d-5b1305872e2d-kube-api-access-tmm5w\") pod \"calico-apiserver-6f6ddc54db-bbxkn\" (UID: \"5e3d6b56-4a75-49ef-b25d-5b1305872e2d\") " pod="calico-apiserver/calico-apiserver-6f6ddc54db-bbxkn" Jul 7 00:39:30.320819 kubelet[2925]: I0707 00:39:30.320492 2925 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5e3d6b56-4a75-49ef-b25d-5b1305872e2d-calico-apiserver-certs\") pod \"calico-apiserver-6f6ddc54db-bbxkn\" (UID: \"5e3d6b56-4a75-49ef-b25d-5b1305872e2d\") " pod="calico-apiserver/calico-apiserver-6f6ddc54db-bbxkn" Jul 7 00:39:30.345556 kubelet[2925]: I0707 00:39:30.345541 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:30.527950 containerd[1574]: time="2025-07-07T00:39:30.527695232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f6ddc54db-bbxkn,Uid:5e3d6b56-4a75-49ef-b25d-5b1305872e2d,Namespace:calico-apiserver,Attempt:0,}" Jul 7 00:39:31.022906 systemd-networkd[1487]: cali5fa1b6b6225: Link UP Jul 7 00:39:31.023121 systemd-networkd[1487]: cali5fa1b6b6225: Gained carrier Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.744 [INFO][5542] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0 calico-apiserver-6f6ddc54db- calico-apiserver 5e3d6b56-4a75-49ef-b25d-5b1305872e2d 1045 0 2025-07-07 00:39:30 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f6ddc54db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-6-69f6cda1f4 calico-apiserver-6f6ddc54db-bbxkn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5fa1b6b6225 [] [] }} ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.746 [INFO][5542] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.925 [INFO][5554] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" HandleID="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.927 [INFO][5554] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" HandleID="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" 
Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001d5900), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-6-69f6cda1f4", "pod":"calico-apiserver-6f6ddc54db-bbxkn", "timestamp":"2025-07-07 00:39:30.92518712 +0000 UTC"}, Hostname:"ci-4344-1-1-6-69f6cda1f4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.928 [INFO][5554] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.928 [INFO][5554] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.928 [INFO][5554] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-6-69f6cda1f4' Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.946 [INFO][5554] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.966 [INFO][5554] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.971 [INFO][5554] ipam/ipam.go 511: Trying affinity for 192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.974 [INFO][5554] ipam/ipam.go 158: Attempting to load block cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.978 [INFO][5554] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.117.128/26 host="ci-4344-1-1-6-69f6cda1f4" Jul 
7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.978 [INFO][5554] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.117.128/26 handle="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.980 [INFO][5554] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558 Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.986 [INFO][5554] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.117.128/26 handle="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.999 [INFO][5554] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.117.138/26] block=192.168.117.128/26 handle="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.999 [INFO][5554] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.117.138/26] handle="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" host="ci-4344-1-1-6-69f6cda1f4" Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:30.999 [INFO][5554] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 7 00:39:31.053395 containerd[1574]: 2025-07-07 00:39:31.000 [INFO][5554] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.117.138/26] IPv6=[] ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" HandleID="k8s-pod-network.fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" Jul 7 00:39:31.055787 containerd[1574]: 2025-07-07 00:39:31.005 [INFO][5542] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0", GenerateName:"calico-apiserver-6f6ddc54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"5e3d6b56-4a75-49ef-b25d-5b1305872e2d", ResourceVersion:"1045", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 39, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f6ddc54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"", Pod:"calico-apiserver-6f6ddc54db-bbxkn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.117.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5fa1b6b6225", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:31.055787 containerd[1574]: 2025-07-07 00:39:31.005 [INFO][5542] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.117.138/32] ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" Jul 7 00:39:31.055787 containerd[1574]: 2025-07-07 00:39:31.005 [INFO][5542] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5fa1b6b6225 ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" Jul 7 00:39:31.055787 containerd[1574]: 2025-07-07 00:39:31.016 [INFO][5542] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" Jul 7 00:39:31.055787 containerd[1574]: 2025-07-07 00:39:31.017 [INFO][5542] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0", GenerateName:"calico-apiserver-6f6ddc54db-", Namespace:"calico-apiserver", SelfLink:"", UID:"5e3d6b56-4a75-49ef-b25d-5b1305872e2d", ResourceVersion:"1045", Generation:0, CreationTimestamp:time.Date(2025, time.July, 7, 0, 39, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f6ddc54db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-6-69f6cda1f4", ContainerID:"fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558", Pod:"calico-apiserver-6f6ddc54db-bbxkn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.117.138/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5fa1b6b6225", MAC:"5e:ec:ce:4f:1e:14", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 7 00:39:31.055787 containerd[1574]: 2025-07-07 00:39:31.041 [INFO][5542] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" Namespace="calico-apiserver" Pod="calico-apiserver-6f6ddc54db-bbxkn" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--6f6ddc54db--bbxkn-eth0" Jul 7 00:39:31.248926 containerd[1574]: time="2025-07-07T00:39:31.248881425Z" level=info 
msg="connecting to shim fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558" address="unix:///run/containerd/s/1d034695e63a2f8f2d4281906ef94e016e11639d2d5e3c9a6e2069defd2a7eb8" namespace=k8s.io protocol=ttrpc version=3 Jul 7 00:39:31.321992 systemd[1]: Started cri-containerd-fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558.scope - libcontainer container fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558. Jul 7 00:39:31.351549 containerd[1574]: time="2025-07-07T00:39:31.351424230Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:31.354168 containerd[1574]: time="2025-07-07T00:39:31.354125338Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 7 00:39:31.371041 kubelet[2925]: I0707 00:39:31.352287 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:31.381123 containerd[1574]: time="2025-07-07T00:39:31.379749825Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:31.399269 containerd[1574]: time="2025-07-07T00:39:31.399169408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f6ddc54db-bbxkn,Uid:5e3d6b56-4a75-49ef-b25d-5b1305872e2d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558\"" Jul 7 00:39:31.401846 containerd[1574]: time="2025-07-07T00:39:31.401757785Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 7 00:39:31.403625 containerd[1574]: time="2025-07-07T00:39:31.403259147Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.239790939s" Jul 7 00:39:31.403625 containerd[1574]: time="2025-07-07T00:39:31.403284655Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Jul 7 00:39:31.407749 containerd[1574]: time="2025-07-07T00:39:31.407731635Z" level=info msg="CreateContainer within sandbox \"f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 7 00:39:31.408045 containerd[1574]: time="2025-07-07T00:39:31.407915772Z" level=info msg="CreateContainer within sandbox \"fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 7 00:39:31.419655 containerd[1574]: time="2025-07-07T00:39:31.419617061Z" level=info msg="Container 9442f34955012edce081529aa2fce7ba2ee0a22340458af485b0ae3b98ec9d07: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:31.434165 containerd[1574]: time="2025-07-07T00:39:31.434007267Z" level=info msg="StopContainer for \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" with timeout 30 (s)" Jul 7 00:39:31.435979 containerd[1574]: time="2025-07-07T00:39:31.435951082Z" level=info msg="Container 2d175aa157567570db95422cc5d94b0d3defe9d0f66fc582702fa52fdfea4474: CDI devices from CRI Config.CDIDevices: []" Jul 7 00:39:31.439256 containerd[1574]: time="2025-07-07T00:39:31.439210680Z" level=info msg="Stop container 
\"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" with signal terminated" Jul 7 00:39:31.443887 containerd[1574]: time="2025-07-07T00:39:31.443808656Z" level=info msg="CreateContainer within sandbox \"f0a2c456249fd43d32c766f400e99fb246b8342d56dea7cfd8ccb5212e4ecb5f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9442f34955012edce081529aa2fce7ba2ee0a22340458af485b0ae3b98ec9d07\"" Jul 7 00:39:31.445542 containerd[1574]: time="2025-07-07T00:39:31.445504333Z" level=info msg="StartContainer for \"9442f34955012edce081529aa2fce7ba2ee0a22340458af485b0ae3b98ec9d07\"" Jul 7 00:39:31.447105 containerd[1574]: time="2025-07-07T00:39:31.447048677Z" level=info msg="connecting to shim 9442f34955012edce081529aa2fce7ba2ee0a22340458af485b0ae3b98ec9d07" address="unix:///run/containerd/s/9fa82d8c44e7c11f8470837af1a511741618cd229fa37dd07353f450016871a3" protocol=ttrpc version=3 Jul 7 00:39:31.452016 containerd[1574]: time="2025-07-07T00:39:31.451984817Z" level=info msg="CreateContainer within sandbox \"fe26010ccbfcf63209d084eea772db511a82cbd71ee0e61750928f36a7c0a558\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2d175aa157567570db95422cc5d94b0d3defe9d0f66fc582702fa52fdfea4474\"" Jul 7 00:39:31.453649 containerd[1574]: time="2025-07-07T00:39:31.453608801Z" level=info msg="StartContainer for \"2d175aa157567570db95422cc5d94b0d3defe9d0f66fc582702fa52fdfea4474\"" Jul 7 00:39:31.458225 containerd[1574]: time="2025-07-07T00:39:31.458205162Z" level=info msg="connecting to shim 2d175aa157567570db95422cc5d94b0d3defe9d0f66fc582702fa52fdfea4474" address="unix:///run/containerd/s/1d034695e63a2f8f2d4281906ef94e016e11639d2d5e3c9a6e2069defd2a7eb8" protocol=ttrpc version=3 Jul 7 00:39:31.477812 systemd[1]: Started cri-containerd-9442f34955012edce081529aa2fce7ba2ee0a22340458af485b0ae3b98ec9d07.scope - libcontainer container 9442f34955012edce081529aa2fce7ba2ee0a22340458af485b0ae3b98ec9d07. 
Jul 7 00:39:31.479170 systemd[1]: cri-containerd-51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3.scope: Deactivated successfully. Jul 7 00:39:31.489601 systemd[1]: Started cri-containerd-2d175aa157567570db95422cc5d94b0d3defe9d0f66fc582702fa52fdfea4474.scope - libcontainer container 2d175aa157567570db95422cc5d94b0d3defe9d0f66fc582702fa52fdfea4474. Jul 7 00:39:31.504327 containerd[1574]: time="2025-07-07T00:39:31.504271604Z" level=info msg="received exit event container_id:\"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" id:\"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" pid:5468 exit_status:1 exited_at:{seconds:1751848771 nanos:483352634}" Jul 7 00:39:31.505218 containerd[1574]: time="2025-07-07T00:39:31.505191133Z" level=info msg="TaskExit event in podsandbox handler container_id:\"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" id:\"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" pid:5468 exit_status:1 exited_at:{seconds:1751848771 nanos:483352634}" Jul 7 00:39:31.549015 containerd[1574]: time="2025-07-07T00:39:31.548873091Z" level=info msg="StartContainer for \"9442f34955012edce081529aa2fce7ba2ee0a22340458af485b0ae3b98ec9d07\" returns successfully" Jul 7 00:39:31.556168 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3-rootfs.mount: Deactivated successfully. 
Jul 7 00:39:31.580573 containerd[1574]: time="2025-07-07T00:39:31.579680684Z" level=info msg="StopContainer for \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" returns successfully" Jul 7 00:39:31.580573 containerd[1574]: time="2025-07-07T00:39:31.580435803Z" level=info msg="StopPodSandbox for \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\"" Jul 7 00:39:31.587790 containerd[1574]: time="2025-07-07T00:39:31.587759974Z" level=info msg="Container to stop \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 7 00:39:31.591719 containerd[1574]: time="2025-07-07T00:39:31.591683871Z" level=info msg="StartContainer for \"2d175aa157567570db95422cc5d94b0d3defe9d0f66fc582702fa52fdfea4474\" returns successfully" Jul 7 00:39:31.607467 systemd[1]: cri-containerd-da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1.scope: Deactivated successfully. Jul 7 00:39:31.609970 containerd[1574]: time="2025-07-07T00:39:31.609826251Z" level=info msg="TaskExit event in podsandbox handler container_id:\"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" id:\"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" pid:4927 exit_status:137 exited_at:{seconds:1751848771 nanos:609236703}" Jul 7 00:39:31.656113 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1-rootfs.mount: Deactivated successfully. 
Jul 7 00:39:31.683223 containerd[1574]: time="2025-07-07T00:39:31.683174236Z" level=info msg="shim disconnected" id=da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1 namespace=k8s.io Jul 7 00:39:31.683223 containerd[1574]: time="2025-07-07T00:39:31.683208992Z" level=warning msg="cleaning up after shim disconnected" id=da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1 namespace=k8s.io Jul 7 00:39:31.698016 containerd[1574]: time="2025-07-07T00:39:31.683216415Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 7 00:39:31.701109 containerd[1574]: time="2025-07-07T00:39:31.701024035Z" level=info msg="received exit event sandbox_id:\"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" exit_status:137 exited_at:{seconds:1751848771 nanos:609236703}" Jul 7 00:39:31.706203 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1-shm.mount: Deactivated successfully. Jul 7 00:39:31.825108 systemd-networkd[1487]: cali5020c1349e4: Link DOWN Jul 7 00:39:31.825386 systemd-networkd[1487]: cali5020c1349e4: Lost carrier Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.822 [INFO][5746] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.823 [INFO][5746] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" iface="eth0" netns="/var/run/netns/cni-921e0a8a-8e0a-45dc-2c79-cd1116983215" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.824 [INFO][5746] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" iface="eth0" netns="/var/run/netns/cni-921e0a8a-8e0a-45dc-2c79-cd1116983215" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.833 [INFO][5746] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" after=9.164771ms iface="eth0" netns="/var/run/netns/cni-921e0a8a-8e0a-45dc-2c79-cd1116983215" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.833 [INFO][5746] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.833 [INFO][5746] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.876 [INFO][5759] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.876 [INFO][5759] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.876 [INFO][5759] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.950 [INFO][5759] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.950 [INFO][5759] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.956 [INFO][5759] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:31.963979 containerd[1574]: 2025-07-07 00:39:31.959 [INFO][5746] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:31.966922 containerd[1574]: time="2025-07-07T00:39:31.964834386Z" level=info msg="TearDown network for sandbox \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" successfully" Jul 7 00:39:31.966922 containerd[1574]: time="2025-07-07T00:39:31.965097020Z" level=info msg="StopPodSandbox for \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" returns successfully" Jul 7 00:39:32.048693 kubelet[2925]: I0707 00:39:32.048211 2925 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57w6f\" (UniqueName: \"kubernetes.io/projected/d64b203e-aa4d-4795-9e34-4cec7d892738-kube-api-access-57w6f\") pod \"d64b203e-aa4d-4795-9e34-4cec7d892738\" (UID: \"d64b203e-aa4d-4795-9e34-4cec7d892738\") " Jul 7 00:39:32.049282 kubelet[2925]: I0707 00:39:32.048829 2925 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d64b203e-aa4d-4795-9e34-4cec7d892738-calico-apiserver-certs\") pod \"d64b203e-aa4d-4795-9e34-4cec7d892738\" (UID: \"d64b203e-aa4d-4795-9e34-4cec7d892738\") " Jul 7 00:39:32.067794 kubelet[2925]: I0707 00:39:32.064721 2925 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d64b203e-aa4d-4795-9e34-4cec7d892738-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "d64b203e-aa4d-4795-9e34-4cec7d892738" (UID: "d64b203e-aa4d-4795-9e34-4cec7d892738"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 7 00:39:32.067914 kubelet[2925]: I0707 00:39:32.065543 2925 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64b203e-aa4d-4795-9e34-4cec7d892738-kube-api-access-57w6f" (OuterVolumeSpecName: "kube-api-access-57w6f") pod "d64b203e-aa4d-4795-9e34-4cec7d892738" (UID: "d64b203e-aa4d-4795-9e34-4cec7d892738"). InnerVolumeSpecName "kube-api-access-57w6f". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 7 00:39:32.149689 kubelet[2925]: I0707 00:39:32.149642 2925 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57w6f\" (UniqueName: \"kubernetes.io/projected/d64b203e-aa4d-4795-9e34-4cec7d892738-kube-api-access-57w6f\") on node \"ci-4344-1-1-6-69f6cda1f4\" DevicePath \"\"" Jul 7 00:39:32.149689 kubelet[2925]: I0707 00:39:32.149668 2925 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d64b203e-aa4d-4795-9e34-4cec7d892738-calico-apiserver-certs\") on node \"ci-4344-1-1-6-69f6cda1f4\" DevicePath \"\"" Jul 7 00:39:32.205015 kubelet[2925]: I0707 00:39:32.204952 2925 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 7 00:39:32.211053 kubelet[2925]: I0707 00:39:32.210991 2925 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 7 00:39:32.372236 kubelet[2925]: I0707 00:39:32.370976 2925 scope.go:117] "RemoveContainer" containerID="51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3" Jul 7 00:39:32.371267 systemd[1]: Removed slice kubepods-besteffort-podd64b203e_aa4d_4795_9e34_4cec7d892738.slice - libcontainer container kubepods-besteffort-podd64b203e_aa4d_4795_9e34_4cec7d892738.slice. 
Jul 7 00:39:32.391890 containerd[1574]: time="2025-07-07T00:39:32.391807358Z" level=info msg="RemoveContainer for \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\"" Jul 7 00:39:32.405065 containerd[1574]: time="2025-07-07T00:39:32.405035458Z" level=info msg="RemoveContainer for \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" returns successfully" Jul 7 00:39:32.406864 kubelet[2925]: I0707 00:39:32.404913 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6f6ddc54db-bbxkn" podStartSLOduration=2.40489829 podStartE2EDuration="2.40489829s" podCreationTimestamp="2025-07-07 00:39:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-07 00:39:32.37892619 +0000 UTC m=+52.440055030" watchObservedRunningTime="2025-07-07 00:39:32.40489829 +0000 UTC m=+52.466027130" Jul 7 00:39:32.407377 kubelet[2925]: I0707 00:39:32.407179 2925 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-sw6gt" podStartSLOduration=21.394474927 podStartE2EDuration="33.407172034s" podCreationTimestamp="2025-07-07 00:38:59 +0000 UTC" firstStartedPulling="2025-07-07 00:39:19.392494813 +0000 UTC m=+39.453623653" lastFinishedPulling="2025-07-07 00:39:31.40519192 +0000 UTC m=+51.466320760" observedRunningTime="2025-07-07 00:39:32.404247045 +0000 UTC m=+52.465375885" watchObservedRunningTime="2025-07-07 00:39:32.407172034 +0000 UTC m=+52.468300885" Jul 7 00:39:32.416484 kubelet[2925]: I0707 00:39:32.416429 2925 scope.go:117] "RemoveContainer" containerID="51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3" Jul 7 00:39:32.421952 containerd[1574]: time="2025-07-07T00:39:32.417006734Z" level=error msg="ContainerStatus for \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\" failed" error="rpc error: code = NotFound desc = an error occurred 
when try to find container \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\": not found" Jul 7 00:39:32.426258 kubelet[2925]: E0707 00:39:32.425503 2925 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\": not found" containerID="51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3" Jul 7 00:39:32.435098 kubelet[2925]: I0707 00:39:32.427372 2925 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3"} err="failed to get container status \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\": rpc error: code = NotFound desc = an error occurred when try to find container \"51bea2a4e3844d8d694cf218aee06e0c3956a0efe6beef413e3639717a35f8d3\": not found" Jul 7 00:39:32.456302 systemd[1]: run-netns-cni\x2d921e0a8a\x2d8e0a\x2d45dc\x2d2c79\x2dcd1116983215.mount: Deactivated successfully. Jul 7 00:39:32.456569 systemd[1]: var-lib-kubelet-pods-d64b203e\x2daa4d\x2d4795\x2d9e34\x2d4cec7d892738-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Jul 7 00:39:32.456654 systemd[1]: var-lib-kubelet-pods-d64b203e\x2daa4d\x2d4795\x2d9e34\x2d4cec7d892738-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d57w6f.mount: Deactivated successfully. 
Jul 7 00:39:32.854951 systemd-networkd[1487]: cali5fa1b6b6225: Gained IPv6LL Jul 7 00:39:33.388656 kubelet[2925]: I0707 00:39:33.388383 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:34.039497 kubelet[2925]: I0707 00:39:34.039444 2925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d64b203e-aa4d-4795-9e34-4cec7d892738" path="/var/lib/kubelet/pods/d64b203e-aa4d-4795-9e34-4cec7d892738/volumes" Jul 7 00:39:36.218736 kubelet[2925]: I0707 00:39:36.218251 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:36.466821 containerd[1574]: time="2025-07-07T00:39:36.466740695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"d950e81c0756ee68202938c9f4154ed6dd4b04e09cfebace78a20bf93ac7103b\" pid:5789 exited_at:{seconds:1751848776 nanos:466036260}" Jul 7 00:39:36.572845 containerd[1574]: time="2025-07-07T00:39:36.572689369Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"6282f23fd2a36d8314a637673dff0ee05cd30563a0b2ace8d2eadc0abd168f77\" pid:5810 exited_at:{seconds:1751848776 nanos:572202093}" Jul 7 00:39:40.074662 containerd[1574]: time="2025-07-07T00:39:40.074621624Z" level=info msg="StopPodSandbox for \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\"" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.122 [WARNING][5840] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.122 [INFO][5840] cni-plugin/k8s.go 640: Cleaning up netns 
ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.122 [INFO][5840] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" iface="eth0" netns="" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.122 [INFO][5840] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.122 [INFO][5840] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.147 [INFO][5847] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.147 [INFO][5847] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.147 [INFO][5847] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.153 [WARNING][5847] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.153 [INFO][5847] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.154 [INFO][5847] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:40.159086 containerd[1574]: 2025-07-07 00:39:40.156 [INFO][5840] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.159086 containerd[1574]: time="2025-07-07T00:39:40.158761974Z" level=info msg="TearDown network for sandbox \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" successfully" Jul 7 00:39:40.159086 containerd[1574]: time="2025-07-07T00:39:40.158781420Z" level=info msg="StopPodSandbox for \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" returns successfully" Jul 7 00:39:40.163228 containerd[1574]: time="2025-07-07T00:39:40.162995711Z" level=info msg="RemovePodSandbox for \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\"" Jul 7 00:39:40.163228 containerd[1574]: time="2025-07-07T00:39:40.163016340Z" level=info msg="Forcibly stopping sandbox \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\"" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.191 [WARNING][5861] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.191 [INFO][5861] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.191 [INFO][5861] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" iface="eth0" netns="" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.191 [INFO][5861] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.191 [INFO][5861] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.208 [INFO][5868] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.209 [INFO][5868] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.209 [INFO][5868] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.213 [WARNING][5868] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.213 [INFO][5868] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" HandleID="k8s-pod-network.da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--c64ch-eth0" Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.214 [INFO][5868] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:39:40.217375 containerd[1574]: 2025-07-07 00:39:40.215 [INFO][5861] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1" Jul 7 00:39:40.218823 containerd[1574]: time="2025-07-07T00:39:40.217414918Z" level=info msg="TearDown network for sandbox \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" successfully" Jul 7 00:39:40.220796 containerd[1574]: time="2025-07-07T00:39:40.220771577Z" level=info msg="Ensure that sandbox da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1 in task-service has been cleanup successfully" Jul 7 00:39:40.224602 containerd[1574]: time="2025-07-07T00:39:40.224580587Z" level=info msg="RemovePodSandbox \"da69fe7161962d03d3fb3a0ac18dba4e5844969638c0900b0515b4dc5b5368a1\" returns successfully" Jul 7 00:39:43.565183 containerd[1574]: time="2025-07-07T00:39:43.564943521Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"4824bb3ff34a94c2bafbd704c194f5fbb184862381a73d843b8eceaa974466cc\" pid:5890 exited_at:{seconds:1751848783 nanos:564535333}" Jul 7 00:39:47.163251 
kubelet[2925]: I0707 00:39:47.163180 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:39:48.254552 update_engine[1536]: I20250707 00:39:48.254490 1536 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jul 7 00:39:48.254552 update_engine[1536]: I20250707 00:39:48.254544 1536 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jul 7 00:39:48.255951 update_engine[1536]: I20250707 00:39:48.255925 1536 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jul 7 00:39:48.257113 update_engine[1536]: I20250707 00:39:48.257083 1536 omaha_request_params.cc:62] Current group set to beta Jul 7 00:39:48.257380 update_engine[1536]: I20250707 00:39:48.257336 1536 update_attempter.cc:499] Already updated boot flags. Skipping. Jul 7 00:39:48.257380 update_engine[1536]: I20250707 00:39:48.257350 1536 update_attempter.cc:643] Scheduling an action processor start. Jul 7 00:39:48.257380 update_engine[1536]: I20250707 00:39:48.257367 1536 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jul 7 00:39:48.257995 update_engine[1536]: I20250707 00:39:48.257395 1536 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jul 7 00:39:48.257995 update_engine[1536]: I20250707 00:39:48.257436 1536 omaha_request_action.cc:271] Posting an Omaha request to disabled Jul 7 00:39:48.257995 update_engine[1536]: I20250707 00:39:48.257442 1536 omaha_request_action.cc:272] Request: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: Jul 7 00:39:48.257995 update_engine[1536]: I20250707 00:39:48.257447 1536 libcurl_http_fetcher.cc:47] 
Starting/Resuming transfer Jul 7 00:39:48.276662 locksmithd[1578]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jul 7 00:39:48.285726 update_engine[1536]: I20250707 00:39:48.285683 1536 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 7 00:39:48.286999 update_engine[1536]: I20250707 00:39:48.286119 1536 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 7 00:39:48.288283 update_engine[1536]: E20250707 00:39:48.288198 1536 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 7 00:39:48.288283 update_engine[1536]: I20250707 00:39:48.288264 1536 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jul 7 00:39:51.498510 containerd[1574]: time="2025-07-07T00:39:51.498465464Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"daa0b837ee9109aa3d7bb02af29e2e2de169692db9ab5472918b04f449631182\" pid:5922 exited_at:{seconds:1751848791 nanos:497881917}" Jul 7 00:39:56.360097 containerd[1574]: time="2025-07-07T00:39:56.360038919Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"645f9517ec5a74b54651cec1ddc3cc059adf4c118f797ed341c28a123fc085b8\" pid:5946 exited_at:{seconds:1751848796 nanos:314417190}" Jul 7 00:39:58.196006 update_engine[1536]: I20250707 00:39:58.195932 1536 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 7 00:39:58.197011 update_engine[1536]: I20250707 00:39:58.196154 1536 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 7 00:39:58.198678 update_engine[1536]: I20250707 00:39:58.198642 1536 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jul 7 00:39:58.198778 update_engine[1536]: E20250707 00:39:58.198755 1536 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 7 00:39:58.198815 update_engine[1536]: I20250707 00:39:58.198796 1536 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jul 7 00:40:06.156741 kubelet[2925]: I0707 00:40:06.156302 2925 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 7 00:40:06.341643 containerd[1574]: time="2025-07-07T00:40:06.341614665Z" level=info msg="StopContainer for \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" with timeout 30 (s)" Jul 7 00:40:06.357499 containerd[1574]: time="2025-07-07T00:40:06.357444014Z" level=info msg="Stop container \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" with signal terminated" Jul 7 00:40:06.402075 systemd[1]: cri-containerd-dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6.scope: Deactivated successfully. Jul 7 00:40:06.436147 containerd[1574]: time="2025-07-07T00:40:06.435569576Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" id:\"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" pid:5432 exit_status:1 exited_at:{seconds:1751848806 nanos:435135891}" Jul 7 00:40:06.436682 containerd[1574]: time="2025-07-07T00:40:06.436636991Z" level=info msg="received exit event container_id:\"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" id:\"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" pid:5432 exit_status:1 exited_at:{seconds:1751848806 nanos:435135891}" Jul 7 00:40:06.491508 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6-rootfs.mount: Deactivated successfully. 
Jul 7 00:40:06.528233 containerd[1574]: time="2025-07-07T00:40:06.528189048Z" level=info msg="StopContainer for \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" returns successfully" Jul 7 00:40:06.536813 containerd[1574]: time="2025-07-07T00:40:06.536653449Z" level=info msg="StopPodSandbox for \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\"" Jul 7 00:40:06.539596 containerd[1574]: time="2025-07-07T00:40:06.539513572Z" level=info msg="Container to stop \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 7 00:40:06.558682 systemd[1]: cri-containerd-b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea.scope: Deactivated successfully. Jul 7 00:40:06.559882 containerd[1574]: time="2025-07-07T00:40:06.559444607Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" pid:4843 exit_status:137 exited_at:{seconds:1751848806 nanos:559004450}" Jul 7 00:40:06.591659 containerd[1574]: time="2025-07-07T00:40:06.590271379Z" level=info msg="shim disconnected" id=b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea namespace=k8s.io Jul 7 00:40:06.591659 containerd[1574]: time="2025-07-07T00:40:06.590400794Z" level=warning msg="cleaning up after shim disconnected" id=b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea namespace=k8s.io Jul 7 00:40:06.591659 containerd[1574]: time="2025-07-07T00:40:06.590411704Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 7 00:40:06.596037 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea-rootfs.mount: Deactivated successfully. 
Jul 7 00:40:06.657089 containerd[1574]: time="2025-07-07T00:40:06.657045006Z" level=info msg="received exit event sandbox_id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" exit_status:137 exited_at:{seconds:1751848806 nanos:559004450}" Jul 7 00:40:06.661559 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea-shm.mount: Deactivated successfully. Jul 7 00:40:06.672356 containerd[1574]: time="2025-07-07T00:40:06.672204917Z" level=error msg="Failed to handle event container_id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" pid:4843 exit_status:137 exited_at:{seconds:1751848806 nanos:559004450} for b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" error="failed to handle container TaskExit event: failed to stop sandbox: failed to delete task: ttrpc: closed" Jul 7 00:40:06.672356 containerd[1574]: time="2025-07-07T00:40:06.672300836Z" level=info msg="Events for \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" is in backoff, enqueue event container_id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" pid:4843 exit_status:137 exited_at:{seconds:1751848806 nanos:649074462}" Jul 7 00:40:06.718739 kubelet[2925]: I0707 00:40:06.717106 2925 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:06.878473 systemd-networkd[1487]: cali9bd30d67777: Link DOWN Jul 7 00:40:06.878487 systemd-networkd[1487]: cali9bd30d67777: Lost carrier Jul 7 00:40:06.962544 containerd[1574]: time="2025-07-07T00:40:06.962491089Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" 
id:\"a83ed40e59b7788d91f86660fcd57abe8e6ee1463ffeab74d85cd9b3790be7bb\" pid:6000 exited_at:{seconds:1751848806 nanos:961843442}" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:06.861 [INFO][6057] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:06.863 [INFO][6057] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" iface="eth0" netns="/var/run/netns/cni-9ff5854a-67c4-b417-beee-352b12448ae7" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:06.863 [INFO][6057] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" iface="eth0" netns="/var/run/netns/cni-9ff5854a-67c4-b417-beee-352b12448ae7" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:06.870 [INFO][6057] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" after=7.699655ms iface="eth0" netns="/var/run/netns/cni-9ff5854a-67c4-b417-beee-352b12448ae7" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:06.870 [INFO][6057] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:06.870 [INFO][6057] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:07.054 [INFO][6067] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:07.058 [INFO][6067] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:07.058 [INFO][6067] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:07.218 [INFO][6067] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:07.218 [INFO][6067] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:07.220 [INFO][6067] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:40:07.226834 containerd[1574]: 2025-07-07 00:40:07.224 [INFO][6057] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:07.229032 containerd[1574]: time="2025-07-07T00:40:07.228764665Z" level=info msg="TearDown network for sandbox \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" successfully" Jul 7 00:40:07.229032 containerd[1574]: time="2025-07-07T00:40:07.228791054Z" level=info msg="StopPodSandbox for \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" returns successfully" Jul 7 00:40:07.232991 systemd[1]: run-netns-cni\x2d9ff5854a\x2d67c4\x2db417\x2dbeee\x2d352b12448ae7.mount: Deactivated successfully. 
Jul 7 00:40:07.521680 kubelet[2925]: I0707 00:40:07.521028 2925 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c58790dd-630c-4e7b-9398-a00ca059225e-calico-apiserver-certs\") pod \"c58790dd-630c-4e7b-9398-a00ca059225e\" (UID: \"c58790dd-630c-4e7b-9398-a00ca059225e\") " Jul 7 00:40:07.521680 kubelet[2925]: I0707 00:40:07.521090 2925 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmpkp\" (UniqueName: \"kubernetes.io/projected/c58790dd-630c-4e7b-9398-a00ca059225e-kube-api-access-kmpkp\") pod \"c58790dd-630c-4e7b-9398-a00ca059225e\" (UID: \"c58790dd-630c-4e7b-9398-a00ca059225e\") " Jul 7 00:40:07.564254 systemd[1]: var-lib-kubelet-pods-c58790dd\x2d630c\x2d4e7b\x2d9398\x2da00ca059225e-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Jul 7 00:40:07.564359 systemd[1]: var-lib-kubelet-pods-c58790dd\x2d630c\x2d4e7b\x2d9398\x2da00ca059225e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkmpkp.mount: Deactivated successfully. Jul 7 00:40:07.572689 kubelet[2925]: I0707 00:40:07.563852 2925 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58790dd-630c-4e7b-9398-a00ca059225e-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "c58790dd-630c-4e7b-9398-a00ca059225e" (UID: "c58790dd-630c-4e7b-9398-a00ca059225e"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 7 00:40:07.572689 kubelet[2925]: I0707 00:40:07.571980 2925 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58790dd-630c-4e7b-9398-a00ca059225e-kube-api-access-kmpkp" (OuterVolumeSpecName: "kube-api-access-kmpkp") pod "c58790dd-630c-4e7b-9398-a00ca059225e" (UID: "c58790dd-630c-4e7b-9398-a00ca059225e"). InnerVolumeSpecName "kube-api-access-kmpkp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 7 00:40:07.622419 kubelet[2925]: I0707 00:40:07.622358 2925 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c58790dd-630c-4e7b-9398-a00ca059225e-calico-apiserver-certs\") on node \"ci-4344-1-1-6-69f6cda1f4\" DevicePath \"\"" Jul 7 00:40:07.622419 kubelet[2925]: I0707 00:40:07.622408 2925 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmpkp\" (UniqueName: \"kubernetes.io/projected/c58790dd-630c-4e7b-9398-a00ca059225e-kube-api-access-kmpkp\") on node \"ci-4344-1-1-6-69f6cda1f4\" DevicePath \"\"" Jul 7 00:40:07.684017 containerd[1574]: time="2025-07-07T00:40:07.683940400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" pid:4843 exit_status:137 exited_at:{seconds:1751848806 nanos:559004450}" Jul 7 00:40:07.685118 containerd[1574]: time="2025-07-07T00:40:07.685044334Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" id:\"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" pid:4843 exit_status:137 exited_at:{seconds:1751848806 nanos:649074462}" Jul 7 00:40:07.735995 systemd[1]: Removed slice kubepods-besteffort-podc58790dd_630c_4e7b_9398_a00ca059225e.slice - libcontainer container kubepods-besteffort-podc58790dd_630c_4e7b_9398_a00ca059225e.slice. 
Jul 7 00:40:08.049243 kubelet[2925]: I0707 00:40:08.043418 2925 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58790dd-630c-4e7b-9398-a00ca059225e" path="/var/lib/kubelet/pods/c58790dd-630c-4e7b-9398-a00ca059225e/volumes" Jul 7 00:40:08.190299 update_engine[1536]: I20250707 00:40:08.189754 1536 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 7 00:40:08.190299 update_engine[1536]: I20250707 00:40:08.190012 1536 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 7 00:40:08.190299 update_engine[1536]: I20250707 00:40:08.190245 1536 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 7 00:40:08.190978 update_engine[1536]: E20250707 00:40:08.190943 1536 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 7 00:40:08.191087 update_engine[1536]: I20250707 00:40:08.191063 1536 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jul 7 00:40:13.539602 containerd[1574]: time="2025-07-07T00:40:13.539562407Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"3196f59a0f7f6580f13e01907ee2ffdc94cf2a8ace5da3603a2a9c3ec0f61ed1\" pid:6098 exited_at:{seconds:1751848813 nanos:526495360}" Jul 7 00:40:16.420942 containerd[1574]: time="2025-07-07T00:40:16.420876535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"b835a3fde7ad4be76e8390c71fa9ea39b7fe65c91b2daabe29ef85beca135111\" pid:6120 exited_at:{seconds:1751848816 nanos:420576120}" Jul 7 00:40:18.197795 update_engine[1536]: I20250707 00:40:18.197731 1536 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 7 00:40:18.198096 update_engine[1536]: I20250707 00:40:18.197979 1536 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 7 00:40:18.198250 update_engine[1536]: I20250707 00:40:18.198212 1536 
libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 7 00:40:18.198812 update_engine[1536]: E20250707 00:40:18.198509 1536 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 7 00:40:18.198812 update_engine[1536]: I20250707 00:40:18.198536 1536 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jul 7 00:40:18.198812 update_engine[1536]: I20250707 00:40:18.198542 1536 omaha_request_action.cc:617] Omaha request response: Jul 7 00:40:18.198812 update_engine[1536]: E20250707 00:40:18.198611 1536 omaha_request_action.cc:636] Omaha request network transfer failed. Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201458 1536 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201473 1536 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201478 1536 update_attempter.cc:306] Processing Done. Jul 7 00:40:18.201866 update_engine[1536]: E20250707 00:40:18.201490 1536 update_attempter.cc:619] Update failed. Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201493 1536 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201498 1536 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201500 1536 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201563 1536 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201581 1536 omaha_request_action.cc:271] Posting an Omaha request to disabled Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201584 1536 omaha_request_action.cc:272] Request: Jul 7 00:40:18.201866 update_engine[1536]: Jul 7 00:40:18.201866 update_engine[1536]: Jul 7 00:40:18.201866 update_engine[1536]: Jul 7 00:40:18.201866 update_engine[1536]: Jul 7 00:40:18.201866 update_engine[1536]: Jul 7 00:40:18.201866 update_engine[1536]: Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201588 1536 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201689 1536 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 7 00:40:18.201866 update_engine[1536]: I20250707 00:40:18.201843 1536 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 7 00:40:18.203892 update_engine[1536]: E20250707 00:40:18.202507 1536 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 7 00:40:18.203892 update_engine[1536]: I20250707 00:40:18.202529 1536 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jul 7 00:40:18.203892 update_engine[1536]: I20250707 00:40:18.202534 1536 omaha_request_action.cc:617] Omaha request response: Jul 7 00:40:18.203892 update_engine[1536]: I20250707 00:40:18.202538 1536 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jul 7 00:40:18.203892 update_engine[1536]: I20250707 00:40:18.202541 1536 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jul 7 00:40:18.203892 update_engine[1536]: I20250707 00:40:18.202544 1536 update_attempter.cc:306] Processing Done. 
Jul 7 00:40:18.203892 update_engine[1536]: I20250707 00:40:18.202548 1536 update_attempter.cc:310] Error event sent. Jul 7 00:40:18.203892 update_engine[1536]: I20250707 00:40:18.203383 1536 update_check_scheduler.cc:74] Next update check in 43m34s Jul 7 00:40:18.204329 locksmithd[1578]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jul 7 00:40:18.204329 locksmithd[1578]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jul 7 00:40:26.276336 containerd[1574]: time="2025-07-07T00:40:26.276194364Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"47d5155f91f005574b361241895e5f8ea62373b6a06a2b10a0ef03df8aa4910f\" pid:6146 exited_at:{seconds:1751848826 nanos:275849116}" Jul 7 00:40:36.549787 containerd[1574]: time="2025-07-07T00:40:36.549742249Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"62c4a9dc7ea6a45b267cf659441cc43e154566faafbed584534640bf11881dcc\" pid:6170 exited_at:{seconds:1751848836 nanos:543548206}" Jul 7 00:40:40.244412 kubelet[2925]: I0707 00:40:40.244373 2925 scope.go:117] "RemoveContainer" containerID="dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6" Jul 7 00:40:40.247455 containerd[1574]: time="2025-07-07T00:40:40.247425410Z" level=info msg="RemoveContainer for \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\"" Jul 7 00:40:40.273747 containerd[1574]: time="2025-07-07T00:40:40.273658424Z" level=info msg="RemoveContainer for \"dca58601c2eda1ae05c00213daa6638c44f38179cd8061ada6430c6e49784ad6\" returns successfully" Jul 7 00:40:40.275187 containerd[1574]: time="2025-07-07T00:40:40.275099441Z" level=info msg="StopPodSandbox for \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\"" Jul 7 00:40:40.521694 
containerd[1574]: 2025-07-07 00:40:40.380 [WARNING][6196] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.383 [INFO][6196] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.383 [INFO][6196] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" iface="eth0" netns="" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.383 [INFO][6196] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.385 [INFO][6196] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.503 [INFO][6203] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.503 [INFO][6203] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.503 [INFO][6203] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.513 [WARNING][6203] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.513 [INFO][6203] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.515 [INFO][6203] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:40:40.521694 containerd[1574]: 2025-07-07 00:40:40.518 [INFO][6196] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.521694 containerd[1574]: time="2025-07-07T00:40:40.521470105Z" level=info msg="TearDown network for sandbox \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" successfully" Jul 7 00:40:40.521694 containerd[1574]: time="2025-07-07T00:40:40.521499410Z" level=info msg="StopPodSandbox for \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" returns successfully" Jul 7 00:40:40.525651 containerd[1574]: time="2025-07-07T00:40:40.522378997Z" level=info msg="RemovePodSandbox for \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\"" Jul 7 00:40:40.525651 containerd[1574]: time="2025-07-07T00:40:40.522414605Z" level=info msg="Forcibly stopping sandbox \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\"" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.557 [WARNING][6217] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" WorkloadEndpoint="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.557 [INFO][6217] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.557 [INFO][6217] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" iface="eth0" netns="" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.557 [INFO][6217] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.557 [INFO][6217] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.576 [INFO][6225] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.576 [INFO][6225] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.577 [INFO][6225] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.581 [WARNING][6225] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.581 [INFO][6225] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" HandleID="k8s-pod-network.b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Workload="ci--4344--1--1--6--69f6cda1f4-k8s-calico--apiserver--5c47bcf6f--2m6k2-eth0" Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.582 [INFO][6225] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 7 00:40:40.586026 containerd[1574]: 2025-07-07 00:40:40.584 [INFO][6217] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea" Jul 7 00:40:40.586412 containerd[1574]: time="2025-07-07T00:40:40.586052624Z" level=info msg="TearDown network for sandbox \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" successfully" Jul 7 00:40:40.588307 containerd[1574]: time="2025-07-07T00:40:40.588282023Z" level=info msg="Ensure that sandbox b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea in task-service has been cleanup successfully" Jul 7 00:40:40.591043 containerd[1574]: time="2025-07-07T00:40:40.591014731Z" level=info msg="RemovePodSandbox \"b428b5a04a612e811f606793716e44bab7b9ac733c8a92ade52f2b3db085ecea\" returns successfully" Jul 7 00:40:43.381660 containerd[1574]: time="2025-07-07T00:40:43.381516162Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"c41b2bce3138759b410a4b3db18bb37531e3b48f052197c2d2c40a19c777ebe4\" pid:6243 exited_at:{seconds:1751848843 nanos:380769169}" Jul 7 00:40:51.463217 
containerd[1574]: time="2025-07-07T00:40:51.463162068Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"05513bffc9b5affd9a606dea7c31e891e24b2d0320dd3133c3aa7cb35928ec77\" pid:6272 exited_at:{seconds:1751848851 nanos:462942651}" Jul 7 00:40:56.281851 containerd[1574]: time="2025-07-07T00:40:56.281798277Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"62a7a89961011f6a4273eca63549da13a598efbef0d0eb6c49f05d02dc4b21b4\" pid:6315 exited_at:{seconds:1751848856 nanos:281438223}" Jul 7 00:41:06.574841 containerd[1574]: time="2025-07-07T00:41:06.574730633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"36ab192933918c8cdb4aab569dc5965cf1a985cf720dc0aef576d0fc2875196d\" pid:6337 exited_at:{seconds:1751848866 nanos:574429201}" Jul 7 00:41:13.259870 containerd[1574]: time="2025-07-07T00:41:13.259820010Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"4701c22eccb694292e873955d3a7ee7018c9a33d1201d550848af0035bbe3cab\" pid:6360 exited_at:{seconds:1751848873 nanos:259501217}" Jul 7 00:41:16.280963 containerd[1574]: time="2025-07-07T00:41:16.280897938Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"0f6c5589761af5c6fcbb1de9be68e6cba07410f6c18aa9143d78b36cb794a643\" pid:6387 exited_at:{seconds:1751848876 nanos:280597489}" Jul 7 00:41:26.279536 containerd[1574]: time="2025-07-07T00:41:26.279463663Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"22772bab8db865f901c90f7c67c217f06362440027c76482f45484ae30dd8076\" pid:6412 
exited_at:{seconds:1751848886 nanos:278837889}" Jul 7 00:41:36.549777 containerd[1574]: time="2025-07-07T00:41:36.549729325Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"ced147c9a662462c7fe0f5025f6b8c9166b467ba8464f1e4feb6192674ff3413\" pid:6436 exited_at:{seconds:1751848896 nanos:549289903}" Jul 7 00:41:43.267348 containerd[1574]: time="2025-07-07T00:41:43.267308299Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"f96667e3eb2c80dd214d331b327673cfdafec20e9f8a30201bfa3e202f485b46\" pid:6461 exited_at:{seconds:1751848903 nanos:266949442}" Jul 7 00:41:51.440773 systemd[1]: Started sshd@9-65.108.89.120:22-147.75.109.163:36438.service - OpenSSH per-connection server daemon (147.75.109.163:36438). Jul 7 00:41:51.468360 containerd[1574]: time="2025-07-07T00:41:51.468326804Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"1ff038acf196ba13dcef4c27eba82f3bf35dc8f2a88a5e611bf0c0395a91bbe6\" pid:6493 exited_at:{seconds:1751848911 nanos:467845596}" Jul 7 00:41:52.479033 sshd[6499]: Accepted publickey for core from 147.75.109.163 port 36438 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:41:52.485044 sshd-session[6499]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:41:52.496131 systemd-logind[1531]: New session 8 of user core. Jul 7 00:41:52.501870 systemd[1]: Started session-8.scope - Session 8 of User core. Jul 7 00:41:53.621403 sshd[6504]: Connection closed by 147.75.109.163 port 36438 Jul 7 00:41:53.622202 sshd-session[6499]: pam_unix(sshd:session): session closed for user core Jul 7 00:41:53.631846 systemd[1]: sshd@9-65.108.89.120:22-147.75.109.163:36438.service: Deactivated successfully. 
Jul 7 00:41:53.636293 systemd[1]: session-8.scope: Deactivated successfully. Jul 7 00:41:53.639937 systemd-logind[1531]: Session 8 logged out. Waiting for processes to exit. Jul 7 00:41:53.643026 systemd-logind[1531]: Removed session 8. Jul 7 00:41:56.306339 containerd[1574]: time="2025-07-07T00:41:56.306288110Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"5aee9b53008901e0b2c4b12881acfa13f177471bdcbcb0e86db6fe444633bb0e\" pid:6532 exited_at:{seconds:1751848916 nanos:297479521}" Jul 7 00:41:58.793986 systemd[1]: Started sshd@10-65.108.89.120:22-147.75.109.163:43524.service - OpenSSH per-connection server daemon (147.75.109.163:43524). Jul 7 00:41:59.836768 sshd[6542]: Accepted publickey for core from 147.75.109.163 port 43524 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:41:59.838441 sshd-session[6542]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:41:59.843317 systemd-logind[1531]: New session 9 of user core. Jul 7 00:41:59.849859 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 7 00:42:00.667193 sshd[6550]: Connection closed by 147.75.109.163 port 43524 Jul 7 00:42:00.669404 sshd-session[6542]: pam_unix(sshd:session): session closed for user core Jul 7 00:42:00.673627 systemd-logind[1531]: Session 9 logged out. Waiting for processes to exit. Jul 7 00:42:00.673774 systemd[1]: sshd@10-65.108.89.120:22-147.75.109.163:43524.service: Deactivated successfully. Jul 7 00:42:00.675687 systemd[1]: session-9.scope: Deactivated successfully. Jul 7 00:42:00.677854 systemd-logind[1531]: Removed session 9. Jul 7 00:42:00.848321 systemd[1]: Started sshd@11-65.108.89.120:22-147.75.109.163:43528.service - OpenSSH per-connection server daemon (147.75.109.163:43528). 
Jul 7 00:42:01.871438 sshd[6564]: Accepted publickey for core from 147.75.109.163 port 43528 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:42:01.872648 sshd-session[6564]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:42:01.877662 systemd-logind[1531]: New session 10 of user core. Jul 7 00:42:01.881841 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 7 00:42:02.670974 sshd[6566]: Connection closed by 147.75.109.163 port 43528 Jul 7 00:42:02.671351 sshd-session[6564]: pam_unix(sshd:session): session closed for user core Jul 7 00:42:02.675468 systemd[1]: sshd@11-65.108.89.120:22-147.75.109.163:43528.service: Deactivated successfully. Jul 7 00:42:02.677560 systemd[1]: session-10.scope: Deactivated successfully. Jul 7 00:42:02.680630 systemd-logind[1531]: Session 10 logged out. Waiting for processes to exit. Jul 7 00:42:02.682955 systemd-logind[1531]: Removed session 10. Jul 7 00:42:02.849421 systemd[1]: Started sshd@12-65.108.89.120:22-147.75.109.163:43540.service - OpenSSH per-connection server daemon (147.75.109.163:43540). Jul 7 00:42:03.893672 sshd[6576]: Accepted publickey for core from 147.75.109.163 port 43540 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:42:03.895031 sshd-session[6576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:42:03.900296 systemd-logind[1531]: New session 11 of user core. Jul 7 00:42:03.902831 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 7 00:42:04.672708 sshd[6578]: Connection closed by 147.75.109.163 port 43540 Jul 7 00:42:04.673233 sshd-session[6576]: pam_unix(sshd:session): session closed for user core Jul 7 00:42:04.676625 systemd-logind[1531]: Session 11 logged out. Waiting for processes to exit. Jul 7 00:42:04.676744 systemd[1]: sshd@12-65.108.89.120:22-147.75.109.163:43540.service: Deactivated successfully. 
Jul 7 00:42:04.678438 systemd[1]: session-11.scope: Deactivated successfully. Jul 7 00:42:04.680141 systemd-logind[1531]: Removed session 11. Jul 7 00:42:06.575186 containerd[1574]: time="2025-07-07T00:42:06.575144570Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"3e9fdce2cfb05e0c8ec2e8f63f9a80623e52397f4754cd510ee0dac5c53b76b5\" pid:6606 exited_at:{seconds:1751848926 nanos:574533056}" Jul 7 00:42:09.842939 systemd[1]: Started sshd@13-65.108.89.120:22-147.75.109.163:58614.service - OpenSSH per-connection server daemon (147.75.109.163:58614). Jul 7 00:42:10.894150 sshd[6620]: Accepted publickey for core from 147.75.109.163 port 58614 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:42:10.895960 sshd-session[6620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:42:10.904786 systemd-logind[1531]: New session 12 of user core. Jul 7 00:42:10.911997 systemd[1]: Started session-12.scope - Session 12 of User core. Jul 7 00:42:11.683785 sshd[6622]: Connection closed by 147.75.109.163 port 58614 Jul 7 00:42:11.686619 sshd-session[6620]: pam_unix(sshd:session): session closed for user core Jul 7 00:42:11.691559 systemd-logind[1531]: Session 12 logged out. Waiting for processes to exit. Jul 7 00:42:11.692019 systemd[1]: sshd@13-65.108.89.120:22-147.75.109.163:58614.service: Deactivated successfully. Jul 7 00:42:11.693878 systemd[1]: session-12.scope: Deactivated successfully. Jul 7 00:42:11.695455 systemd-logind[1531]: Removed session 12. 
Jul 7 00:42:13.348328 containerd[1574]: time="2025-07-07T00:42:13.348286698Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"3372187fa0a11d7c9ffc4052be21c4c663af8c4bc1932df012872faa815a704a\" pid:6648 exited_at:{seconds:1751848933 nanos:347861525}" Jul 7 00:42:16.311900 containerd[1574]: time="2025-07-07T00:42:16.311827408Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"1fe5029ab45b699b36e170bced7b7f9173a518cf62a92941588056857a211ec3\" pid:6674 exited_at:{seconds:1751848936 nanos:311376958}" Jul 7 00:42:16.863122 systemd[1]: Started sshd@14-65.108.89.120:22-147.75.109.163:34774.service - OpenSSH per-connection server daemon (147.75.109.163:34774). Jul 7 00:42:17.897059 sshd[6685]: Accepted publickey for core from 147.75.109.163 port 34774 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:42:17.898474 sshd-session[6685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:42:17.903905 systemd-logind[1531]: New session 13 of user core. Jul 7 00:42:17.908846 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 7 00:42:18.712370 sshd[6687]: Connection closed by 147.75.109.163 port 34774 Jul 7 00:42:18.713650 sshd-session[6685]: pam_unix(sshd:session): session closed for user core Jul 7 00:42:18.716914 systemd[1]: sshd@14-65.108.89.120:22-147.75.109.163:34774.service: Deactivated successfully. Jul 7 00:42:18.719783 systemd[1]: session-13.scope: Deactivated successfully. Jul 7 00:42:18.721223 systemd-logind[1531]: Session 13 logged out. Waiting for processes to exit. Jul 7 00:42:18.723332 systemd-logind[1531]: Removed session 13. Jul 7 00:42:19.696454 systemd[1]: Started sshd@15-65.108.89.120:22-152.32.189.21:41882.service - OpenSSH per-connection server daemon (152.32.189.21:41882). 
Jul 7 00:42:20.774764 sshd[6701]: Received disconnect from 152.32.189.21 port 41882:11: Bye Bye [preauth] Jul 7 00:42:20.774764 sshd[6701]: Disconnected from authenticating user root 152.32.189.21 port 41882 [preauth] Jul 7 00:42:20.776662 systemd[1]: sshd@15-65.108.89.120:22-152.32.189.21:41882.service: Deactivated successfully. Jul 7 00:42:23.892592 systemd[1]: Started sshd@16-65.108.89.120:22-147.75.109.163:34790.service - OpenSSH per-connection server daemon (147.75.109.163:34790). Jul 7 00:42:24.935723 sshd[6720]: Accepted publickey for core from 147.75.109.163 port 34790 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E Jul 7 00:42:24.937134 sshd-session[6720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 7 00:42:24.942580 systemd-logind[1531]: New session 14 of user core. Jul 7 00:42:24.946797 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 7 00:42:25.744693 sshd[6722]: Connection closed by 147.75.109.163 port 34790 Jul 7 00:42:25.745269 sshd-session[6720]: pam_unix(sshd:session): session closed for user core Jul 7 00:42:25.750480 systemd[1]: sshd@16-65.108.89.120:22-147.75.109.163:34790.service: Deactivated successfully. Jul 7 00:42:25.752472 systemd[1]: session-14.scope: Deactivated successfully. Jul 7 00:42:25.754311 systemd-logind[1531]: Session 14 logged out. Waiting for processes to exit. Jul 7 00:42:25.755629 systemd-logind[1531]: Removed session 14. Jul 7 00:42:25.922756 systemd[1]: Started sshd@17-65.108.89.120:22-147.75.109.163:34796.service - OpenSSH per-connection server daemon (147.75.109.163:34796). 
Jul 7 00:42:26.272189 containerd[1574]: time="2025-07-07T00:42:26.272151505Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"ec462505f61faeed03c7dac80f5fa38b7d2881afe6d4412e1735e5da255ac8c5\" pid:6748 exited_at:{seconds:1751848946 nanos:271846409}"
Jul 7 00:42:26.950353 sshd[6733]: Accepted publickey for core from 147.75.109.163 port 34796 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E
Jul 7 00:42:26.950234 sshd-session[6733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:42:26.954896 systemd-logind[1531]: New session 15 of user core.
Jul 7 00:42:26.963830 systemd[1]: Started session-15.scope - Session 15 of User core.
Jul 7 00:42:27.900841 sshd[6757]: Connection closed by 147.75.109.163 port 34796
Jul 7 00:42:27.914202 sshd-session[6733]: pam_unix(sshd:session): session closed for user core
Jul 7 00:42:27.923122 systemd-logind[1531]: Session 15 logged out. Waiting for processes to exit.
Jul 7 00:42:27.925378 systemd[1]: sshd@17-65.108.89.120:22-147.75.109.163:34796.service: Deactivated successfully.
Jul 7 00:42:27.928827 systemd[1]: session-15.scope: Deactivated successfully.
Jul 7 00:42:27.932278 systemd-logind[1531]: Removed session 15.
Jul 7 00:42:28.078328 systemd[1]: Started sshd@18-65.108.89.120:22-147.75.109.163:58226.service - OpenSSH per-connection server daemon (147.75.109.163:58226).
Jul 7 00:42:29.142102 sshd[6767]: Accepted publickey for core from 147.75.109.163 port 58226 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E
Jul 7 00:42:29.143742 sshd-session[6767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:42:29.148727 systemd-logind[1531]: New session 16 of user core.
Jul 7 00:42:29.152863 systemd[1]: Started session-16.scope - Session 16 of User core.
Jul 7 00:42:30.943152 sshd[6769]: Connection closed by 147.75.109.163 port 58226
Jul 7 00:42:30.944075 sshd-session[6767]: pam_unix(sshd:session): session closed for user core
Jul 7 00:42:30.950989 systemd-logind[1531]: Session 16 logged out. Waiting for processes to exit.
Jul 7 00:42:30.951074 systemd[1]: sshd@18-65.108.89.120:22-147.75.109.163:58226.service: Deactivated successfully.
Jul 7 00:42:30.952866 systemd[1]: session-16.scope: Deactivated successfully.
Jul 7 00:42:30.954582 systemd-logind[1531]: Removed session 16.
Jul 7 00:42:31.117447 systemd[1]: Started sshd@19-65.108.89.120:22-147.75.109.163:58242.service - OpenSSH per-connection server daemon (147.75.109.163:58242).
Jul 7 00:42:32.153570 sshd[6786]: Accepted publickey for core from 147.75.109.163 port 58242 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E
Jul 7 00:42:32.154987 sshd-session[6786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:42:32.161751 systemd-logind[1531]: New session 17 of user core.
Jul 7 00:42:32.165990 systemd[1]: Started session-17.scope - Session 17 of User core.
Jul 7 00:42:33.263244 sshd[6788]: Connection closed by 147.75.109.163 port 58242
Jul 7 00:42:33.263815 sshd-session[6786]: pam_unix(sshd:session): session closed for user core
Jul 7 00:42:33.269189 systemd-logind[1531]: Session 17 logged out. Waiting for processes to exit.
Jul 7 00:42:33.269326 systemd[1]: sshd@19-65.108.89.120:22-147.75.109.163:58242.service: Deactivated successfully.
Jul 7 00:42:33.270904 systemd[1]: session-17.scope: Deactivated successfully.
Jul 7 00:42:33.271926 systemd-logind[1531]: Removed session 17.
Jul 7 00:42:33.440863 systemd[1]: Started sshd@20-65.108.89.120:22-147.75.109.163:58250.service - OpenSSH per-connection server daemon (147.75.109.163:58250).
Jul 7 00:42:34.508080 sshd[6798]: Accepted publickey for core from 147.75.109.163 port 58250 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E
Jul 7 00:42:34.509605 sshd-session[6798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:42:34.514632 systemd-logind[1531]: New session 18 of user core.
Jul 7 00:42:34.519834 systemd[1]: Started session-18.scope - Session 18 of User core.
Jul 7 00:42:35.299761 sshd[6800]: Connection closed by 147.75.109.163 port 58250
Jul 7 00:42:35.300416 sshd-session[6798]: pam_unix(sshd:session): session closed for user core
Jul 7 00:42:35.303887 systemd[1]: sshd@20-65.108.89.120:22-147.75.109.163:58250.service: Deactivated successfully.
Jul 7 00:42:35.306286 systemd[1]: session-18.scope: Deactivated successfully.
Jul 7 00:42:35.307982 systemd-logind[1531]: Session 18 logged out. Waiting for processes to exit.
Jul 7 00:42:35.309327 systemd-logind[1531]: Removed session 18.
Jul 7 00:42:36.593794 containerd[1574]: time="2025-07-07T00:42:36.593744692Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"ae89547f144b9ef93852d9675f7c4fd39a7f88289ecd751516c256731b54e59f\" pid:6832 exited_at:{seconds:1751848956 nanos:593469834}"
Jul 7 00:42:40.393948 systemd[1]: Started sshd@21-65.108.89.120:22-101.36.119.98:54368.service - OpenSSH per-connection server daemon (101.36.119.98:54368).
Jul 7 00:42:40.473208 systemd[1]: Started sshd@22-65.108.89.120:22-147.75.109.163:48820.service - OpenSSH per-connection server daemon (147.75.109.163:48820).
Jul 7 00:42:41.505920 sshd[6849]: Accepted publickey for core from 147.75.109.163 port 48820 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E
Jul 7 00:42:41.507375 sshd-session[6849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:42:41.513037 systemd-logind[1531]: New session 19 of user core.
Jul 7 00:42:41.517937 systemd[1]: Started session-19.scope - Session 19 of User core.
Jul 7 00:42:41.543607 sshd[6845]: Received disconnect from 101.36.119.98 port 54368:11: Bye Bye [preauth]
Jul 7 00:42:41.543607 sshd[6845]: Disconnected from authenticating user root 101.36.119.98 port 54368 [preauth]
Jul 7 00:42:41.545270 systemd[1]: sshd@21-65.108.89.120:22-101.36.119.98:54368.service: Deactivated successfully.
Jul 7 00:42:42.370614 sshd[6851]: Connection closed by 147.75.109.163 port 48820
Jul 7 00:42:42.371156 sshd-session[6849]: pam_unix(sshd:session): session closed for user core
Jul 7 00:42:42.373612 systemd[1]: sshd@22-65.108.89.120:22-147.75.109.163:48820.service: Deactivated successfully.
Jul 7 00:42:42.375542 systemd-logind[1531]: Session 19 logged out. Waiting for processes to exit.
Jul 7 00:42:42.376005 systemd[1]: session-19.scope: Deactivated successfully.
Jul 7 00:42:42.380443 systemd-logind[1531]: Removed session 19.
Jul 7 00:42:43.426801 containerd[1574]: time="2025-07-07T00:42:43.426761950Z" level=info msg="TaskExit event in podsandbox handler container_id:\"993fdf65cc8d77e549ed6b0da0b099c222286e72aaf918dbf1fccf0050b4604e\" id:\"a9f94754e39a9b04dbb6afd78bce1d906d93b43d9f07346b7744a560fdbe8419\" pid:6877 exited_at:{seconds:1751848963 nanos:426430295}"
Jul 7 00:42:47.555860 systemd[1]: Started sshd@23-65.108.89.120:22-147.75.109.163:60388.service - OpenSSH per-connection server daemon (147.75.109.163:60388).
Jul 7 00:42:48.656497 sshd[6891]: Accepted publickey for core from 147.75.109.163 port 60388 ssh2: RSA SHA256:KALlPJmdpwbf/DDhfNxo/xA6TxlZvS5MIxZXA5c/d8E
Jul 7 00:42:48.657842 sshd-session[6891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 7 00:42:48.663051 systemd-logind[1531]: New session 20 of user core.
Jul 7 00:42:48.667827 systemd[1]: Started session-20.scope - Session 20 of User core.
Jul 7 00:42:49.577265 sshd[6895]: Connection closed by 147.75.109.163 port 60388
Jul 7 00:42:49.578172 sshd-session[6891]: pam_unix(sshd:session): session closed for user core
Jul 7 00:42:49.582638 systemd[1]: sshd@23-65.108.89.120:22-147.75.109.163:60388.service: Deactivated successfully.
Jul 7 00:42:49.585539 systemd[1]: session-20.scope: Deactivated successfully.
Jul 7 00:42:49.588323 systemd-logind[1531]: Session 20 logged out. Waiting for processes to exit.
Jul 7 00:42:49.590322 systemd-logind[1531]: Removed session 20.
Jul 7 00:42:51.444348 containerd[1574]: time="2025-07-07T00:42:51.444163394Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"b2fcdfdb4755a9fcc1defab4ad9ae0b6edea2e87fd9d84c89bbbc33c68af6050\" pid:6917 exited_at:{seconds:1751848971 nanos:443911439}"
Jul 7 00:42:56.281313 containerd[1574]: time="2025-07-07T00:42:56.281274796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a447bfbff471536510e7a79b14ccd6ff17ce9b635d2b605f50a2d50113dd452\" id:\"e9a0e75be6b2fe7346297707e65c9a98277e57c821fd42097db28a7ce26734eb\" pid:6939 exited_at:{seconds:1751848976 nanos:280559950}"
Jul 7 00:43:05.339084 kubelet[2925]: E0707 00:43:05.332846 2925 controller.go:195] "Failed to update lease" err="Put \"https://65.108.89.120:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-6-69f6cda1f4?timeout=10s\": context deadline exceeded"
Jul 7 00:43:05.738754 kubelet[2925]: E0707 00:43:05.738708 2925 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:55964->10.0.0.2:2379: read: connection timed out"
Jul 7 00:43:06.369171 systemd[1]: cri-containerd-3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d.scope: Deactivated successfully.
Jul 7 00:43:06.369540 systemd[1]: cri-containerd-3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d.scope: Consumed 12.911s CPU time, 109.3M memory peak, 78.5M read from disk.
Jul 7 00:43:06.464530 containerd[1574]: time="2025-07-07T00:43:06.464497873Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d\" id:\"3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d\" pid:3248 exit_status:1 exited_at:{seconds:1751848986 nanos:449357408}"
Jul 7 00:43:06.472634 containerd[1574]: time="2025-07-07T00:43:06.472580069Z" level=info msg="received exit event container_id:\"3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d\" id:\"3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d\" pid:3248 exit_status:1 exited_at:{seconds:1751848986 nanos:449357408}"
Jul 7 00:43:06.550183 containerd[1574]: time="2025-07-07T00:43:06.550106504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4c3a5b248992c46a7b0f4b69ab52459e113b3adf7aff9bb0b4a5ed4073385b38\" id:\"c11cbbda8434677c5b4a13cabf64435937879d5fd8bcc0988e58f261b03f6f2c\" pid:6961 exited_at:{seconds:1751848986 nanos:549490933}"
Jul 7 00:43:06.561104 systemd[1]: cri-containerd-d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa.scope: Deactivated successfully.
Jul 7 00:43:06.561317 systemd[1]: cri-containerd-d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa.scope: Consumed 3.463s CPU time, 88.4M memory peak, 125.1M read from disk.
Jul 7 00:43:06.569482 containerd[1574]: time="2025-07-07T00:43:06.569459053Z" level=info msg="received exit event container_id:\"d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa\" id:\"d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa\" pid:2770 exit_status:1 exited_at:{seconds:1751848986 nanos:566965485}"
Jul 7 00:43:06.573746 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d-rootfs.mount: Deactivated successfully.
Jul 7 00:43:06.581565 containerd[1574]: time="2025-07-07T00:43:06.581541635Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa\" id:\"d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa\" pid:2770 exit_status:1 exited_at:{seconds:1751848986 nanos:566965485}"
Jul 7 00:43:06.635344 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa-rootfs.mount: Deactivated successfully.
Jul 7 00:43:07.381196 kubelet[2925]: I0707 00:43:07.381140 2925 scope.go:117] "RemoveContainer" containerID="d7f3fa3313f60db5dfa440d2975492cb764ee8c95f8eb2bb1ba4ec5388778efa"
Jul 7 00:43:07.396914 kubelet[2925]: I0707 00:43:07.396732 2925 scope.go:117] "RemoveContainer" containerID="3c36ede7da8b639ab41ea0fcb52398142cd6cd230707715135d82c323159e90d"
Jul 7 00:43:07.443183 containerd[1574]: time="2025-07-07T00:43:07.443118516Z" level=info msg="CreateContainer within sandbox \"7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jul 7 00:43:07.443877 containerd[1574]: time="2025-07-07T00:43:07.443130929Z" level=info msg="CreateContainer within sandbox \"6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jul 7 00:43:07.561740 containerd[1574]: time="2025-07-07T00:43:07.559794782Z" level=info msg="Container 4ec6e5b00c7fafd3a7dbbd0e63997cf9050c1afe4efd83e9536eba01d12fe784: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:43:07.568064 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2055101148.mount: Deactivated successfully.
Jul 7 00:43:07.570715 containerd[1574]: time="2025-07-07T00:43:07.569167489Z" level=info msg="Container 2db7be7f816daf2c1e901fb76567ee6902eb7a0c484f85bf7eeb6b8074c56d1c: CDI devices from CRI Config.CDIDevices: []"
Jul 7 00:43:07.578071 containerd[1574]: time="2025-07-07T00:43:07.578025708Z" level=info msg="CreateContainer within sandbox \"7695b4a177a4f119d0d7005b02fce5f001ee0ed7c99912ad02e438c66aba5e20\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"2db7be7f816daf2c1e901fb76567ee6902eb7a0c484f85bf7eeb6b8074c56d1c\""
Jul 7 00:43:07.579550 containerd[1574]: time="2025-07-07T00:43:07.579534609Z" level=info msg="StartContainer for \"2db7be7f816daf2c1e901fb76567ee6902eb7a0c484f85bf7eeb6b8074c56d1c\""
Jul 7 00:43:07.581409 containerd[1574]: time="2025-07-07T00:43:07.580369191Z" level=info msg="CreateContainer within sandbox \"6002560c68c019b3531b9065b47e969a3b6e5bd348f17df71e060ab43cb39070\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"4ec6e5b00c7fafd3a7dbbd0e63997cf9050c1afe4efd83e9536eba01d12fe784\""
Jul 7 00:43:07.581409 containerd[1574]: time="2025-07-07T00:43:07.581134073Z" level=info msg="StartContainer for \"4ec6e5b00c7fafd3a7dbbd0e63997cf9050c1afe4efd83e9536eba01d12fe784\""
Jul 7 00:43:07.584202 containerd[1574]: time="2025-07-07T00:43:07.584183297Z" level=info msg="connecting to shim 2db7be7f816daf2c1e901fb76567ee6902eb7a0c484f85bf7eeb6b8074c56d1c" address="unix:///run/containerd/s/d339e5c9b5c61f99c7e2538bb8c3d182380e64e4e279083cd17ffd74585c8181" protocol=ttrpc version=3
Jul 7 00:43:07.584876 containerd[1574]: time="2025-07-07T00:43:07.584763831Z" level=info msg="connecting to shim 4ec6e5b00c7fafd3a7dbbd0e63997cf9050c1afe4efd83e9536eba01d12fe784" address="unix:///run/containerd/s/8e8c573145a79817b036a2fb81776c45971040aaa96ce5ce956db00a8bef5ccf" protocol=ttrpc version=3
Jul 7 00:43:07.642883 systemd[1]: Started cri-containerd-4ec6e5b00c7fafd3a7dbbd0e63997cf9050c1afe4efd83e9536eba01d12fe784.scope - libcontainer container 4ec6e5b00c7fafd3a7dbbd0e63997cf9050c1afe4efd83e9536eba01d12fe784.
Jul 7 00:43:07.659828 systemd[1]: Started cri-containerd-2db7be7f816daf2c1e901fb76567ee6902eb7a0c484f85bf7eeb6b8074c56d1c.scope - libcontainer container 2db7be7f816daf2c1e901fb76567ee6902eb7a0c484f85bf7eeb6b8074c56d1c.
Jul 7 00:43:07.707385 containerd[1574]: time="2025-07-07T00:43:07.707354460Z" level=info msg="StartContainer for \"4ec6e5b00c7fafd3a7dbbd0e63997cf9050c1afe4efd83e9536eba01d12fe784\" returns successfully"
Jul 7 00:43:07.728493 containerd[1574]: time="2025-07-07T00:43:07.728457896Z" level=info msg="StartContainer for \"2db7be7f816daf2c1e901fb76567ee6902eb7a0c484f85bf7eeb6b8074c56d1c\" returns successfully"
Jul 7 00:43:11.360428 kubelet[2925]: E0707 00:43:11.353975 2925 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:55764->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4344-1-1-6-69f6cda1f4.184fd15dcca9edbc kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4344-1-1-6-69f6cda1f4,UID:e3806d3611dc8643e5c566a0aefacd7f,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4344-1-1-6-69f6cda1f4,},FirstTimestamp:2025-07-07 00:43:00.866264508 +0000 UTC m=+260.927393378,LastTimestamp:2025-07-07 00:43:00.866264508 +0000 UTC m=+260.927393378,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-1-1-6-69f6cda1f4,}"
Jul 7 00:43:11.786648 systemd[1]: cri-containerd-5747a57f08fd0b3fa42f3117b83a4d5bc641f4306680ca2f2e62e7ef364fb1d9.scope: Deactivated successfully.